
Guide to the latest in Spatial Computing (WWDC24)



In a wearable, highly sensitive information access platform where consumers and customers frequently question what the actual use cases are, how do you maintain a balance between protecting privacy and enabling all of the potential use cases that developers come up with?

That is exactly what WWDC24 is about. While the visionOS 2.0 release looks light on headline updates, beneath the surface there is an overwhelming amount of new features for developers: a compilation of answers to all of the feedback and requests provided during the first year.

some Numbers

Metric differences between editions don't tell the whole story, but they do help to see where the work is being focused.

Charts comparing the number of sessions, sample projects, and total duration between last year's WWDC and this one
Edition Sessions Duration Sample projects
2023 37 12:49:55 4
2024 30 10:06:28 10
Note that these values, particularly for sample projects, may vary due to the overlap across technologies (e.g., Metal, SwiftUI, visionOS)

Keep in mind that some sample projects were added right before WWDC24. All of the original ones have been updated, and some of them now need visionOS 2.0 as a minimum target.

TL;DR: A lot of energy went into documentation and practical resources.

some Words

Here are the first words of the names of this year's sessions, arranged by how often they appear.

  • What’s new (x6)
  • Create (x3)
  • Explore (x3)
  • Build (x2)
  • Design (x2)
  • Discover (x2)
  • Enhance (x2)
  • Meet (x2)
  • Optimize (x2)
  • Break into
  • Bring
  • Compose
  • Customize
  • Dive deep
  • Get started
  • Introducing
  • Migrate
  • Render
  • Work with

Titles continue to be action-oriented, with marketing touches of innovation and creativity. There is also a strong emphasis on onboarding and developer engagement, with less focus on foundational presentation; Apple wants developers to not just use but deeply understand and leverage these new technologies.


When categorizing the talks based on their content or purpose, it becomes evident that there is a significant focus on entertainment and gaming. Specifically, about one-third of the sessions are dedicated to this topic.


1/3 of the sessions dedicated to the topic

Tooling and pipelines also got attention, with several sessions addressing the common problem teams currently face of going from idea to product (what tools? what formats? how to present?). These sessions detail the process with many helpful tips, the elements in between, optimization techniques, debugging, and best practices, combined with real-world experiences from other developers, and with far more references and statements than ever before.

Then there are consistent evolutionary steps for shared activities (including custom Persona templates and FaceTime simulation).

Screenshot from the "Customize spatial Persona templates in SharePlay" session where a spatial persona can be positioned on a specific seat (in this example in front of the user) when a game starts
Custom Persona Templates example

A compelling chunk focused on open standards this year, with outstanding videos about OpenUSD, MaterialX, and WebXR, and all the great work done to bring new input modes into the standards (transient pointer et al.).

The introduction of HealthKit for visionOS is something to pay close attention to in the future.

Eye tracking data is health data. That means it needs to be protected as such, but not necessarily treated as radioactive.
Avi Bar-Zeev
Slide from the introduction of HealthKit for visionOS. It can be read: HealthKit capabilities in visionOS: - Query and write health data - Aggregate and compute statistics - Register for updates - Authorizations are managed in Settings - Health data syncs between devices
It is still unclear what type of samples can be obtained from the device.
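Assuming the standard HealthKit query APIs carry over unchanged to visionOS (the session slide lists querying, statistics, and update registration), a minimal read sketch might look like the following; the heart-rate type here is purely illustrative, since the sample types the device actually provides are still unclear:

```swift
import HealthKit

// Sketch only: standard HealthKit async query APIs, assumed to behave
// on visionOS as they do on iOS. Heart rate is an illustrative type.
func latestHeartRateSamples() async throws -> [HKQuantitySample] {
    let store = HKHealthStore()
    let heartRate = HKQuantityType(.heartRate)

    // Authorizations are managed in Settings, per the session slide.
    try await store.requestAuthorization(toShare: [], read: [heartRate])

    let descriptor = HKSampleQueryDescriptor(
        predicates: [.quantitySample(type: heartRate)],
        sortDescriptors: [SortDescriptor(\.endDate, order: .reverse)],
        limit: 10
    )
    return try await descriptor.result(for: store)
}
```

The interesting open question is not the query shape but which samples visionOS will expose, and under what authorization constraints.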

Sound kept its spot and delivered once again, including Web Audio API and Web Speech API support.

Then there was the backbone: a relatively small set of sessions (less than 10% of the total) dedicated to API expansions.


I would have been puzzled, and would have thought Apple had taken a different stance for the AVP this year, if it weren't for the short but extremely important release of the Enterprise APIs for visionOS...

Slide from the introduction of Enterprise APIs. It can be read: Enterprise APIs for visionOS: - Main camera access - Passthrough in-screen capture - Apple Neural Engine access - Spatial barcode and QR code scanning - Object tracking parameter adjustment - Increased performance headroom

...and the ARKit advancements in object tracking, plane detection, and room tracking.
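As a rough idea of the shape of these APIs, here is a minimal plane-detection sketch following the ARKitSession pattern shown in the sessions; the names match the public ARKit for visionOS API, but treat the details as an approximation:

```swift
import ARKit

// Sketch only: visionOS-style plane detection with ARKit's data providers.
// Intended to run in an async context inside an immersive space.
func observePlanes() async throws {
    let session = ARKitSession()
    let planeDetection = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

    try await session.run([planeDetection])

    // Anchor updates arrive as an async sequence of add/update/remove events.
    for await update in planeDetection.anchorUpdates {
        switch update.event {
        case .added, .updated:
            print("Plane \(update.anchor.id): \(update.anchor.classification)")
        case .removed:
            print("Plane removed: \(update.anchor.id)")
        }
    }
}
```

Object tracking and room tracking follow the same provider pattern, which is what makes this year's additions feel consistent rather than bolted on.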

An initial public iteration that validates the closed-by-default policy by only exposing what is explicitly (and loudly) asked for. It's also interesting how various APIs shifted levels: from easing access to spatial tracking (now possible from RealityKit) and extruding SwiftUI paths, to dropping lower with LowLevelMesh. All of them unexpected and very welcome changes.
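For example, the new SwiftUI path extrusion can be sketched roughly like this (the option names are recalled from the session and may differ slightly in the shipping API):

```swift
import SwiftUI
import RealityKit

// Sketch only: turning a 2D SwiftUI path into a 3D RealityKit mesh,
// one of the visionOS 2 conveniences. Option names are an approximation.
func makeExtrudedTriangle() throws -> ModelEntity {
    let path = Path { p in
        p.move(to: .zero)
        p.addLine(to: CGPoint(x: 0.2, y: 0))
        p.addLine(to: CGPoint(x: 0.1, y: 0.2))
        p.closeSubpath()
    }

    var options = MeshResource.ShapeExtrusionOptions()
    options.extrusionMethod = .linear(depth: 0.05) // 5 cm deep

    let mesh = try MeshResource(extruding: path, extrusionOptions: options)
    return ModelEntity(mesh: mesh,
                       materials: [SimpleMaterial(color: .cyan, isMetallic: false)])
}
```

Before this, getting custom 2D shapes into a volume meant authoring geometry in a DCC tool or building vertex buffers by hand; LowLevelMesh now covers the latter case for those who want full control.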

Missing bits


Accessibility is dispersed throughout many of the sessions but not addressed directly (on the spatial computing topic at least, although there is a general session). Also, let's keep in mind that just a few weeks ago, new features were announced for Global Accessibility Awareness Day.

Apple Intelligence

However, there is mention of CreateML, custom CV, and access to Neural Engine through enterprise APIs.


But the way that object capture scans areas looks a lot like Luma Labs Capture, which makes me think that this will be coming soon.


Last year, we had some fascinating insights from cognitive scientists, which were notably absent this year.

Other details

  • Advancements in guest users, AirPlay, and calibration process
  • Ergonomics all around
  • Allowing compatible apps to live outside of their folder indicates high usage and a more permissive mindset toward requiring platform adoption
  • Node programming appears to be here to stay
  • 3D Animation is coming together
  • Reality Composer (not Pro) appears to be absorbed by "Actions" together with Shortcuts and Preview features
  • We need to learn Blender
  • Custom hover effects (⭐)
  • If there is restricted world tracking (i.e., low light contexts), a new orientation-based tracking is available at the system level
  • Hand Prediction API will improve many experiences
  • We now know that the dinosaurs are named Izzy and Roger
  • Ornament for volumes


A shorter, practical edition that delivered a more open and uniform set of tools with clear direction and expansion. Developer feedback has been heard and included, with a general stance for the ecosystem that I believe will spark a new wave of powerful and stunning apps while keeping the platform's core values intact.

External view of Apple Vision Pro highlighting the array of cameras and sensors


Actionable Steps

  1. Explore and leverage Enterprise APIs
    1. Integrate advanced features: Using the new enterprise APIs is a great opportunity. Keep in mind that while these applications will not be available to end users, there are several potential ways to improve your business with capabilities such as custom computer vision and access to the Neural Engine, which will allow for more advanced data processing and analytics.
    2. Security and compliance: These requirements define the appropriate type of device to use for a spatial computing product, and Apple Vision Pro leads here.
  2. Improve development and pipelines
    1. Streamline development: Use the new tooling and pipelines shared at WWDC24 to accelerate your product development cycle, from concept to delivery. Focus on using improved technologies such as RealityKit for spatial tracking and SwiftUI for more frictionless UI integration and product creation overall.
    2. Adopt open standards: Use standards such as OpenUSD and MaterialX to ensure interoperability and collaboration across multiple platforms and technologies.
  3. Prioritize Accessibility
    1. Embed accessibility from the start: Given the absence of direct emphasis in sessions, it's important to keep accessibility a high priority from the beginning and include accessibility features to satisfy a variety of user demands. Adopt Apple's recently announced upgrades for Global Accessibility Awareness Day.
    2. Consult accessibility experts: Work with professionals to guarantee that your applications meet accessibility requirements and deliver a smooth experience for all users.
  4. Leverage HealthKit for visionOS
    1. Develop health-focused apps: With HealthKit, visionOS opens up new potential to create applications that boost user wellbeing while harnessing spatial computing's unique features.
    2. Data privacy compliance: Health-focused applications must adhere to privacy standards; managing sensitive data such as eye tracking with diligence and security opens up new opportunities in highly sensitive businesses.
  5. Re-think entertainment, and improve productivity
    1. Entertainment apps: Take advantage of visionOS 2.0's emphasis on entertainment and games to create rich, engaging customer experiences.
    2. Productivity solutions: Use the most recent APIs and technologies to develop powerful productivity apps that improve workflow and user efficiency, including ARKit advances for real-time object and plane identification.

  • visionOS 2 brings new spatial computing experiences to Apple Vision Pro: Apple today previewed visionOS 2, a major update to Apple Vision Pro that enhances how users engage with spatial computing.
  • WWDC24 visionOS guide - Discover - Apple Developer: The infinite canvas is waiting for you.
  • visionOS | Apple Developer Documentation: Create a new universe of apps and games for Apple Vision Pro.
  • RealityKit | Apple Developer Documentation: Simulate and render 3D content for use in your augmented reality apps.
  • ARKit | Apple Developer Documentation: Integrate hardware sensing features to produce augmented reality apps and games.
  • Sample Code - WWDC24 - Apple Developer