Diffuse reflection UV computation tool

Diffuse reflection: A video screen with a soft glow reflects colors and light gently onto the ground plane

In visionOS, there's something about docking videos in immersive environments (anchoring them like a screen in a movie theater) that makes them feel much more present and impressive. Even though this has been limited to a select few apps, with the system controlling the placement, I've found that practically everyone is amazed by this simple trick when I demo the device. Apple seems to think the same: this year's session Enhance the immersion of media viewing in custom environments introduces the DockingRegionComponent, a new component that lets any app define its own docking region and achieve the same effect, or better.

Docking Region Component

The width property defines the maximum size for media playback using a 2.4:1 aspect ratio

⚠️
Videos that are larger than the maximum width will be scaled down to fit the bounding region.
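
To put numbers on that (my own arithmetic, using an arbitrary example width rather than a value from the session): at 2.4:1, a docking region 3 meters wide caps media playback at 3 m wide by 1.25 m tall.

# Maximum media size inside a docking region, derived from the
# 2.4:1 aspect ratio mentioned in the session.
ASPECT_RATIO = 2.4

def max_media_size(region_width: float) -> tuple[float, float]:
    """Return the (width, height) in meters of the largest media that fits."""
    return region_width, region_width / ASPECT_RATIO

width, height = max_media_size(3.0)  # 3.0 m is an arbitrary example width
print(f"{width:.2f} m x {height:.2f} m")  # 3.00 m x 1.25 m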

One of the things that caught my interest was the attention to detail in media reflections. Thinking of reflections as a viewing affordance for media is a lovely idea; it reminds me of the early Keynote styles. The session covers everything you need to set it up in Reality Composer Pro, but a slightly unclear part is that the reflection texture mapping itself requires a well-defined set of UVs on the model, particularly if you add custom geometry around the screen that should reflect it.
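
If you want to sanity-check a model before wiring anything up, the USD Python library makes it easy to list the UV sets on each mesh. A minimal sketch (the file name is a placeholder; it just reports whatever texture-coordinate primvars are authored):

from pxr import Usd, UsdGeom

def list_uv_sets(usd_path: str) -> None:
    """Print the texture-coordinate primvars found on every mesh."""
    stage = Usd.Stage.Open(usd_path)
    for prim in stage.Traverse():
        if not prim.IsA(UsdGeom.Mesh):
            continue
        uv_sets = [
            primvar.GetPrimvarName()
            for primvar in UsdGeom.PrimvarsAPI(prim).GetPrimvars()
            if primvar.GetTypeName().role == "TextureCoordinate"
        ]
        print(f"{prim.GetPath()}: {uv_sets or 'no UV sets'}")

list_uv_sets("environment.usd")  # placeholder path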

Conveniently enough, if you look carefully at the resources, you can find this year's only download: a Python script (built on top of the existing USD Python library) with an algorithm that computes emitter and attenuation UVs.

Screenshot of a custom-made WWDC data explorer that helps me find this type of resource

Besides that, it comes with a great PDF that explains the process and essentially lists the requirements and possible solutions; it's something I can see becoming part of RCP itself in the future.
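
To give a feel for the kind of computation the PDF describes, here's a rough, hand-rolled sketch of the simplest piece of the idea: projecting each vertex onto the screen plane and normalizing the result into UV space. This is my own simplification, not Apple's actual algorithm, and every name in it is made up:

import numpy as np

def planar_emitter_uvs(positions, screen_origin, screen_right, screen_up,
                       screen_width, screen_height):
    """Project world-space vertex positions onto the screen plane and
    normalize them into [0, 1] UV space.

    positions: (N, 3) array of vertex positions.
    screen_origin: bottom-left corner of the docked screen.
    screen_right / screen_up: unit vectors spanning the screen plane.
    """
    relative = np.asarray(positions, dtype=float) - screen_origin
    u = relative @ screen_right / screen_width
    v = relative @ screen_up / screen_height
    # Values outside [0, 1] land off-screen; attenuation UVs would
    # additionally encode a distance-based falloff from the emitter.
    return np.stack([u, v], axis=-1)

# Two sample vertices in front of a 4 m wide, 2.4:1 screen.
uvs = planar_emitter_uvs(
    positions=[[0.0, 1.0, -3.0], [1.0, 0.5, -2.0]],
    screen_origin=np.array([-2.0, 0.0, -3.0]),
    screen_right=np.array([1.0, 0.0, 0.0]),
    screen_up=np.array([0.0, 1.0, 0.0]),
    screen_width=4.0,
    screen_height=4.0 / 2.4,
)
print(uvs)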

Here's an overview of how it works (a combination of RCP and the terminal):

python3 computeDiffuseReflectionUVs.py <input.usd> -o <output.usd> -p / -r true

Resulting in an unintended donut house of mirrors
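
If you want to confirm that the script actually wrote new UV sets, one option is to diff the texture-coordinate primvar names between the input and output files (the file names here match the command above):

from pxr import Usd, UsdGeom

def uv_set_names(usd_path: str) -> set[str]:
    """Collect every texture-coordinate primvar name across all meshes."""
    stage = Usd.Stage.Open(usd_path)
    names = set()
    for prim in stage.Traverse():
        if prim.IsA(UsdGeom.Mesh):
            for primvar in UsdGeom.PrimvarsAPI(prim).GetPrimvars():
                if primvar.GetTypeName().role == "TextureCoordinate":
                    names.add(primvar.GetPrimvarName())
    return names

# UV sets present in the output but not the input were added by the tool.
print(uv_set_names("output.usd") - uv_set_names("input.usd"))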


Enhance the immersion of media viewing in custom environments - WWDC24 - Videos - Apple Developer
Extend your media viewing experience using Reality Composer Pro components like Docking Region, Reverb, and Virtual Environment Probe…
DockingRegionComponent | Apple Developer Documentation
A component that allows dockable scenes to be docked in an immersive space within a self-defined docking region.

https://developer.apple.com/sample-code/ar/WWDC_2024_Diffuse_Reflection_UV_Computation_Tool.zip

Create custom environments for your immersive apps in visionOS - WWDC24 - Videos - Apple Developer
Discover how to create visually rich and performant customized app environments for Apple Vision Pro. Learn design guidelines, get expert…
Enabling video reflections in an immersive environment | Apple Developer Documentation
Create a more immersive experience by adding video reflections in a custom environment.
Destination Video | Apple Developer Documentation
Leverage SwiftUI to build an immersive media experience in a multiplatform app.