Discuss spatial computing on Apple platforms and how to design and build an entirely new universe of apps and games for Apple Vision Pro.

Posts under Spatial Computing topic


CompositorServices or RealityKit?
I have been focusing on visionOS app development. I'm quite familiar with RealityKit by now, but CompositorServices has also caught my attention, and I haven't learned it yet. Is it essential for me to learn CompositorServices? I would also appreciate insights into the respective advantages of RealityKit and CompositorServices.
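For context, a minimal sketch of what a CompositorServices entry point looks like (the app and space names are illustrative, and this is a sketch rather than a complete renderer). The practical difference: CompositorServices hands you a LayerRenderer and you drive your own Metal render loop, while RealityKit manages the scene graph, materials, and rendering for you. If RealityKit already covers your needs, CompositorServices is generally only essential for fully custom Metal rendering.

import SwiftUI
import CompositorServices

@main
struct MetalImmersiveApp: App {  // hypothetical app name
    var body: some Scene {
        // CompositorServices: you own the render loop.
        ImmersiveSpace(id: "metal") {
            CompositorLayer { layerRenderer in
                // Spawn a render thread here and draw frames using
                // the LayerRenderer's timing, drawables, and views.
            }
        }
    }
}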
Replies: 2 · Boosts: 0 · Views: 796 · Latest activity: Mar ’25
ARKit sessionInterruptionEnded never called in Window Mode.
Hi 26 beta folks, I have apps using ARKit. In iPadOS 26 beta, ARKit stops working after switching to other apps. How to reproduce:

1. Enable Window Mode in iPadOS 26
2. Launch my app and start an ARSession
3. Switch to another app (e.g., the Settings app)
4. Switch back to my app

AR stops updating the camera feed. I added debug prints to my ARSessionDelegate and found that after sessionWasInterrupted was called, sessionInterruptionEnded was never called. sessionInterruptionEnded is called when Window Mode is disabled. Is this just a bug in the 26 beta? I suspect there is a similar problem with the non-AR camera. Any ideas?
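A minimal sketch of the delegate logging described above (the class name is illustrative; both callbacks are standard ARSessionObserver methods):

import ARKit

final class InterruptionLogger: NSObject, ARSessionDelegate {
    func sessionWasInterrupted(_ session: ARSession) {
        // Fires when the session is interrupted, e.g. on app switch.
        print("sessionWasInterrupted")
    }

    func sessionInterruptionEnded(_ session: ARSession) {
        // Expected when returning to the app; per the report above,
        // this never fires on iPadOS 26 beta with Window Mode enabled.
        print("sessionInterruptionEnded")
    }
}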
Replies: 2 · Boosts: 0 · Views: 121 · Latest activity: Jun ’25
Do you retain a reference to your content events in RealityView?
Do you retain a reference to your content (RealityViewContent) events? For example, the Manipulation Events docs from Apple use _ to discard the result. In theory the subscription should keep working while the content is alive.

_ = content.subscribe(to: ManipulationEvents.WillBegin.self) { event in
    event.entity.components[ModelComponent.self]?.materials[0] = SimpleMaterial(color: .blue, isMetallic: false)
}

_ = content.subscribe(to: ManipulationEvents.WillEnd.self) { event in
    event.entity.components[ModelComponent.self]?.materials[0] = SimpleMaterial(color: .red, isMetallic: false)
}

We could instead store these subscriptions in state. I've seen this in a few samples and apps.

@State var beginSubscription: EventSubscription?
...
beginSubscription = content.subscribe(to: ManipulationEvents.WillBegin.self) { event in
    event.entity.components[ModelComponent.self]?.materials[0] = SimpleMaterial(color: .blue, isMetallic: false)
}

The main advantage I see is that we can be more explicit about when we remove the subscription. Are there other reasons to keep a reference to these events?
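For what it's worth, a sketch of the stored-subscription pattern with explicit teardown, since EventSubscription exposes cancel(); the view and print statement are illustrative:

import SwiftUI
import RealityKit

struct SubscriptionDemo: View {
    @State private var beginSubscription: EventSubscription?

    var body: some View {
        RealityView { content in
            // ... entity setup elided ...
            beginSubscription = content.subscribe(to: ManipulationEvents.WillBegin.self) { event in
                print("Manipulation will begin on \(event.entity.name)")
            }
        }
        .onDisappear {
            // Deterministic cleanup instead of relying on content lifetime.
            beginSubscription?.cancel()
        }
    }
}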
Replies: 1 · Boosts: 0 · Views: 590 · Latest activity: Sep ’25
ARKit Planes do not appear where expected on visionOS
I'm using ARKitSession and PlaneDetectionProvider to detect planes. I have a basic process that creates an entity for each detected plane. Each one gets a random color for its material, and each plane is sized based on the bounds of the anchor provided by ARKit.

let mesh = MeshResource.generatePlane(
    width: anchor.geometry.extent.width,
    depth: anchor.geometry.extent.height
)

Then I'm using this to position each entity:

entity.transform = Transform(matrix: anchor.originFromAnchorTransform)

This seems like the right method, but many (not all) planes are not where they should be. The sizes look OK, but the X and Y positions are off. Take this large green plane on the wall: it should span the entire wall, but it is offset along the X axis so that it is pushed to the left of where the center of the anchor is. When I visualize surfaces using the Xcode debugging tools, that tool reports the planes where I'd expect them to be. Can you see what I'm getting wrong here? Full code below.

import SwiftUI
import RealityKit
import ARKit

struct Example068: View {
    @State var session = ARKitSession()
    @State private var planeAnchors: [UUID: Entity] = [:]
    @State private var planeColors: [UUID: Color] = [:]

    var body: some View {
        RealityView { content in
        } update: { content in
            for (_, entity) in planeAnchors {
                if !content.entities.contains(entity) {
                    content.add(entity)
                }
            }
        }
        .task {
            try! await setupAndRunPlaneDetection()
        }
    }

    func setupAndRunPlaneDetection() async throws {
        let planeData = PlaneDetectionProvider(alignments: [.horizontal, .vertical, .slanted])
        if PlaneDetectionProvider.isSupported {
            do {
                try await session.run([planeData])
                for await update in planeData.anchorUpdates {
                    switch update.event {
                    case .added, .updated:
                        let anchor = update.anchor
                        if planeColors[anchor.id] == nil {
                            planeColors[anchor.id] = generatePastelColor()
                        }
                        let planeEntity = createPlaneEntity(for: anchor, color: planeColors[anchor.id]!)
                        planeAnchors[anchor.id] = planeEntity
                    case .removed:
                        let anchor = update.anchor
                        planeAnchors.removeValue(forKey: anchor.id)
                        planeColors.removeValue(forKey: anchor.id)
                    }
                }
            } catch {
                print("ARKit session error \(error)")
            }
        }
    }

    private func generatePastelColor() -> Color {
        let hue = Double.random(in: 0...1)
        let saturation = Double.random(in: 0.2...0.4)
        let brightness = Double.random(in: 0.8...1.0)
        return Color(hue: hue, saturation: saturation, brightness: brightness)
    }

    private func createPlaneEntity(for anchor: PlaneAnchor, color: Color) -> Entity {
        let mesh = MeshResource.generatePlane(
            width: anchor.geometry.extent.width,
            depth: anchor.geometry.extent.height
        )
        var material = PhysicallyBasedMaterial()
        material.baseColor.tint = UIColor(color)
        let entity = ModelEntity(mesh: mesh, materials: [material])
        entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
        return entity
    }
}
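One hedged guess at the offset, sketched below: on visionOS, a PlaneAnchor's extent carries its own transform relative to the anchor's origin (anchorFromExtentTransform), so positioning the mesh with originFromAnchorTransform alone can leave it displaced from the extent's center. Composing the two matrices may be what's missing; this is an untested sketch, not a confirmed fix.

import ARKit
import RealityKit

// Untested: place the entity at the extent's center by composing
// origin-from-anchor with anchor-from-extent.
func positionEntity(_ entity: Entity, for anchor: PlaneAnchor) {
    let originFromExtent = anchor.originFromAnchorTransform
        * anchor.geometry.extent.anchorFromExtentTransform
    entity.transform = Transform(matrix: originFromExtent)
}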
Replies: 3 · Boosts: 0 · Views: 199 · Latest activity: Apr ’25
Can iOS and visionOS Devices Share the Same Spatial World in a Multiuser AR Session?
Hey everyone, I'm working on an object viewer where users can place objects in a real room using AR, and I want both visionOS (Apple Vision Pro) and iOS devices (iPad, iPhone) to participate in the same shared spatial experience. The idea is that a user with a Vision Pro can place an object, and peers using iPhones/iPads can see the same object in the same position in their AR view.

I've looked into ARKit's shared ARWorldMap and MultipeerConnectivity, but I'm not sure if this extends seamlessly to visionOS or if Apple has an official way to sync spatial data between visionOS and iOS devices.

Has anyone tried sharing a spatial world between visionOS and iOS? Are there any built-in frameworks that allow for a shared multiuser AR session across these devices? If not, what would be the best way to sync object positions between them?

Would love to hear if anyone has insights or experience with this! 🚀 Thanks!
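For the fallback case (manually syncing object positions), here is a hedged sketch of shipping transforms over MultipeerConnectivity. The payload format and type names are illustrative, and the hard part this does not cover is agreeing on a common reference frame across devices:

import MultipeerConnectivity
import simd

// Hypothetical payload: an object ID plus a column-major 4x4 transform.
struct ObjectUpdate: Codable {
    var id: UUID
    var matrix: [Float]  // 16 floats
}

final class ObjectSync {
    let session: MCSession

    init(peerID: MCPeerID) {
        session = MCSession(peer: peerID)
    }

    // Broadcast one object's transform to all connected peers.
    func send(id: UUID, transform: simd_float4x4) throws {
        let columns = [transform.columns.0, transform.columns.1,
                       transform.columns.2, transform.columns.3]
        let floats = columns.flatMap { [$0.x, $0.y, $0.z, $0.w] }
        let data = try JSONEncoder().encode(ObjectUpdate(id: id, matrix: floats))
        try session.send(data, toPeers: session.connectedPeers, with: .reliable)
    }
}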
Replies: 2 · Boosts: 0 · Views: 552 · Latest activity: Mar ’25
Ornaments in Presentations
We can add ornaments to popovers shown by PresentationComponent, but I'm not sure if we should.

While working on the editor for entities in a volume-based app, I had the idea of adding ornaments to the presented views. The entire app exists inside a volume. A user can tap an item to present a popover UI to edit it. This is displayed using the new PresentationComponent in visionOS 26.

Ornaments have a new attachment anchor option this year: .parent().

.ornament(attachmentAnchor: .parent(.top), ornament: {...})

This works well in the Simulator. We can add ornaments around this popover view just like we would with a window. Unfortunately, when I run this on device I get a different experience. Any part of the ornament that overlaps with the popover content isn't rendered correctly. Sometimes it entirely disappears; other times it becomes partially transparent.

We could use content alignment to try to make sure the ornament doesn't overlap the popover content:

.ornament(attachmentAnchor: .parent(.top), contentAlignment: .bottom, ornament: {...})

This works sometimes, but not all the time. It's not clear whether this is a bug, because I'm not sure we are even supposed to be able to use ornaments this way. Here is my hierarchy:

1. An app opens as a volume
2. The volume presents a RealityView, with its own ornament using the .scene() anchor
3. Multiple entities with PresentationComponent show an edit view
4. The edit view uses the .parent() anchor to add ornaments

What makes me unsure is that other methods for drawing UI in a RealityView don't seem to work with ornaments. For example, if I add an attachment to show a view with an ornament, even when I use the .parent() anchor, the ornament is anchored to the volume, not the attachment view.

So what do we think? Is this a rendering bug? Are ornaments intended to work with attachments and presentations?
Replies: 2 · Boosts: 0 · Views: 363 · Latest activity: Aug ’25
visionOS Widget Bug
While developing a widget for the visionOS 26 beta, I found that it does not work when running on a real visionOS device, and an error appears: "Please adopt container background API". Notably, this problem does not occur in the visionOS Simulator. Does anyone know the reason and the solution, or whether this is a visionOS bug that needs a Feedback report? Thank you!
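The error text matches the container background requirement that WidgetKit enforces on recent OS releases. A minimal sketch of adopting it is below; the view name is illustrative, and whether this resolves the device-only failure is unconfirmed:

import SwiftUI
import WidgetKit

struct MyWidgetEntryView: View {  // hypothetical widget view
    var body: some View {
        Text("Hello")
            // Widgets must supply their background via this modifier;
            // omitting it can trigger the "adopt container background" error.
            .containerBackground(for: .widget) {
                Color.clear
            }
    }
}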
Replies: 1 · Boosts: 0 · Views: 449 · Latest activity: Sep ’25
How to play blend shape (morph) animations exported from Blender in Vision Pro apps using RealityKit
I am exporting a .usdc file from Blender that already has some morph animations. The animations play well in Blender, but after export I cannot play them in RealityKit or Reality Composer Pro. Entity.availableAnimations is an empty array, and none of the child objects in the entity hierarchy has an AnimationLibraryComponent. Maybe I am exporting it wrong; I tried multiple combinations of export settings, but none seem to work. Here are my export settings in Blender. The original file I purchased is an FBX file that has the animation, but when I bring it into Reality Converter, the animations don't seem to play either.
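A small diagnostic sketch along these lines can show where (if anywhere) the animation data landed after import; the function name is illustrative, and BlendShapeWeightsComponent is the RealityKit component that would carry morph-target weights if they survived export:

import RealityKit

// Recursively report animation data found under an entity.
func auditAnimations(_ entity: Entity, depth: Int = 0) {
    let pad = String(repeating: "  ", count: depth)
    print("\(pad)\(entity.name): \(entity.availableAnimations.count) animation(s)")
    if entity.components[AnimationLibraryComponent.self] != nil {
        print("\(pad)  has AnimationLibraryComponent")
    }
    if entity.components[BlendShapeWeightsComponent.self] != nil {
        print("\(pad)  has BlendShapeWeightsComponent (morph targets)")
    }
    for child in entity.children {
        auditAnimations(child, depth: depth + 1)
    }
}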
Replies: 2 · Boosts: 0 · Views: 204 · Latest activity: Jun ’25
Rendering scene in RealityView to an Image
Is there any way to render a RealityView to an Image/UIImage, like we used to be able to do with SCNView.snapshot()? ImageRenderer doesn't work because it renders a SwiftUI view hierarchy, and I need the currently presented RealityView with the camera background and 3D scene content the way the user sees it.

I tried UIHostingController and UIGraphicsImageRenderer, like:

extension View {
    func snapshot() -> UIImage {
        let controller = UIHostingController(rootView: self)
        let view = controller.view
        let targetSize = controller.view.intrinsicContentSize
        view?.bounds = CGRect(origin: .zero, size: targetSize)
        view?.backgroundColor = .clear
        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            view?.drawHierarchy(in: view!.bounds, afterScreenUpdates: true)
        }
    }
}

but that leads to the app freezing and printing an infinite loop of [CAMetalLayer nextDrawable] returning nil because allocation failed. The same thing happens when I try:

return renderer.image { ctx in
    view.layer.render(in: ctx.cgContext)
}

Now that SceneKit is deprecated, I didn't want to start a new app using deprecated APIs.
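If dropping from the SwiftUI RealityView down to a UIKit-hosted ARView is acceptable, ARView does offer a built-in snapshot API; a sketch is below. This sidesteps RealityView rather than answering for it:

import RealityKit
import UIKit

// ARView exposes an asynchronous snapshot of the rendered frame.
func capture(from arView: ARView) {
    arView.snapshot(saveToHDR: false) { image in
        guard let image else {
            print("Snapshot failed")
            return
        }
        // Use the UIImage: save, share, etc.
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    }
}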
Replies: 3 · Boosts: 0 · Views: 1.1k · Latest activity: Sep ’25
Apple Vision Pro Developer Strap: Video Out and Recording Time Limit
Is there any way to extend the video recording time limit in Reality Composer Pro beyond 3:00, such as by editing preferences in Terminal or some other workaround? Is there any way to use the strap and a USB-C cable as a live video stream input source that would mirror to QuickTime or some other video capture tool? I am assuming there is no online documentation or user manual for the strap, but please correct me if I'm wrong. Thank you.
Replies: 1 · Boosts: 0 · Views: 176 · Latest activity: Jun ’25
Unwanted file changes in Reality Composer Pro project when using Git
Hello, I'm working on a visionOS project that uses Reality Composer Pro, and we are managing our project files with Git. We've noticed that simply opening and closing the Reality Composer Pro application consistently generates changes in the following files, even when no explicit modifications have been made by the developer:

{ProjectName}/Packages/RealityKitContent/Package.realitycomposerpro/PluginData/*******/ShaderGraphEditorPluginID/ShaderGraphEditorPluginID
{ProjectName}/Packages/RealityKitContent/Package.realitycomposerpro/WorkspaceData/SceneMetadataList.json

Could you please clarify the purpose of these files? Why do they appear as modified when no direct changes are made from our end? More importantly, is it safe to add these files to our .gitignore to prevent them from being tracked by Git? We are concerned that ignoring these files might lead to unexpected issues or inconsistencies when other team members pull the latest changes, especially if these files contain critical project metadata or state that needs to be synchronized.

Any insights or recommended best practices for managing Reality Composer Pro projects with Git would be greatly appreciated. Thank you for your time and assistance.
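If the team concludes these are machine-generated editor state, .gitignore entries along these lines would stop tracking them; treat this as illustrative only, since whether ignoring them is safe is exactly the open question here (files already tracked would also need git rm --cached):

# Reality Composer Pro editor state (regenerated on open/close)
**/Package.realitycomposerpro/PluginData/
**/Package.realitycomposerpro/WorkspaceData/SceneMetadataList.json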
Replies: 1 · Boosts: 0 · Views: 243 · Latest activity: Jun ’25
Bones/joints data issue - USD file export from Blender to RCP
Hi, I'm developing a prototype visionOS game. How can I access bone or joint information when exporting a USD file from Blender to Reality Composer Pro? The animation plays fine in RCP, and the joints' information is correctly embedded in the USDA file (verified with usdchecker). However, RCP does not read it from USDA, USDC, or USDZ. It should be possible based on Apple's WWDC24 session "Compose interactive 3D content in Reality Composer Pro". I want to attach and detach an entity to a particular bone at certain moments, so I need the bone data. They are standard Mixamo animations, and my mesh is a single unified mesh. Using Blender 4.4.
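A hedged sketch of reading skeleton data at runtime: ModelEntity exposes jointNames and jointTransforms, so one approach is to look up a joint index by name and read its transform each frame. The joint name below is illustrative, and jointTransforms are local (relative to each joint's parent), so a world-space attachment would need to walk the joint hierarchy:

import RealityKit

// Find the index of a named joint on a skinned ModelEntity.
func jointIndex(named name: String, in model: ModelEntity) -> Int? {
    // Mixamo joints typically have names like "mixamorig:RightHand".
    model.jointNames.firstIndex { $0.contains(name) }
}

// Read that joint's local transform (parent-relative).
func localTransform(ofJoint index: Int, in model: ModelEntity) -> Transform {
    model.jointTransforms[index]
}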
Replies: 4 · Boosts: 0 · Views: 754 · Latest activity: Sep ’25