Hello Apple, through this message I want to draw your attention to some problems with GPTK and Rosetta. Some games, like Marvel's Spider-Man 2, have broken animations and T-pose issues, and others, like Uncharted and The Last of Us, have severe memory leaks. It's my request that you please fix these as soon as possible.
Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and share resources for game developers.
Hi Apple,
In visionOS, for real-time streaming of large 3D scenes, I plan to create Metal buffers and textures on multiple threads and then use a compute shader on the main thread to copy the Metal resources into RealityKit, minimizing main-thread usage. Given that most of RealityKit's default APIs require execution on the main actor (main thread), they are not ideal for streaming data. Is this approach the best way to handle streaming data and real-time rendering?
Thank you very much.
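For what it's worth, here is a minimal sketch of the Metal side of that approach, assuming one shared MTLDevice: staging textures are created and filled on background threads (MTLDevice is documented to be thread-safe), and only the final GPU copy is encoded from the main thread before handing the result to RealityKit, for example via LowLevelTexture. All names here are placeholders, not RealityKit API.

import Metal

// Safe off the main thread: device and texture creation are thread-safe.
func stageTexture(device: MTLDevice,
                  bytes: UnsafeRawPointer,
                  width: Int, height: Int,
                  bytesPerRow: Int) -> MTLTexture? {
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .rgba8Unorm, width: width, height: height, mipmapped: false)
    descriptor.storageMode = .shared
    guard let texture = device.makeTexture(descriptor: descriptor) else { return nil }
    texture.replace(region: MTLRegionMake2D(0, 0, width, height),
                    mipmapLevel: 0, withBytes: bytes, bytesPerRow: bytesPerRow)
    return texture
}

// Encoded on the main thread, but the copy itself runs on the GPU timeline,
// so the main-thread cost is just a few encoder calls.
func blit(from staging: MTLTexture, to destination: MTLTexture,
          queue: MTLCommandQueue) {
    guard let commandBuffer = queue.makeCommandBuffer(),
          let blitEncoder = commandBuffer.makeBlitCommandEncoder() else { return }
    blitEncoder.copy(from: staging, to: destination)
    blitEncoder.endEncoding()
    commandBuffer.commit()
}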
After launching a nearby experience via Quick Look or inside our app, all the users in the group sometimes see the model blinking and becoming transparent.
Just one user doesn't have the issue: either the one who launched SharePlay or the user who force-aligned the immersive space in front.
Weird.
Hi, I'm trying to set the displayScale environment value for a RealityView, so it renders at 2x instead of 3x on the iPhone, but it seems to have no effect.
.environment(\.displayScale, 2.0)
Is this expected behavior, or a bug?
The reason I want it to render at 2x and not at the default 3x is for game optimization and performance.
We are developing a standalone AI avatar application for hospital reception kiosks using a Mac mini (M2/M4). The app runs on SwiftUI + RealityKit, displays on a 75-inch monitor, and uses a USB-connected 4K camera and external sensors (LiDAR/mmWave).
We have several technical concerns regarding the transition from iPadOS to macOS. Could you please provide insights on the following?
ARKit/Vision Framework on macOS with External Camera
On iPadOS, ARKit provides robust face tracking. On macOS with an external USB 4K camera:
Can we achieve real-time face tracking (expression/gaze/depth) with Vision framework or ARKit comparable to iPadOS performance?
Are there any specific limitations for accessing the Neural Engine via Vision framework for real-time 4K video analysis on macOS?
Accessing External Hardware (LiDAR/Sensors) in Sandbox
We plan to connect external LiDAR and mmWave sensors (e.g., Akara) via USB/Bluetooth.
Is it feasible to communicate with these custom drivers/devices within the App Sandbox environment?
Would DriverKit be required, or can we use standard serial communication APIs?
On-Device LLM (MLX) & Thermals
We intend to run a local LLM (e.g., Llama 3 using the MLX framework) for offline conversation, alongside 3D rendering.
With the M2/M4 Mac mini fan design, is there a risk of thermal throttling during 10+ hours of continuous operation (simultaneous CoreML + 3D rendering)?
Is the Mac Studio recommended over the Mac mini for this thermal profile?
Long-running Speech API
Are there any known issues (memory leaks, API limits) when using the Speech framework and AVSpeechSynthesizer continuously for over 10 hours daily?
3D Display Output
Are there any macOS constraints for rendering a SwiftUI window in a specific 3D format (e.g., Side-by-Side) and outputting it via HDMI to a 3D digital signage display (fixed refresh rate/resolution)?
Thank you for your assistance.
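As a data point for the first concern: ARKit face tracking is tied to built-in device cameras, but the Vision framework does run on macOS against frames from any AVFoundation capture device, returning face landmarks (though no depth). A minimal sketch of the per-frame call, under those assumptions:

import Vision
import CoreVideo

// Runs per captured frame, e.g. from an AVCaptureVideoDataOutput callback.
// Vision schedules the work on the Neural Engine / GPU as it sees fit.
func detectFaceLandmarks(in pixelBuffer: CVPixelBuffer) throws -> [VNFaceObservation] {
    let request = VNDetectFaceLandmarksRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
    try handler.perform([request])
    return request.results ?? []
}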
Topic: Graphics & Games, SubTopic: RealityKit
Problem Summary
After upgrading to iOS 26.1 and 26.2, I'm experiencing a particle positioning bug in RealityKit where ParticleEmitterComponent particles render at an incorrect offset relative to their parent entity. This behavior does not occur on iOS 18.6.2 or earlier versions, suggesting a regression introduced in the newer OS builds.
Environment Details
Operating System: iOS 26.1 & iOS 26.2
Framework: RealityKit
Xcode Version: 16.2 (16C5032a)
Expected vs. Actual Behavior
Expected: Particles should render at the position of the entity to which the ParticleEmitterComponent is attached, matching the behavior on iOS 18.6.2 and earlier.
Actual: Particles appear away from their parent entity, creating a visual misalignment that breaks the intended AR experience.
Steps to Reproduce
Create or open an AR application with RealityKit that uses particle components
Attach a ParticleEmitterComponent to an entity via a custom system
Run the application on iOS 26.1 or iOS 26.2
Observe that particles render at an offset position away from the entity
Minimal Code Example
Here's the setup from my test case:
Custom Component & System:
struct SparkleComponent4: Component {}

class SparkleSystem4: System {
    static let query = EntityQuery(where: .has(SparkleComponent4.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            // Only add the emitter once
            if entity.components.has(ParticleEmitterComponent.self) { continue }
            var newEmitter = ParticleEmitterComponent()
            newEmitter.mainEmitter.color = .constant(.single(.red))
            entity.components.set(newEmitter)
        }
    }
}
AR Setup:
let material = SimpleMaterial(color: .gray, roughness: 0.15, isMetallic: true)
let model = Entity()
model.components.set(ModelComponent(mesh: boxMesh, materials: [material]))
model.components.set(SparkleComponent4())
model.position = [0, 0.05, 0]
model.name = "MyCube"
let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: [0.2, 0.2]))
anchor.addChild(model)
arView.scene.addAnchor(anchor)
Questions for the Community
Has anyone else encountered this particle positioning issue after updating to iOS 26.1/26.2?
Are there known workarounds or configuration changes to ParticleEmitterComponent that restore correct positioning?
Is this a confirmed bug, or could there be a change in coordinate system handling or transform inheritance that I'm missing?
Additional Information
I've already submitted this issue via Feedback Assistant (FB21346746).
I've tried out a ParticleEmitter in Reality Composer Pro to produce a burst of particles that don't move (i.e. speed close to zero).
When viewing from different angles, it clearly looks like the particles are rendered exactly in the wrong order, that is, front first and back last. In other words, back particles obscure front particles.
I would prefer it to be the correct way around.
I've only tried this interactively in Reality Composer Pro, not programmatically, but I assume I would get the same result.
My Reality Composer Pro "File" (zipped):
https://gert-rieger-edv.de/Posts/Post-1/RealityParticles.zip
Click on the ParticleEmitter object, then on its Play button, then select the Particles tab and click on "Burst" a few times to get a few random particles.
Mac Studio 2025
Apple M4 Max
macOS 15.7.2 (24G325)
Reality Composer Pro
Version 2.0 (494.60.2)
I'm trying to use MTLBinaryArchive. I collected a BinaryArchive from one device and used metal-tt to translate it for all supported iPhone devices, ranging from iPhone 7 Plus to iPhone 16.
However, this BinaryArchive is quite large, around 1.5GB uncompressed, and about 500MB compressed in the IPA. I'm wondering how to address the size issue.
I watched the WWDC 2022 video, which mentioned that the operating system or the app installation process would handle compatibility. Does that compatibility handling cover different GPU chips? I tried installing an IPA with a BinaryArchive collected only on an iPhone 12 onto an iPhone 13, but the BinaryArchive didn't take effect.
I also saw that Apple supports App Thinning. However, it seems that resources in the Asset Catalog cannot be accessed via URL, and creating an MTLBinaryArchive requires a URL. Is it possible for MTLBinaryArchive to be distributed through App Thinning?
The WWDC 2022 video also mentioned using the -Os optimization flag to reduce size. Is there an estimate of how much size reduction that typically achieves? Are there any methods to solve the BinaryArchive size issue without impacting performance?
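On the Asset Catalog question: since MTLBinaryArchiveDescriptor wants a file URL, the archive has to live as a regular file. Below is a hedged sketch of loading one shipped in the bundle (the file name is a placeholder). On-Demand Resources, unlike Asset Catalog images, are delivered as ordinary bundle files reachable by URL, so they may be worth investigating as a thinning route.

import Foundation
import Metal

enum ArchiveError: Error { case missingFile }

// Loads a prebuilt binary archive from the app bundle. Passing a nil URL in
// the descriptor would create a new, empty archive instead.
func loadBinaryArchive(device: MTLDevice) throws -> MTLBinaryArchive {
    guard let url = Bundle.main.url(forResource: "PipelineCache",
                                    withExtension: "metallib") else {
        throw ArchiveError.missingFile
    }
    let descriptor = MTLBinaryArchiveDescriptor()
    descriptor.url = url
    return try device.makeBinaryArchive(descriptor: descriptor)
}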
Topic: Graphics & Games, SubTopic: Metal
I have code that captures a window and displays a cropped image. The problem is twofold: ScreenCaptureKit doesn't seem to allow me to stop and reconfigure the capture in window mode to capture just a portion of the screen.
So I have to crop and display the cropped image via a published variable. This all works fine, but it seems to stop after some time.
I'm using an M1 with 16 GB of RAM; the program is taking less than 100 MB of memory with 40-70% CPU as the crow flies.
I'm printing capture success in debug mode, and sometimes the frame isn't valid, so I'm guarding against that.
Any ideas on how to improve my strategy?
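One possibility, sketched under the assumption of a display-based capture: SCStreamConfiguration has a sourceRect, and a running SCStream accepts a new configuration without being stopped, which would move the cropping into ScreenCaptureKit instead of the published-image path. The rect values here are placeholders.

import ScreenCaptureKit

// Re-crop a running stream without stopping it. sourceRect is in points,
// relative to the capture source; width/height set the output size.
func crop(stream: SCStream, to rect: CGRect) async throws {
    let configuration = SCStreamConfiguration()
    configuration.sourceRect = rect
    configuration.width = Int(rect.width) * 2    // 2x for Retina output
    configuration.height = Int(rect.height) * 2
    try await stream.updateConfiguration(configuration)
}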
My iOS app generates PDF files.
Every time my users open a generated PDF, the AutoFill popup appears, but my PDF files are NOT meant to be interactive.
I'm here to ask if there's a way to mark my PDF files as "not a form", for example in the metadata or anywhere else?
I'm updating an existing distributed game to add turn-based matches. When the Matchmaker ViewController Info Button next to a game is pressed, the results vary:
iOS 15.x - Button under avatar says "Accept Invite" or "View Game" (depending on if invite has already been accepted)
iOS 18.x - Button always says "App Store" - I assume that means it would lead one to the App Store to install the game.
Both devices (an iPad on 15.x and an iPhone on 18.x) have the same version of the game installed. The results are the same when running in the simulator.
When the game is released, I assume this button will work properly, no?
Topic: Graphics & Games, SubTopic: GameKit
Hi everyone,
I’m currently learning about ParticleEmitterComponent and exploring the sample app provided in the Simulating particles in your visionOS app documentation.
In the sample app, when I set the EmitterPreset to fireworks from the settings panel on the left side of the window and choose SystemImage, I noticed two issues:
The image applied to mainEmitter appears clipped or cropped.
The image on spawnedEmitter does not update to the selected SystemImage.
What I want to achieve:
Apply the same SystemImage to both mainEmitter and spawnedEmitter so that it displays correctly without clipping.
Remove the animation that changes the size of spawnedEmitter over time and keep it at a constant size.
Could someone explain which properties should be adjusted to achieve this behavior? Any guidance or examples would be greatly appreciated.
Thanks in advance!
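Not a confirmed answer, but based on my reading of the ParticleEmitterComponent API the relevant knobs look like the following. The property names (image, sizeMultiplierAtEndOfLifespan) are my assumption and worth verifying against the current headers.

import RealityKit

// Hypothetical sketch: share one texture between both emitters and pin the
// particle size so it no longer animates over the particle's lifespan.
func configure(_ particles: inout ParticleEmitterComponent, texture: TextureResource) {
    particles.mainEmitter.image = texture
    particles.spawnedEmitter?.image = texture

    // 1.0 means "end of life at the same size as birth", i.e. constant size.
    particles.mainEmitter.sizeMultiplierAtEndOfLifespan = 1.0
    particles.spawnedEmitter?.sizeMultiplierAtEndOfLifespan = 1.0
}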
I'd like to display the refresh rate on the screen of my iPhone (14 Pro Max). Does anyone know how to do this?
Topic: Graphics & Games, SubTopic: General
Can "浮遊時計 Premium" (Floating Clock Premium) on the App Store measure the refresh rate in 1 Hz or 10 Hz increments, or at three or more levels?
Topic: Graphics & Games, SubTopic: General
I do not understand how offline leaderboard submission is supposed to work in Game Kit:
While the documentation briefly states that offline submission is supported, how is that even possible when you first have to fetch a leaderboard object in order to then call its submitScore function? How can I get the leaderboard object in the first place when offline?
Can anyone enlighten me how this works? Or maybe point me to some relevant documentation?
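One detail that may resolve the apparent paradox: since iOS 14, GameKit has a class-level submit that takes leaderboard IDs as plain strings, so no GKLeaderboard object needs to be fetched first. A minimal sketch (the leaderboard ID is a placeholder); my understanding is that GameKit queues such submissions and retries when connectivity returns, though I haven't found that spelled out in detail either.

import GameKit

// Submits a score by leaderboard ID alone; no GKLeaderboard fetch required.
func reportScore(_ score: Int) {
    GKLeaderboard.submitScore(score,
                              context: 0,
                              player: GKLocalPlayer.local,
                              leaderboardIDs: ["com.example.highscores"]) { error in
        if let error {
            print("Score submission failed: \(error)")
        }
    }
}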
The "Explore spatial accessory input on visionOS" presentation from WWDC25 interests me. I bought both the Logitech Muse stylus and the PS VR2 Sense controllers to try out with the sculpting app presented by the author, engineer Amanda Han. Unfortunately, the app itself was not included. Could the app be made available for download, along with the Xcode project? I appreciate any assistance the author and your team could provide. Thank you.
Topic: Graphics & Games, SubTopic: RealityKit
Hi
I've noticed an issue in the Metal HUD, but I'm not sure whether it's a bug in the Metal HUD or whether this behavior is intentional.
The Metal HUD has an option to send its data to the system log in a raw format, where the numbers look like
metal-HUD: ,,,,,...,
https://developer.apple.com/documentation/xcode/monitoring-your-metal-apps-graphics-performance/
If the HUD is displayed, it works just fine, but it seems that when the HUD is hidden (with shift-F9), it still sends data to the system log; however, the numbers are the same all the time and are not updated, even though the app is still rendering.
I would expect it to log the data whether or not the HUD is displayed; the stale values of course lead to incorrect FPS calculations.
Here is an example of the system log entries when the HUD is not visible:
Topic: Graphics & Games, SubTopic: General
I think if your buffer is less than 4 KB, it's recommended to use setVertexBytes. The question I have is: can I keep hammering on setVertexBytes as the primary method to issue multiple draw calls within a render pass, and rely on Metal to figure out how to orphan and replace the target buffer?
A lot of the primitives I am drawing are less than 4 KB, and the process of wiring down larger segments of memory as individual buffers for each primitive draw call seems to be a negative.
And it's just simpler to copy, submit and forget about buffer synchronization.
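For concreteness, this is the pattern in question as I understand it: per-draw constants under 4 KB handed to setVertexBytes, one call per primitive, with Metal copying the bytes into encoder-managed storage so the caller never synchronizes or orphans a buffer. Types and buffer indices are placeholders.

import Metal
import simd

struct PrimitiveUniforms {
    var transform: simd_float4x4
    var color: SIMD4<Float>
}

func encodeDraws(primitives: [PrimitiveUniforms],
                 vertexBuffer: MTLBuffer,
                 vertexCount: Int,
                 encoder: MTLRenderCommandEncoder) {
    encoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
    for var uniforms in primitives {
        // Metal copies these bytes immediately, so reusing the local is safe.
        encoder.setVertexBytes(&uniforms,
                               length: MemoryLayout<PrimitiveUniforms>.stride,
                               index: 1)
        encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: vertexCount)
    }
}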
Topic: Graphics & Games, SubTopic: Metal
I'm currently learning Metal. While reading the reference, I came across a strange description.
Page 78 in Version 4 Reference (2025-10-25) says:
It is legal to call the following set_indices functions to set the indices if the position in the index buffer is valid and if the position in the index buffer is a multiple of 2 (uchar2 overload) or 2 (uchar4 overload). The index I needs to be in the range [0, max_indices).
void set_indices(uint I, uchar2 v);
void set_indices(uint I, uchar4 v);
However, it seems that the uchar4 overload should require a multiple of 4.
Furthermore, there is no explanation of what these methods actually do. I believe they set two or four consecutive indices at once, but there is no mention of that here.
I would like to know if the above understanding is correct.
Hi. I'm a 3D designer, using Blender for most of my work. The most recent Blender conference discussed utilizing the Open Shading Language (OSL) in their latest versions, which allows designers to write custom shaders for their workflows.
At the moment, only NVIDIA OptiX GPUs can utilize this language for rendering (from what I understand), but Blender developers stated they are waiting on other GPU manufacturers to implement this feature as well. I'm not sure if there are any licensing issues here, but would this be something Apple could implement in Metal to make its hardware more attractive to the 3D design community?
Any help or knowledge on this topic would be greatly appreciated.