Hey there,
I tried to install GPTK again, since I had to reinstall the OS for unrelated reasons. But every time I try to install the toolkit, it fails with the error Error: apple/apple/game-porting-toolkit 1.1 did not build. Before that error occurred, I had an OpenSSL error, which I fixed with the rbenv version of OpenSSL. Is there any way to fix this error? Below you'll find the full error message it gave me. The specs for my Mac, in case they help: M1 Pro MBP 14" with 16 GB RAM and a 512 GB SSD.
Thanks!
```
Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so at (not Homebrew/brew or Homebrew/homebrew-core):
https://github.com/apple/homebrew-apple/issues
```
Problem Summary
After upgrading to iOS 26.1 and 26.2, I'm experiencing a particle positioning bug in RealityKit where ParticleEmitterComponent particles render at an incorrect offset relative to their parent entity. This behavior does not occur on iOS 18.6.2 or earlier versions, suggesting a regression introduced in the newer OS builds.
Environment Details
Operating System: iOS 26.1 & iOS 26.2
Framework: RealityKit
Xcode Version: 16.2 (16C5032a)
Expected vs. Actual Behavior
Expected: Particles should render at the position of the entity to which the ParticleEmitterComponent is attached, matching the behavior on iOS 18.6.2 and earlier.
Actual: Particles appear away from their parent entity, creating a visual misalignment that breaks the intended AR experience.
Steps to Reproduce
Create or open an AR application with RealityKit that uses particle components
Attach a ParticleEmitterComponent to an entity via a custom system
Run the application on iOS 26.1 or iOS 26.2
Observe that particles render at an offset position away from the entity
Minimal Code Example
Here's the setup from my test case:
Custom Component & System:
```swift
struct SparkleComponent4: Component {}

class SparkleSystem4: System {
    static let query = EntityQuery(where: .has(SparkleComponent4.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            // Only add once
            if entity.components.has(ParticleEmitterComponent.self) { continue }
            var newEmitter = ParticleEmitterComponent()
            newEmitter.mainEmitter.color = .constant(.single(.red))
            entity.components.set(newEmitter)
        }
    }
}
```
AR Setup:
```swift
let material = SimpleMaterial(color: .gray, roughness: 0.15, isMetallic: true)
let model = Entity()
model.components.set(ModelComponent(mesh: boxMesh, materials: [material]))
model.components.set(SparkleComponent4())
model.position = [0, 0.05, 0]
model.name = "MyCube"

let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: [0.2, 0.2]))
anchor.addChild(model)
arView.scene.addAnchor(anchor)
```
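One isolation test I'm considering (a sketch, not a confirmed workaround): put the emitter on a dedicated child entity at the identity transform, so any offset inherited from the parent shows up on its own:

```swift
// Sketch of an isolation test (hypothetical, not a confirmed workaround):
// host the emitter on a child entity at the identity transform so the particle
// offset can be compared directly against the parent's visual position.
let emitterHolder = Entity()
emitterHolder.transform = Transform.identity
var emitter = ParticleEmitterComponent()
emitter.mainEmitter.color = .constant(.single(.red))
emitterHolder.components.set(emitter)
model.addChild(emitterHolder)
```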
Questions for the Community
Has anyone else encountered this particle positioning issue after updating to iOS 26.1/26.2?
Are there known workarounds or configuration changes to ParticleEmitterComponent that restore correct positioning?
Is this a confirmed bug, or could there be a change in coordinate system handling or transform inheritance that I'm missing?
Additional Information
I've already submitted this issue via Feedback Assistant (FB21346746).
I have implemented Game Center for authentication and for saving the player's game data. Both authentication and saving the player's data work correctly all the time, but there is a problem with fetching and loading the data.
The game works like this:
1. At startup, I start the authentication.
2. After the player successfully logs in, I start loading the player's data by calling the fetchSavedGames method.
3. If game data exists for the player, I receive a list of SavedGame objects containing the player's data.
The problem is that after I uninstall the game and install it again, sometimes the SavedGame list is empty (step 3). But if I don't uninstall the game and just reopen it, this process works fine.
Here's the complete code of the Game Center implementation:
```swift
import GameKit
import UIKit

class GameCenterHandler {
    public func signIn() {
        GKLocalPlayer.local.authenticateHandler = { viewController, error in
            if let viewController = viewController {
                // Present the Game Center login UI from the app's root view controller.
                UIApplication.shared.windows.first?.rootViewController?
                    .present(viewController, animated: false)
                return
            }
            if error != nil {
                // Player could not be authenticated.
                // Disable Game Center in the game.
                return
            }
            // Auth successful
            self.load(filename: "TestFileName")
        }
    }

    public func save(filename: String, data: String) {
        if GKLocalPlayer.local.isAuthenticated {
            GKLocalPlayer.local.saveGameData(Data(data.utf8), withName: filename) { savedGame, error in
                if savedGame != nil {
                    // Data saved successfully
                }
                if error != nil {
                    // Error in saving game data!
                }
            }
        } else {
            // Error in saving game data! User is not authenticated
        }
    }

    public func load(filename: String) {
        if GKLocalPlayer.local.isAuthenticated {
            GKLocalPlayer.local.fetchSavedGames { games, error in
                if let game = games?.first(where: { $0.name == filename }) {
                    game.loadData { data, error in
                        if data != nil {
                            // Data loaded successfully
                        }
                        if error != nil {
                            // Error in loading game data!
                        }
                    }
                } else {
                    // Error in loading game data! Filename not found
                }
            }
        } else {
            // Error in loading game data! User is not authenticated
        }
    }
}
```
I have also added the Game Center and iCloud capabilities in Xcode. In the iCloud section, I selected iCloud Documents and added a container.
I found a similar question here, but it doesn't make things clearer.
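My working theory (an assumption, not something I've confirmed): saved games sync through iCloud, so immediately after a fresh install the list may simply be empty until the first sync completes. A retry sketch along those lines, added inside GameCenterHandler (the 2-second delay and 3 attempts are arbitrary):

```swift
// Hypothetical retry (assumes the empty list is a not-yet-synced iCloud state):
// poll fetchSavedGames a few times with a delay before treating data as absent.
func loadWithRetry(filename: String, attemptsLeft: Int = 3) {
    GKLocalPlayer.local.fetchSavedGames { games, error in
        if let game = games?.first(where: { $0.name == filename }) {
            game.loadData { data, error in
                // Data loaded successfully (or handle the error)
            }
        } else if attemptsLeft > 0 {
            // Give iCloud a moment to deliver the saved games, then try again.
            DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
                self.loadWithRetry(filename: filename, attemptsLeft: attemptsLeft - 1)
            }
        } else {
            // Genuinely no saved game for this player.
        }
    }
}
```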
My procedural animation using IKRig occasionally jitters and throws this warning. It completely breaks immersion.
getMoCapTask: Rig must be of type kCoreIKSolverTypePoseRetarget
Any idea where to start looking or what could be causing this?
We want to integrate Game Center into a game app. Users can create multiple characters in the game. Suppose a user has created two characters, Character 1 and Character 2:
After the user binds Character 1 to Game Center, its data is reported to Game Center. Now suppose the player wants to unbind Character 1 from Game Center and, after unbinding, bind Character 2 instead. In that case, is Character 1's data retained in Game Center, or will it be removed?
I am trying to implement a CharacterControllerComponent using the following documentation:
https://developer.apple.com/documentation/realitykit/charactercontrollercomponent
I have written sample code, but PhysicsSimulationEvents.WillSimulate is not executed and nothing happens.
```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    let gravity: SIMD3<Float> = [0, -50, 0]
    let jumpSpeed: Float = 10

    enum PlayerInput {
        case none, jump
    }

    @State private var testCharacter: Entity = Entity()
    @State private var myPlayerInput = PlayerInput.none

    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
                testCharacter = immersiveContentEntity.findEntity(named: "Capsule")!
                testCharacter.components.set(CharacterControllerComponent())

                let _ = content.subscribe(to: PhysicsSimulationEvents.WillSimulate.self, on: testCharacter) { event in
                    print("subscribe run")
                    let deltaTime: Float = Float(event.deltaTime)
                    var velocity: SIMD3<Float> = .zero
                    var isOnGround: Bool = false

                    // RealityKit automatically adds `CharacterControllerStateComponent` after moving the character for the first time.
                    if let ccState = testCharacter.components[CharacterControllerStateComponent.self] {
                        velocity = ccState.velocity
                        isOnGround = ccState.isOnGround
                    }

                    if !isOnGround {
                        // Gravity is a force, so you need to accumulate it for each frame.
                        velocity += gravity * deltaTime
                    } else if myPlayerInput == .jump {
                        // Set the character's velocity directly to launch it in the air when the player jumps.
                        velocity.y = jumpSpeed
                    }

                    testCharacter.moveCharacter(by: velocity * deltaTime, deltaTime: deltaTime, relativeTo: nil) { event in
                        print("playerEntity collided with \(event.hitEntity.name)")
                    }
                }
            }
        }
    }
}
```
The scene is loaded from RCP. It is simple, just a capsule on a pedestal.
Do I need additional code to get testCharacter moving from this state?
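One thing I want to rule out (an assumption on my part, not a confirmed cause): the EventSubscription returned by content.subscribe is discarded by the let _ =, so it might be cancelled right away. Retaining it in view state would look roughly like this:

```swift
// Sketch: keep the subscription token alive for the view's lifetime.
@State private var simulationSubscription: EventSubscription?

// ...then, inside the RealityView make closure, instead of `let _ = ...`:
// simulationSubscription = content.subscribe(to: PhysicsSimulationEvents.WillSimulate.self,
//                                            on: testCharacter) { event in
//     // same body as above
// }
```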
Hello experts, I'm trying to implement a material with custom shader code, but I saw that visionOS doesn't let you inject custom Metal functions or use CustomMaterial the way iOS/macOS do, nor can you write Metal Shading Language (.metal) directly and use it through ShaderGraphMaterial. So my question is: if I want to implement my own shader code, how should I do it?
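In case it helps frame the question: the route that does exist on visionOS is a shader graph authored in Reality Composer Pro and loaded at runtime as a ShaderGraphMaterial. A minimal loading sketch (the file name and material path are hypothetical; assumes the material lives in the app's RealityKitContent bundle):

```swift
import RealityKit
import RealityKitContent

// Illustrative sketch: load a Reality Composer Pro shader graph as a material.
// "MaterialsScene.usda" and "/Root/MyMaterial" are placeholder names.
func loadShaderGraphMaterial() async throws -> ShaderGraphMaterial {
    try await ShaderGraphMaterial(
        named: "/Root/MyMaterial",       // path of the material inside the scene
        from: "MaterialsScene.usda",     // scene file in the content bundle
        in: realityKitContentBundle
    )
}
```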
Hello, when I try to customize my phone's icons, the apps that sit inside group folders don't get recolored; they just show up as all-black images. I don't understand why changing the color of apps inside a group produces black icons instead of the color I chose.
Hi,
it seems MSL is missing support for a clock() shader instruction, which is available in other graphics APIs such as Vulkan or OpenGL.
It's useful for counting the cost, in clock cycles, of some code inside a shader, with much finer granularity than launching a micro-kernel with the same instructions and measuring the cycle cost from the CPU.
It would also be useful for MoltenVK, which could then support the corresponding extension.
Thanks.
Hello everyone! I am trying to wrap a ViewModifier inside a Swift Package that bundles a metal shader file to be used in the modifier. Everything works as expected in the Preview, in the Simulator and on a real device for iOS. It also works in Preview and in the Simulator for tvOS but not on a real AppleTV. I have tried this on a 4th generation Apple TV running tvOS 26.3 using Xcode 26.2.0.
The metallib is processed and exists in the bundle. Xcode logs the following:
```
Compiler failed to build request
precondition failure: pipeline error: custom_effect-fg2a5cia7fmha4: error: unresolved visible function reference: custom_fn
Reason: visible function not loaded
```
(the same three-line error repeats six times)
Contents of Package.swift:
```swift
import PackageDescription

let package = Package(
    name: "Test",
    platforms: [
        .iOS(.v17),
        .tvOS(.v17)
    ],
    products: [
        .library(
            name: "Test",
            targets: [
                "Test"
            ]
        )
    ],
    targets: [
        .target(
            name: "Test",
            resources: [
                .process("Shaders")
            ]
        ),
        .testTarget(
            name: "TestTests",
            dependencies: [
                "Test"
            ]
        )
    ]
)
```
Content of my Metal file:
```metal
#include <metal_stdlib>
using namespace metal;

[[ stitchable ]] float2 complexWave(float2 position, float time, float2 size, float speed, float strength, float frequency) {
    float2 normalizedPosition = position / size;
    float moveAmount = time * speed;
    position.x += sin((normalizedPosition.x + moveAmount) * frequency) * strength;
    position.y += cos((normalizedPosition.y + moveAmount) * frequency) * strength;
    return position;
}
```
And my ViewModifier:
```swift
import MetalKit
import SwiftUI

extension ShaderFunction {
    static let complexWave: ShaderFunction = {
        ShaderFunction(
            library: .bundle(.module),
            name: "complexWave"
        )
    }()
}

extension Shader {
    static func complexWave(arguments: [Shader.Argument]) -> Shader {
        Shader(function: .complexWave, arguments: arguments)
    }
}

struct WaveModifier: ViewModifier {
    let start: Date = .now

    func body(content: Content) -> some View {
        TimelineView(.animation) { context in
            let delta = context.date.timeIntervalSince(start)
            content
                .visualEffect { view, proxy in
                    view.distortionEffect(
                        .complexWave(
                            arguments: [
                                .float(delta),
                                .float2(proxy.size),
                                .float(0.5),
                                .float(8),
                                .float(10)
                            ]
                        ),
                        maxSampleOffset: .zero
                    )
                }
        }
        .onAppear {
            let paths = Bundle.module.paths(forResourcesOfType: "metallib", inDirectory: nil)
            print(paths)
        }
    }
}

extension View {
    public func wave() -> some View {
        modifier(WaveModifier())
    }
}

#Preview {
    Image(systemName: "cart")
        .wave()
}
```
Any help is appreciated.
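One runtime cross-check that might help narrow this down (a sketch; it assumes the processed library is named default.metallib, so substitute whatever name the print(paths) call above reports): load the library directly with Metal on the Apple TV and dump its function names, to see whether complexWave is visible there at all.

```swift
import Metal

// Diagnostic sketch: verify the stitchable function is present in the bundled
// metallib at runtime. "default" is an assumed resource name.
func dumpShaderFunctions() {
    guard let device = MTLCreateSystemDefaultDevice(),
          let url = Bundle.module.url(forResource: "default", withExtension: "metallib"),
          let library = try? device.makeLibrary(URL: url) else {
        print("Could not load the metallib")
        return
    }
    print(library.functionNames) // should contain "complexWave"
}
```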
I want to use RealityKit to create a custom material that can use my own shader code and supports mesh instancing (for rendering 3D Gaussian splatting), but I found that CustomMaterial is not supported on visionOS. Is there another interface that can achieve this? Where can I find examples?
I want to fade objects in and out, and while setting an entity's OpacityComponent works, animating it doesn't seem to do anything.
In the following code the second sphere should fade out, but it keeps its initial opacity. On the other hand, the animation that changes its transform works. What am I doing wrong?
```swift
import AppKit
import RealityKit

class ViewController: NSViewController {
    override func loadView() {
        let arView = ARView(frame: NSScreen.main!.frame)

        let anchor = AnchorEntity(.world(transform: matrix_identity_float4x4))
        arView.scene.addAnchor(anchor)

        let sphere = ModelEntity(mesh: .generateSphere(radius: 0.5))
        anchor.addChild(sphere)
        sphere.components.set(OpacityComponent(opacity: 0.1))

        let sphere2 = ModelEntity(mesh: .generateSphere(radius: 0.5))
        sphere2.position = .init(x: 0.2, y: 0, z: 0)
        anchor.addChild(sphere2)
        sphere2.components.set(OpacityComponent(opacity: 0.1))

        sphere.playAnimation(try! AnimationResource.makeActionAnimation(for: FromToByAction(to: 0, timing: .linear), duration: 1, bindTarget: .opacity))
        sphere.playAnimation(try! AnimationResource.makeActionAnimation(for: FromToByAction(to: Transform(translation: SIMD3(x: 0.1, y: 0, z: 0)), timing: .linear), duration: 1, bindTarget: .transform))

        view = arView
    }
}
```
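Two things may be worth checking: whether the second playAnimation call preempts the first (both run on sphere), and whether a FromToByAnimation bound to the opacity target behaves differently from the action animation. A sketch of the latter (assuming the entity already has an OpacityComponent, as above):

```swift
// Alternative fade sketch: a from-to animation bound to the opacity target.
let fade = FromToByAnimation<Float>(
    from: 1.0,
    to: 0.0,
    duration: 1.0,
    timing: .linear,
    bindTarget: .opacity
)
if let resource = try? AnimationResource.generate(with: fade) {
    sphere2.playAnimation(resource)
}
```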
I've been using Metal compute shaders for lattice quantum chromodynamics simulations and wanted to share the experience in case others are doing scientific computing on Metal.
The workload involves SU(2) matrix operations on 4D lattice grids — lots of 2x2 and 3x3 complex matrix multiplies, reductions over lattice sites, and nearest-neighbor stencil operations. The implementation bridges a C++ scientific framework (Grid) to Metal via Objective-C++ .mm files, with MSL kernels compiled into .metallib archives during the build.
Things that work well:
Shared memory on M-series eliminates the CPU↔GPU copy overhead that dominates in CUDA workflows
The .metallib compilation integrates cleanly with autotools builds using xcrun
Float4 packing for SU(2) matrices maps naturally to MSL vector types
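To make that last bullet concrete, here is an illustrative sketch of the packing (mine, not code from the linked repo), written in Swift with simd: an SU(2) matrix [[a, b], [-conj(b), conj(a)]] stored as (Re a, Im a, Re b, Im b) in one 4-vector, with the group product done in packed form.

```swift
import simd

// SU(2) element [[a, b], [-conj(b), conj(a)]] packed as (Re a, Im a, Re b, Im b).
typealias SU2 = SIMD4<Float>

// Complex multiply: (a.x + i a.y) * (b.x + i b.y)
func cmul(_ a: SIMD2<Float>, _ b: SIMD2<Float>) -> SIMD2<Float> {
    SIMD2(a.x * b.x - a.y * b.y, a.x * b.y + a.y * b.x)
}

func conj(_ a: SIMD2<Float>) -> SIMD2<Float> { SIMD2(a.x, -a.y) }

// Group product: a = a1*a2 - b1*conj(b2), b = a1*b2 + b1*conj(a2).
func su2mul(_ u: SU2, _ v: SU2) -> SU2 {
    let a1 = SIMD2(u.x, u.y), b1 = SIMD2(u.z, u.w)
    let a2 = SIMD2(v.x, v.y), b2 = SIMD2(v.z, v.w)
    let a = cmul(a1, a2) - cmul(b1, conj(b2))
    let b = cmul(a1, b2) + cmul(b1, conj(a2))
    return SU2(a.x, a.y, b.x, b.y)
}
```

The same arithmetic transliterates one-to-one into MSL's float2/float4 types for the kernel side.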
Things I'm still figuring out:
Optimal threadgroup sizes for stencil operations on 4D grids
Whether to use MTLHeap for gauge field storage or stick with individual buffers
Best practices for double precision — some measurements need float64 but Metal's double support varies by hardware
The application is measuring chromofield flux distributions between static quarks, ultimately targeting multi-quark systems. Production runs are on MacBook Pro M-series and Mac Studio.
Code: https://github.com/ThinkOffApp/multiquark-lattice-qcd
We are developing a video processing app that applies CIFilter chains to video frames. To not force the user to keep the app foregrounded, we were happy to see the introduction of BGContinuedProcessingTask to continue processing when backgrounded.
With iOS 26, I was excited to see the com.apple.developer.background-tasks.continued-processing.gpu entitlement, which should allow GPU access in the background. Even the article in the documentation provides "exporting video in a film-editing app" or "applying visual filters (HDR, etc) or compressing images for social media posts" as use cases. However, when I check BGTaskScheduler.shared.supportedResources.contains(.gpu) at runtime, it returns false on every iPhone I've tested (including iPhone 15 Pro and iPhone 16 Pro).
From forum responses I've seen, it sounds like background GPU access is currently limited to iPad only. If that's the case, I have a few questions:
Is this an intentional, permanent limitation — or is iPhone support planned for a future iOS release?
What is the recommended approach for GPU-dependent background work on iPhone? My custom CIKernels are written in Metal (as Apple recommends since CIKL is deprecated), but Metal CIKernels cannot fall back to CPU rendering. This creates a situation where Apple's own deprecation guidance (migrate to Metal) conflicts with background processing realities (no GPU on iPhone).
Should developers maintain deprecated CIKL kernel versions alongside Metal kernels purely as a CPU fallback for background execution? That feels like it defeats the purpose of the migration.
It seems like a gap in the platform: the API exists, the entitlement exists, but the hardware support isn't there for the most common device category. Any clarity on Apple's direction here would be very helpful.
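For reference, the runtime gate described above, with the fallback decision made explicit (a sketch using only the API named in this post; the branch bodies are placeholders):

```swift
import BackgroundTasks

// Sketch: decide at runtime whether GPU-backed continued processing is available.
// On devices where this returns false (currently iPhones, per the observations
// above), the export has to stay in the foreground or use a CPU-capable pipeline.
let gpuAvailable = BGTaskScheduler.shared.supportedResources.contains(.gpu)
if gpuAvailable {
    // Submit the BGContinuedProcessingTask and render with Metal CIKernels.
} else {
    // Placeholder: keep the user in-app during export, or fall back to a
    // CPU-renderable filter chain where one exists.
}
```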
I used Xcode's GPU capture to profile my game's render pipeline bandwidth, and found that the depth buffer and stencil buffer share the same buffer, whose format is Depth32Float_Stencil8.
But why, within a single pass of the pipeline, was this buffer loaded twice, with the Load Attachment Size in Encoder Statistics doubled?
Is this a bug in Xcode's GPU capture, or did the pass really load the buffer twice?
I have been tasked with creating content for the Apple Vision Pro. Just the 3D content and animation, not the programming end of things.
I can't seem to get any kind of mesh deformation animation to import into Reality Composer Pro. By that I mean bones/skin, or even point cache.
I work on PC, and my main software is 3DS Max, but I'm borrowing an iMac for this job, and was instructed to use RCP on it for testing before handing things off to the programmer. My files open and play fine in other USD programs, like Omniverse, or USD View, just not Reality Composer Pro.
I've seen the dinosaur demo in AVP, so I know mesh deformation is possible. If there are other essential tools that might make this possible, I have not been made aware of them.
I am experimenting with bouncing things off of Blender, in case that exports better, but I'm not really having luck there either, though my results are different.
Thanks.
I am working on a remote control application for macOS where I need to maintain two "virtual" cursors:
Remote Cursor: Follows the remote user's movements.
Local Cursor: Follows the local user's physical mouse/trackpad movements.
To move the system cursor (for the remote side), I use CGWarpMouseCursorPosition as follows:
```cpp
void DualCursorMac::UpdateSystemCursorPosition(int x, int y) {
    CGPoint point = CGPointMake(static_cast<CGFloat>(x), static_cast<CGFloat>(y));
    // Warp the cursor to match remote coordinates
    CGWarpMouseCursorPosition(point);
}
```
Meanwhile, I use a CGEventTap to monitor local physical movements to update my local virtual cursor's UI:
```cpp
CGEventRef Mouse::MouseTapCallback(CGEventTapProxy proxy, CGEventType type, CGEventRef event, void *refcon) {
    if (remoteControlMode) {
        // We want to suppress system cursor movement but still read the delta
        const int deltaX = static_cast<int>(CGEventGetIntegerValueField(event, kCGMouseEventDeltaX));
        const int deltaY = static_cast<int>(CGEventGetIntegerValueField(event, kCGMouseEventDeltaY));
        NSLog(@"MouseTapCallback: delta:(%d, %d)", deltaX, deltaY);
        // Update local virtual cursor UI based on deltas...
        return nullptr; // Consume the event
    }
    return event;
}
```
The Problem:
When CGWarpMouseCursorPosition is called frequently to update the system cursor, it interferes with the kCGMouseEventDeltaX/Y values in the Event Tap.
Specifically, if the local user moves the trackpad slowly (expecting deltas of 1 or 2), but a "Warp" occurs simultaneously (e.g., jumping the cursor from (100, 100) to (300, 300)), the deltaX and deltaY in the callback suddenly spike to very large values. It seems the system calculates the delta based on the new warped position rather than the pure physical displacement of the trackpad.
This makes the local virtual cursor "jump" erratically and makes it impossible to track smooth local movement during remote control.
My Question:
Is there a way to get the "raw" or "pure" physical relative movement (delta) from the trackpad/mouse that is independent of the system cursor's absolute position or warping?
Are there alternative APIs (perhaps IOKit or different CGEvent fields) that would allow me to get consistent deltas even when the cursor is being warped programmatically?
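On the IOKit thought: one candidate I haven't verified against warping (shown as a Swift sketch for brevity) is reading raw pointer deltas through IOHIDManager, which reports device motion rather than cursor position, so programmatic warps should not affect the values. Note this requires the Input Monitoring permission.

```swift
import IOKit.hid

// Sketch (assumption, not a verified fix): read raw mouse/trackpad deltas via HID.
let manager = IOHIDManagerCreate(kCFAllocatorDefault, IOOptionBits(kIOHIDOptionsTypeNone))
let criteria: [String: Int] = [
    kIOHIDDeviceUsagePageKey as String: kHIDPage_GenericDesktop,
    kIOHIDDeviceUsageKey as String: kHIDUsage_GD_Mouse,
]
IOHIDManagerSetDeviceMatching(manager, criteria as CFDictionary)
IOHIDManagerRegisterInputValueCallback(manager, { _, _, _, value in
    let element = IOHIDValueGetElement(value)
    let delta = IOHIDValueGetIntegerValue(value)
    // X/Y elements carry per-report relative motion, independent of the cursor.
    if IOHIDElementGetUsagePage(element) == UInt32(kHIDPage_GenericDesktop) {
        switch IOHIDElementGetUsage(element) {
        case UInt32(kHIDUsage_GD_X): print("raw dx:", delta)
        case UInt32(kHIDUsage_GD_Y): print("raw dy:", delta)
        default: break
        }
    }
}, nil)
IOHIDManagerScheduleWithRunLoop(manager, CFRunLoopGetCurrent(), CFRunLoopMode.defaultMode.rawValue)
IOHIDManagerOpen(manager, IOOptionBits(kIOHIDOptionsTypeNone))
```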
When I add a WebP image to the asset catalog I get this warning:
/Users/..../Assets.xcassets: The image set "Card-Back" references a file "Card-Back.webp", but that file does not have a valid extension.
Everything works; I see all my images perfectly.
How can I fix the 200+ warnings?
hi everyone,
We found a Metal-related crash. The app uses Metal APIs; while running performance tests we enabled Settings > Developer > Show Graphics HUD. After launching the app, the HUD displayed normally, but the app crashed soon afterwards. The main log information is as follows:
```
Incident Identifier: 1F093635-2DB8-4B29-9DA5-488A6609277B
CrashReporter Key: 233e54398e2a0266d95265cfb96c5a89eb3403fd
Hardware Model: iPhone14,3
Process: waimai [16584]
Path: /private/var/containers/Bundle/Application/CCCFC0AE-EFB8-4BD8-B674-ED089B776221/waimai.app/waimai
Identifier:
Version: 61488 (8.53.0)
Code Type: ARM-64
Parent Process: ? [1]
Date/Time: 2025-06-12 14:41:45.296 +0800
OS Version: iOS 18.0 (22A3354)
Report Version: 104
Monitor Type: Mach Exception
Exception Type: EXC_BAD_ACCESS (SIGBUS)
Exception Codes: KERN_PROTECTION_FAILURE at 0x000000014fffae00
Crashed Thread: 57
Thread 57 Crashed:
0 libMTLHud.dylib esfm_GenerateTriangesForString + 408
1 libMTLHud.dylib esfm_GenerateTriangesForString + 92
2 libMTLHud.dylib Renderer::DrawText(char const*, int, unsigned int) + 204
3 libMTLHud.dylib Overlay::onPresent(id<CAMetalDrawable>) + 1656
4 libMTLHud.dylib CAMetalDrawable_present(void (*)(), objc_object*, objc_selector*) + 72
5 libMTLHud.dylib invocation function for block in void replaceMethod<void>(objc_class*, objc_selector*, void (*)(void (*)(), objc_object*, objc_selector*)) + 56
6 Metal __45-[_MTLCommandBuffer presentDrawable:options:]_block_invoke + 104
7 Metal MTLDispatchListApply + 52
8 Metal -[_MTLCommandBuffer didScheduleWithStartTime:endTime:error:] + 312
9 IOGPU IOGPUNotificationQueueDispatchAvailableCompletionNotifications + 136
10 IOGPU __IOGPUNotificationQueueSetDispatchQueue_block_invoke + 64
11 libdispatch.dylib _dispatch_client_callout4 + 20
12 libdispatch.dylib _dispatch_mach_msg_invoke + 464
13 libdispatch.dylib _dispatch_lane_serial_drain + 368
14 libdispatch.dylib _dispatch_mach_invoke + 456
15 libdispatch.dylib _dispatch_lane_serial_drain + 368
16 libdispatch.dylib _dispatch_lane_invoke + 432
17 libdispatch.dylib _dispatch_lane_serial_drain + 368
18 libdispatch.dylib _dispatch_lane_invoke + 380
19 libdispatch.dylib _dispatch_root_queue_drain_deferred_wlh + 288
20 libdispatch.dylib _dispatch_workloop_worker_thread + 540
21 libsystem_pthread.dylib _pthread_wqthread + 288
```
We tested several different device models; only the iPhone 13 Pro Max crashes.
Q1: Why does this crash occur?
Q2: Given identical logic, why does the crash occur only on the iPhone 13 Pro Max?
Looking forward to your answers.
The sample code just draws a triangle and samples a texture.
Both samples draw the triangle and sample the texture correctly as expected; there are no error messages in the terminal.
The sample using a constexpr sampler can be captured and replayed fine.
The sample using an argument table to bind an MTLSamplerState crashes when captured and replayed with Metal capture in Xcode.
Here are sample codes.
Sample Code
Test Environment:
M1 Pro
macOS 26.3 (25D125)
Xcode Version 26.2 (17C52)
Feedback ID: FB22031701