We use the method below to resize an image while compressing it.
Is this method correct, or does it (or the CGBitmapContextCreate call) need correction?
- (UIImage *)resizeImage:(UIImage *)anImage width:(int)width height:(int)height
{
    CGImageRef imageRef = [anImage CGImage];

    // CGBitmapContextCreate doesn't accept kCGImageAlphaNone for RGB contexts,
    // so substitute a skip-last layout.
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    if (alphaInfo == kCGImageAlphaNone)
        alphaInfo = kCGImageAlphaNoneSkipLast;

    // bytesPerRow of 4 * width assumes a 4-byte-per-pixel (32-bit) format.
    CGContextRef bitmap = CGBitmapContextCreate(NULL, width, height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                4 * width,
                                                CGImageGetColorSpace(imageRef),
                                                (CGBitmapInfo)alphaInfo);
    CGContextDrawImage(bitmap, CGRectMake(0, 0, width, height), imageRef);

    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];

    CGContextRelease(bitmap);
    CGImageRelease(ref);
    return result;
}
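For comparison, here is a minimal Swift sketch of the same resize using UIGraphicsImageRenderer, which picks a compatible bitmap configuration (color space, alpha, bytes per row) automatically; the function name and scale choice are illustrative, not from the original post:

import UIKit

// A minimal sketch: resize via UIGraphicsImageRenderer, which chooses a
// compatible bitmap configuration for you. scale = 1 works in pixels.
func resized(_ image: UIImage, to size: CGSize) -> UIImage {
    let format = UIGraphicsImageRendererFormat.default()
    format.scale = 1
    return UIGraphicsImageRenderer(size: size, format: format).image { _ in
        image.draw(in: CGRect(origin: .zero, size: size))
    }
}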
Hello Apple team,
I'm working on an iOS AR app using SwiftUI and RealityKit,
and I was wondering if the Cinematic API can be used with a RealityKit scene. I’d like to achieve a shallow depth of field while keeping the 3D asset in focus, and vice versa.
Thanks!
Matchmaking rules
https://developer.apple.com/documentation/gamekit/matchmaking-rules?language=objc
AppStoreConnectApi rules
https://developer.apple.com/documentation/appstoreconnectapi/rules?language=objc
・Environment
Unity 6000.2.2f1
Xcode 16.1
iOS 26
3 iPhones
・AppStoreConnectApi rules
{
  "type": "gameCenterMatchmakingRuleSets",
  "id": "f6a88caf-85db-42bf-xxxxxxxxxxxxxxxxxxxx",
  "attributes": {
    "referenceName": "co.mygame.RuleSets.GvERandom34",
    "ruleLanguageVersion": 1,
    "minPlayers": 3,
    "maxPlayers": 4
  }
},
{
  "type": "gameCenterMatchmakingRules",
  "id": "6afa68ce-4d2c-496f-xxxxxxxxxxxxxxxxxxxx",
  "attributes": {
    "referenceName": "GameVersion",
    "description": "Check Game Version. GvERandom34",
    "type": "COMPATIBLE",
    "expression": "requests[0].properties.gameVersion == requests[1].properties.gameVersion",
    "weight": null
  }
},
{
  "type": "gameCenterMatchmakingQueues",
  "id": "7fb645ef-4eca-4510-xxxxxxxxxxxxxxxxxxxx",
  "attributes": {
    "referenceName": "co.mygame.que.GvERandom34",
    "classicMatchmakingBundleIds": []
  }
}
・Objective-C Execution code
queueName = "co.mygame.que.GvERandom34"
keyStr = "gameVersion "
valueStr = "1.0"
- (void)MatchQueueParamStr1Start:(NSString *)queueName keyStr:(NSString *)keyStr valueStr:(NSString *)valueStr
{
    // Note: `if (@available(...) == NO)` does not actually guard availability;
    // the compiler only recognizes the plain if/else form.
    if (@available(iOS 17.2, tvOS 17.2, macOS 14.2, visionOS 1.1, *))
    {
        // Supported; fall through.
    }
    else
    {
        DBGLOG(@"MatchQueueParamStr1Start Not supported.");
        return;
    }

    self->_matchMakingFlag = YES;
    self->_matchFinishFlag = NO;
    self->_myMatch = nil;

    GKMatchRequest *req = [[GKMatchRequest alloc] init];
    if (@available(iOS 17.2, tvOS 17.2, macOS 14.2, visionOS 1.1, *))
    {
        // Rule-based matchmaking: the queue name and property keys must
        // match the App Store Connect configuration exactly.
        req.queueName = queueName;
        req.properties = @{keyStr: valueStr};
    }

    [[GKMatchmaker sharedMatchmaker] findMatchForRequest:req withCompletionHandler:^(GKMatch *match, NSError *error)
    {
        if (error)
        {
            [self SetupErrorInfo:error descriptionText:@"findMatchForRequest"];
        }
        else if (match)
        {
            self->_myMatch = match;
            self->_myMatch.delegate = self;
        }
        self->_matchMakingFlag = NO;
        self->_matchFinishFlag = YES;
    }];
}
・Problem
I'm trying to match three devices, but matchmaking never completes and times out after 5 minutes. What's the problem?
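For cross-checking, here is a minimal Swift sketch of the same request (the names mirror the Objective-C above; this is a sketch, not a fix). Note that the property key on the request must match the property name in the rule expression (requests[n].properties.gameVersion) exactly, including any whitespace:

import GameKit

// A minimal sketch of the same rule-based matchmaking request.
@available(iOS 17.2, *)
func startMatchmaking() {
    let request = GKMatchRequest()
    request.queueName = "co.mygame.que.GvERandom34"
    // The key must exactly match the rule expression's property name.
    request.properties = ["gameVersion": "1.0"]
    GKMatchmaker.shared.findMatch(for: request) { match, error in
        if let error {
            print("findMatch failed: \(error)")
        } else if let match {
            print("matched; players: \(match.players.count)")
        }
    }
}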
I want to use RealityKit to create a custom material that can use my own shader and support mesh instancing (for rendering 3D Gaussian splatting), but I found that CustomMaterial is not supported on visionOS. Is there another interface that can meet my needs? Where can I find examples?
Topic:
Graphics & Games
SubTopic:
RealityKit
In the Creating A 3D Application With Hydra Rendering tutorial on the Apple Developer website, on the last step where I execute this command:
cmake -S ~/Users/macuser/CreatingA3DApplicationWithHydraRendering/ -B ~/Users/macuser/CreatingA3DApplicationWithHydraRendering/
I keep getting an error:
CMake Error at CMakeLists.txt:5 (include):
include could not find requested file:
/Users/macuser/USDInstall/bin/pxrConfig.cmake
I've followed the instructions in the project's README.md at least 5 times, and I've also tried moving the pxrConfig.cmake file around and copying it into different folders before re-running the command, but I still couldn't generate the project needed to compile and render the HydraPlayer renderer. How do I get CMake to generate the Xcode project for HydraPlayer?
My goal is to print a debug message from a shader. I followed the guide, which says to set the -fmetal-enable-logging Metal compiler flag and the following environment variables:
MTL_LOG_LEVEL=MTLLogLevelDebug
MTL_LOG_BUFFER_SIZE=2048
MTL_LOG_TO_STDERR=1
However, there's an issue with the guide: it only covers Xcode project setup, whereas I'm working on a Swift package. It has a Metal-only target that's included in the main target like this:
targets: [
    // A separate target for shaders.
    .target(
        name: "MetalShaders",
        resources: [
            .process("Metal")
        ],
        plugins: [
            // https://github.com/schwa/MetalCompilerPlugin
            .plugin(name: "MetalCompilerPlugin", package: "MetalCompilerPlugin")
        ]
    ),
    // Main target
    .target(
        name: "MegApp",
        dependencies: ["MetalShaders"]
    ),
    .testTarget(
        name: "MegAppTests",
        dependencies: [
            "MegApp",
            "MetalShaders",
        ]
    )
]
So, to apply the compiler flag, I use MetalCompilerPlugin, which emits debug.metallib; it also lets me define a DEBUG macro for shaders. This code compiles:
#ifdef DEBUG
logger.log_error("Hello There!");
os_log_default.log_debug("Hello thread: %d", gid);
// this proves that the code executes
result.flag = true;
#endif
The environment is set via the .xctestplan and validated to work with ProcessInfo. However, nothing is printed to the Xcode console or to the Console app.
In an attempt to fix it, I tried setting up an MTLLogState; however, makeLogState(descriptor:) fails with an error:
if #available(iOS 18.0, *) {
    let logDescriptor = MTLLogStateDescriptor()
    logDescriptor.level = .debug
    logDescriptor.bufferSize = 2048
    // Error Domain=MTLLogStateErrorDomain Code=2 "Cannot create residency set for MTLLogState: (null)"
    // UserInfo={NSLocalizedDescription=Cannot create residency set for MTLLogState: (null)}
    let logState = try! device.makeLogState(descriptor: logDescriptor)
    commandBufferDescriptor.logState = logState
}
Some LLMs suggested that this is connected to the Simulator, and indeed, I was running the tests in the Simulator. However, the tests won't run on my iPhone... I found a workaround: running them on My Mac (Mac Catalyst). Surprisingly, shader logging works there, even without an MTLLogState. The Simulator behaviour still seems like a bug, though.
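For reference, a minimal sketch of the MTLLogState route with an explicit log handler, run on a physical device (assuming the iOS 18 shader-logging API behaves as documented; the Simulator failure above is exactly what's in question):

import Metal

@available(iOS 18.0, *)
func makeLoggingCommandBuffer(device: MTLDevice, queue: MTLCommandQueue) throws -> MTLCommandBuffer? {
    let logDescriptor = MTLLogStateDescriptor()
    logDescriptor.level = .debug
    logDescriptor.bufferSize = 2048

    let logState = try device.makeLogState(descriptor: logDescriptor)
    // Route shader log messages to our own handler instead of relying on
    // the MTL_LOG_* environment variables.
    logState.addLogHandler { subsystem, category, level, message in
        print("[shader] \(message)")
    }

    let descriptor = MTLCommandBufferDescriptor()
    descriptor.logState = logState
    return queue.makeCommandBuffer(descriptor: descriptor)
}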
Hi,
I wanted to test whether it's possible to use Mesa3D OGLon12 + D3DMetal 2b3 to get GL > 4.1 support in Windows apps via D3DMetal,
using a simple wglgears.c app (similar to glxgears) and running it like:
GALLIUM_DRIVER=d3d12 wine64 wglgears64 -info
with overridden opengl32.dll using contents from:
https://github.com/pal1000/mesa-dist-win/releases/download/24.3.0-rc1/mesa3d-24.3.0-rc1-release-msvc.7z
I get:
[D3DMetal:LOG:5E53] Unsupported API: CreateCommandQueue1
caused by:
https://gitlab.freedesktop.org/mesa/mesa/-/commit/c022c9603d500b59ff5e6f93c8a214d1785ab20a
API:
https://learn.microsoft.com/en-us/windows/win32/api/d3d12/nf-d3d12-id3d12device9-createcommandqueue1
Note that the setup itself is correct, since using:
GALLIUM_DRIVER=llvmpipe wine64 wglgears64 -info
I get:
GL_RENDERER = llvmpipe (LLVM 19.1.3, 128 bits)
GL_VERSION = 4.5 (Compatibility Profile) Mesa 24.3.0-rc1 (git-85ba713d76)
GL_VENDOR = Mesa
GL_EXTENSIONS = GL_ARB_multisample GL_EXT_abgr GL_EXT_bgra GL_EXT_blend_color GL_EXT_blend_minmax GL_EXT_blend_subtract GL_EXT_texture... etc.
I am puzzled by setAddress(_:attributeStride:index:) on MTL4ArgumentTable. Can anyone please explain what the attributeStride parameter is for? The docs say it is "the stride between attributes in the buffer," but why?
Who uses this, and for what? On the shader side, the stride is determined by the C++ type, as far as I know. What am I missing here?
Thanks!
Hi everyone,
We've found a Metal-related crash. Our app uses Metal APIs, and during performance testing we turned on Settings > Developer > Show HUD Graphics. After launching, the app displayed the HUD normally but crashed soon after. The main log information is as follows:
Incident Identifier: 1F093635-2DB8-4B29-9DA5-488A6609277B
CrashReporter Key: 233e54398e2a0266d95265cfb96c5a89eb3403fd
Hardware Model: iPhone14,3
Process: waimai [16584]
Path: /private/var/containers/Bundle/Application/CCCFC0AE-EFB8-4BD8-B674-ED089B776221/waimai.app/waimai
Identifier:
Version: 61488 (8.53.0)
Code Type: ARM-64
Parent Process: ? [1]
Date/Time: 2025-06-12 14:41:45.296 +0800
OS Version: iOS 18.0 (22A3354)
Report Version: 104
Monitor Type: Mach Exception
Exception Type: EXC_BAD_ACCESS (SIGBUS)
Exception Codes: KERN_PROTECTION_FAILURE at 0x000000014fffae00
Crashed Thread: 57
Thread 57 Crashed:
0 libMTLHud.dylib esfm_GenerateTriangesForString + 408
1 libMTLHud.dylib esfm_GenerateTriangesForString + 92
2 libMTLHud.dylib Renderer::DrawText(char const*, int, unsigned int) + 204
3 libMTLHud.dylib Overlay::onPresent(id<CAMetalDrawable>) + 1656
4 libMTLHud.dylib CAMetalDrawable_present(void (*)(), objc_object*, objc_selector*) + 72
5 libMTLHud.dylib invocation function for block in void replaceMethod<void>(objc_class*, objc_selector*, void (*)(void (*)(), objc_object*, objc_selector*)) + 56
6 Metal __45-[_MTLCommandBuffer presentDrawable:options:]_block_invoke + 104
7 Metal MTLDispatchListApply + 52
8 Metal -[_MTLCommandBuffer didScheduleWithStartTime:endTime:error:] + 312
9 IOGPU IOGPUNotificationQueueDispatchAvailableCompletionNotifications + 136
10 IOGPU __IOGPUNotificationQueueSetDispatchQueue_block_invoke + 64
11 libdispatch.dylib _dispatch_client_callout4 + 20
12 libdispatch.dylib _dispatch_mach_msg_invoke + 464
13 libdispatch.dylib _dispatch_lane_serial_drain + 368
14 libdispatch.dylib _dispatch_mach_invoke + 456
15 libdispatch.dylib _dispatch_lane_serial_drain + 368
16 libdispatch.dylib _dispatch_lane_invoke + 432
17 libdispatch.dylib _dispatch_lane_serial_drain + 368
18 libdispatch.dylib _dispatch_lane_invoke + 380
19 libdispatch.dylib _dispatch_root_queue_drain_deferred_wlh + 288
20 libdispatch.dylib _dispatch_workloop_worker_thread + 540
21 libsystem_pthread.dylib _pthread_wqthread + 288
We tested several different models; the crash only occurs on the iPhone 13 Pro Max.
Q1: Why does this crash happen?
Q2: With identical logic, why does the crash occur only on the iPhone 13 Pro Max?
Looking forward to your answer.
Added achievements to my approved app, for the next release version, which I am running in the Simulator. When I look at the Achievements page, I can see that there are 17 achievements available (correct), but they all show as hidden, despite my checking the "No" box in App Store Connect.
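For anyone debugging the same thing, a minimal sketch that prints the hidden flag the client actually receives for each achievement:

import GameKit

// A minimal sketch: dump what the client receives for each achievement,
// including the hidden flag configured in App Store Connect.
func dumpAchievementDescriptions() {
    GKAchievementDescription.loadAchievementDescriptions { descriptions, error in
        if let error {
            print("load failed: \(error)")
            return
        }
        for d in descriptions ?? [] {
            print(d.identifier, "hidden:", d.isHidden)
        }
    }
}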
Topic:
Graphics & Games
SubTopic:
GameKit
This is the first time I have tried adding achievements to my games. The online documentation says:
"Before you can access achievements in your code, you can configure them in Xcode and sync the configuration updates you make with App Store Connect. Begin configuring achievements by creating a GameKit configuration file. In Xcode, choose File > New > File from Template. Select GameKit File, and click Next. In the sheet that appears, enter a name for the configuration and click Create."
Sounds easy - except Xcode 16.2 does not have a GameKit File as one of the templates. Please advise on how to proceed; it would also be nice if the documentation were updated.
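Until the template situation is sorted out, achievements can still be configured directly in App Store Connect and driven from code without the GameKit configuration file; a minimal sketch of reporting progress (the identifier is illustrative):

import GameKit

// A minimal sketch: report progress on an achievement configured in
// App Store Connect. "com.example.first_win" is an illustrative ID.
func reportFirstWin() {
    let achievement = GKAchievement(identifier: "com.example.first_win")
    achievement.percentComplete = 100
    achievement.showsCompletionBanner = true
    GKAchievement.report([achievement]) { error in
        if let error { print("report failed: \(error)") }
    }
}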
Topic:
Graphics & Games
SubTopic:
GameKit
Hi all,
Wondering how I would go about creating a plugin/class to support a new (physical/hardware) device with the Game Controller framework?
Between GCVirtualController on iOS and the "KeyboardAndMouseSupport.bundle" I see inside GameController.framework on my Mac, it looks like the framework must be designed to support this but I can't find any documentation.
Thanks!
On MacBook Pro M3 14" I can profile the Metal App performance by running it, then clicking on the M icon and choosing profile after replay.
On Mac Studio M2 Ultra I cannot: the profiler starts and crashes. I have tried everything including reinstalling the OS, Xcode, the Metal SDK, you name it.
The app uses the Metal 4 API. The content of the replayer errorinfo report is shown at the end.
Any ideas what is going on here and/or what else I can do do root cause this and fix it?
FWIW, it was worse on 26.1 (Xcode just reported Metal 4 profiling not available). In 26.2 Xcode attempts to profile and invariably crashes.
=== Error summary: ===
1x DYErrorDomain (512) - guest app crashed (512)
1x com.apple.gputools.MTLReplayer (100) - Abort trap: 6
=== First Error ===
Domain: DYErrorDomain
Error code: 512
Description: guest app crashed (512)
GTErrorKeyPID: 26913
GTErrorKeyProcessName: GPUToolsReplayService
GTErrorKeyCrashDate: 2026-01-09 19:22:52 +0000
=== Underlying Error #1 ===
Domain: com.apple.gputools.MTLReplayer
Error code: 100
Description: Abort trap: 6
Call stack:
0 GPUToolsReplay 0x0000000249c25850 MakeNSError + 284
1 GPUToolsReplay 0x0000000249c26428 HandleCrashSignal + 252
2 libsystem_platform.dylib 0x00000001856c7744 _sigtramp + 56
3 libsystem_pthread.dylib 0x00000001856bd888 pthread_kill + 296
4 libsystem_c.dylib 0x00000001855c2850 abort + 124
5 libsystem_c.dylib 0x00000001855c1a84 err + 0
6 IOGPU 0x00000001a9ea60a8 -[IOGPUMetal4CommandQueue _commit:count:commitFeedback:].cold.1 + 0
7 IOGPU 0x00000001a9ea0df8 __77-[IOGPUMetal4CommandQueue commitFillArgs:count:args:argsSize:commitFeedback:]_block_invoke + 0
8 IOGPU 0x00000001a9ea1004 -[IOGPUMetal4CommandQueue _commit:count:commitFeedback:] + 148
9 AGXMetalG14X 0x00000001158a2c98 -[AGXG14XFamilyCommandQueue_mtlnext noMergeCommit:count:options:commitFeedback:error:] + 116
10 AGXMetalG14X 0x0000000115a45c14 +[AGXG14XFamilyRenderContext_mtlnext mergeRenderEncoders:count:options:commitFeedback:queue:error:] + 4740
11 AGXMetalG14X 0x00000001158a2b34 -[AGXG14XFamilyCommandQueue_mtlnext commit:count:options:] + 96
12 GPUToolsReplay 0x0000000249bf0644 GTMTLReplayController_defaultDispatchFunction_noPinning + 2744
13 GPUToolsReplay 0x0000000249befb10 GTMTLReplayController_defaultDispatchFunction + 1368
14 GPUToolsReplay 0x0000000249b7a61c _ZL16DispatchFunctionP21GTMTLReplayControllerPK11GTTraceFuncRb + 476
15 GPUToolsReplay 0x0000000249b8603c ___ZN35GTUSCSamplingStreamingManagerHelper19StreamFrameTimeDataEv_block_invoke + 456
16 Foundation 0x0000000186f6c878 __NSBLOCKOPERATION_IS_CALLING_OUT_TO_A_BLOCK__ + 24
17 Foundation 0x0000000186f6c740 -[NSBlockOperation main] + 96
18 Foundation 0x0000000186f6c6d8 __NSOPERATION_IS_INVOKING_MAIN__ + 16
19 Foundation 0x0000000186f6c308 -[NSOperation start] + 640
20 Foundation 0x0000000186f6c080 __NSOPERATIONQUEUE_IS_STARTING_AN_OPERATION__ + 16
21 Foundation 0x0000000186f6bf70 __NSOQSchedule_f + 164
22 libdispatch.dylib 0x00000001855104d0 _dispatch_block_async_invoke2 + 148
23 libdispatch.dylib 0x000000018551aad4 _dispatch_client_callout + 16
24 libdispatch.dylib 0x00000001855056e4 _dispatch_continuation_pop + 596
25 libdispatch.dylib 0x0000000185504d58 _dispatch_async_redirect_invoke + 580
26 libdispatch.dylib 0x0000000185512fc8 _dispatch_root_queue_drain + 364
27 libdispatch.dylib 0x0000000185513784 _dispatch_worker_thread2 + 180
28 libsystem_pthread.dylib 0x00000001856b9e10 _pthread_wqthread + 232
29 libsystem_pthread.dylib 0x00000001856b8b9c start_wqthread + 8
Replayer breadcrumbs:
[
]
GTErrorKeyProcessSignal: SIGABRT
=== Setup ===
Capture device: star.localdomain (Mac14,14) - macOS 26.2 (25C56) - 0BA10D1D-D340-5F2E-934B-536675AF9BA1
Metal version: 370.64.2
Supported graphics APIs:
Metal device: Apple M2 Ultra
Supported GPU families: Apple1 Apple2 Apple3 Apple4 Apple5 Apple6 Apple7 Apple8 Mac1 Mac2 Common1 Common2 Common3 Metal3 Metal4
Replay device: star (Mac14,14) - macOS 26.2 (25C56) - 0BA10D1D-D340-5F2E-934B-536675AF9BA1
Metal version: 370.64.2
Supported graphics APIs:
Metal device: Apple M2 Ultra
Supported GPU families: Apple1 Apple2 Apple3 Apple4 Apple5 Apple6 Apple7 Apple8 Mac1 Mac2 Common1 Common2 Common3 Metal3 Metal4
Host: Mac14,14 - macOS 26.2 (25C56)
Tool: Xcode (17C52)
Known SDKs:
Topic:
Graphics & Games
SubTopic:
Metal
I noticed that MTLPixelFormat has these cases:
case r32Float = 55
case rg32Float = 105
case rgba32Float = 125
But there is no case rgb32Float. What's the reason for this omission?
TL;DR: RealityKit and Reality Composer Pro aren't forward or backward compatible with each other, and the resulting error message is terse and unhelpful. (FB14828873)
So far, I've been sticking with Xcode 16.4 for development and only using Xcode 26.0 beta experimentally.
Yesterday, I used xcode-select to switch to Xcode 26.0 beta 3 to test it, but I forgot to switch back.
Consequently, this morning I unintentionally used the future Reality Composer Pro (the version included with Xcode 26) to make a small change to a USD file.
Now I realize that, if I'm unlucky, Reality Composer Pro may have silently introduced a small change into the USD file that makes RealityKit fail to read it on iOS 18 and visionOS 2. In the past, this kind of failure has cost hours of debugging to track down, often to a single line in the USD file that RealityKit can't identify for me beyond the error "the operation couldn't be completed".
As an analogy, this situation is as if, during regular development (not involving Reality Composer Pro), Xcode didn't warn you about specific API version conflicts, but instead failed with a generic error message, without highlighting the line in your Swift file that was the source of the error.
Hello everyone,
I'm working on a visionOS application using RealityKit and am encountering a common coordinate system challenge when integrating 3D models created in Blender.
My goal is to display and dynamically update the Transform (position, rotation, scale) of models created in Blender within RealityKit.
The issue arises because Blender's default coordinate system is Z-up, and while exporting to USD/USDZ, I don't have a reliable "Y-up" export option that correctly reorients the model and its transform data for RealityKit's Y-up convention. This means I'm essentially exporting models with their "up" direction along the Z-axis.
When I load these Z-up exported models into RealityKit, they are often oriented incorrectly. To then programmatically update their Transform (e.g., move them, rotate them based on game logic, or apply physics), I need to ensure that the Transform values I set align with RealityKit's Y-up system, even though the original model data was authored in a Z-up context.
My questions are:
What is the recommended transformation process (e.g., using simd_quatf or simd_float4x4) to convert a Transform that was conceptually defined in a Z-up coordinate system to RealityKit's Y-up coordinate system? Specifically, when I have a Transform (or its translation, rotation, scale components) from a Z-up context, how should I apply this to a RealityKit Entity so it appears and behaves correctly in a Y-up world?
Are there any existing convenience APIs or helper functions within RealityKit, simd, or other Apple frameworks that simplify this Z-up to Y-up Transform conversion process? Or is a manual application of a transformation quaternion (e.g., simd_quatf(angle: -.pi / 2, axis: [1, 0, 0])) the standard approach?
Any guidance, code examples, or best practices from those who have faced similar challenges would be greatly appreciated!
Thank you.
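For what it's worth, a minimal sketch of the manual approach from question 2, assuming a pure axis-convention change with no unit scaling (the helper name is illustrative):

import RealityKit
import simd

// Rotating -90° about X maps Blender's +Z (up) onto RealityKit's +Y (up).
let zUpToYUp = simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))

// Convert a Transform authored in a Z-up convention to Y-up.
func convertedToYUp(_ t: Transform) -> Transform {
    Transform(
        scale: SIMD3<Float>(t.scale.x, t.scale.z, t.scale.y), // y/z axes swap
        rotation: zUpToYUp * t.rotation,
        translation: zUpToYUp.act(t.translation)
    )
}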
Topic:
Graphics & Games
SubTopic:
RealityKit
Tags:
Reality Composer
RealityKit
Reality Composer Pro
visionOS
Hello, I am new to the Apple ecosystem and to its development tools. I have some coding experience with C#, and I'd like to develop my game for iPhone 16 and up (due to their ability to run AI models), but I'm having a hard time figuring out what to use. There are a lot of resources for SceneKit, but its documentation page says it's deprecated, so I looked at the RealityKit docs and tutorials, and they focus strictly on developing for visionOS. I'm really confused, because there are no tutorials showing how to develop an iOS game with RealityKit that doesn't center on visionOS. I just want to develop for iPhone 16 and up, but I can't find resources focused on that.
Topic:
Graphics & Games
SubTopic:
RealityKit
With the release of ProMotion devices, the system may switch frame rates in certain scenarios, so data collected through CADisplayLink callbacks under an assumed fixed 60 Hz frame rate loses its reference value. We cannot distinguish whether a slow CADisplayLink callback is due to a hitch or to the system switching frame rates.
I know about the Hitch Time Ratio, but I can't use that scheme for various reasons.
How can I distinguish between a hitch and a frame-rate gear shift in a CADisplayLink callback?
In iOS 15, CADisplayLink.preferredFrameRateRange.preferred always returns 0, while minimum and maximum do change. Can I use these minimum and maximum values as criteria to distinguish frame-rate switching from hitches?
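One heuristic, shown as a sketch only (the 1.5x threshold is illustrative): compare each callback's actual frame delta with the interval the link itself predicted via targetTimestamp; a gear shift changes the predicted interval, while a hitch overshoots it.

import QuartzCore

final class FrameMonitor {
    private var lastTimestamp: CFTimeInterval = 0
    private var lastExpected: CFTimeInterval = 0

    @objc func step(link: CADisplayLink) {
        defer {
            lastTimestamp = link.timestamp
            // Duration the display predicts for the upcoming frame.
            lastExpected = link.targetTimestamp - link.timestamp
        }
        guard lastTimestamp > 0 else { return }

        let actual = link.timestamp - lastTimestamp
        if actual > lastExpected * 1.5 {
            // Overshot the display's own predicted interval:
            // likely a hitch rather than a frame-rate gear shift.
        }
    }
}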
Hi,
I'm using the latest iPad Pro (13-inch), and I can see that Metal offers an rgb10a2unorm texture format for rendering, but when I render a grey ramp and measure the actual luminance, I get a pattern that I would expect from an 8-bit texture. Before I start ripping apart all my code, is there anything else I need to do to convince iOS to render my texture in 10-bit?
I already tried setting the pixelFormat of my CAMetalLayer to rgb10a2unorm, but that didn't change anything.
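To make the setup concrete, this is a sketch of the configuration in question; whether the compositor preserves 10 bits with rgb10a2Unorm, or whether an extended-range format like bgr10_xr plus EDR is required instead, is exactly what's being asked (the color-space choice here is an assumption):

import QuartzCore
import Metal

let layer = CAMetalLayer()
layer.device = MTLCreateSystemDefaultDevice()
layer.pixelFormat = .rgb10a2Unorm              // what was tried
// layer.pixelFormat = .bgr10_xr               // extended-range alternative
layer.colorspace = CGColorSpace(name: CGColorSpace.extendedSRGB)
layer.wantsExtendedDynamicRangeContent = true  // iOS 16+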