Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under the Graphics & Games topic

MTLCaptureManager.sharedCaptureManager generates corrupted .gputrace files (0KB, invalid internal structure)
Hello, I am experiencing an issue with programmatically capturing a GPU trace using MTLCaptureManager. The generated .gputrace file appears to be corrupted, and I'm looking for guidance or a solution.

Description of the problem: I am using MTLCaptureManager.sharedCaptureManager to capture a Metal frame and save it to disk. The generated .gputrace file is consistently reported as 0 bytes by the file system. Crucially, when I compress this 0-byte .gputrace file into a .zip archive, the resulting archive contains the full, expected data; after unzipping, the file opens and displays correctly in Xcode.

However, when inspecting the file's contents using NSFileManager in Objective-C (treating it as a directory), the internal structure differs from a .gputrace captured directly from Xcode's Metal Debugger. (Screenshots attached: capture in Xcode vs. capture to file.)

Finally, when capturing multiple frames programmatically, the first captured frame contains valid buffer data, but for subsequent frames (from the second frame on) the corresponding buffer contents are all zero-filled. Frame 1: all MTLBuffer data is correctly captured and populated. Frame 2 and onward: the same MTLBuffer objects are present in the trace, but their contents are entirely 0 (i.e., the data is not captured or is corrupted). In this case the on-screen display is normal, but the captured frame is incorrect. A frame captured directly in Xcode is also correct; only the frame captured to a file is abnormal.
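For reference, a minimal sketch of the programmatic capture flow being described; the capture object, output path, and surrounding setup are assumptions, not the poster's actual code:

import Foundation
import Metal

// Sketch: capture one frame's worth of GPU work to a .gputrace document.
// `device` and `commandQueue` are assumed to exist; the output path is a placeholder.
func captureFrame(device: MTLDevice, commandQueue: MTLCommandQueue) throws {
    let manager = MTLCaptureManager.shared()
    let descriptor = MTLCaptureDescriptor()
    descriptor.captureObject = commandQueue
    descriptor.destination = .gpuTraceDocument
    descriptor.outputURL = URL(fileURLWithPath: "/tmp/frame.gputrace")
    try manager.startCapture(with: descriptor)

    // ... encode and commit this frame's command buffers here ...

    manager.stopCapture()
}

Worth noting: a .gputrace "file" is actually a directory bundle, which is consistent with the file system reporting 0 bytes for the item itself while a .zip of it still contains the real data.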
1 reply · 0 boosts · 524 views · Aug ’25

Bugs custom 18.6
Hello. When I go to customize my phone's icons, the applications that are inside groups (folders) don't pick up the custom look; they are replaced with all-black images instead. I don't understand what is happening: changing the color of the apps in a group has no effect, and they just stay black no matter what color I choose.
1 reply · 0 boosts · 151 views · Aug ’25

Will OpenGL API and Drivers be removed after appleOS 26?
Hi, I am a multimedia and graphics researcher, and I am wondering whether the OpenGL API and drivers will be removed after the appleOS 26 releases (macOS 26, iOS 26, iPadOS 26, visionOS 26). I am asking because most of the libraries I use depend on OpenGL: CGAL, libigl, immediate-mode UI, nanovg, nanogui, Bullet physics. As a sole developer, transitioning to Vulkan or Metal while also using and learning those libraries is just not viable. Regards.
1 reply · 0 boosts · 230 views · Aug ’25

Sparse Texture Writes
Hey, I've been struggling with this for some days now. I am trying to write to a sparse texture in a compute shader. I'm performing the following steps:

Set up a sparse heap and create a texture from it
Map the whole area of the sparse texture using updateTextureMapping(..)
Overwrite every value with the value "4" in a compute shader
Blit the texture to a shared buffer
Assert that the values in the buffer are "4"

I have a minimal example (which is still pretty long, unfortunately). It works perfectly when removing the line heapDesc.type = .sparse. What am I missing? I could not find any information that writes to sparse textures are unsupported. Any help would be greatly appreciated.

import Metal

func sparseTexture64x64Demo() throws {
    // ── Metal objects
    guard let device = MTLCreateSystemDefaultDevice() else {
        throw NSError(domain: "SparseNotSupported", code: -1)
    }
    let queue = device.makeCommandQueue()!
    let lib = device.makeDefaultLibrary()!
    let pipeline = try device.makeComputePipelineState(function: lib.makeFunction(name: "addOne")!)

    // ── Texture descriptor
    let width = 64, height = 64
    let format: MTLPixelFormat = .r32Uint // 4 B per texel
    let desc = MTLTextureDescriptor()
    desc.textureType = .type2D
    desc.pixelFormat = format
    desc.width = width
    desc.height = height
    desc.storageMode = .private
    desc.usage = [.shaderWrite, .shaderRead]

    // ── Sparse heap
    let bytesPerTile = device.sparseTileSizeInBytes
    let meta = device.heapTextureSizeAndAlign(descriptor: desc)
    let heapBytes = ((bytesPerTile + meta.size + bytesPerTile - 1) / bytesPerTile) * bytesPerTile
    let heapDesc = MTLHeapDescriptor()
    heapDesc.type = .sparse
    heapDesc.storageMode = .private
    heapDesc.size = heapBytes
    let heap = device.makeHeap(descriptor: heapDesc)!
    let tex = heap.makeTexture(descriptor: desc)!

    // ── CPU buffers
    let bytesPerPixel = MemoryLayout<UInt32>.stride
    let rowStride = width * bytesPerPixel
    let totalBytes = rowStride * height
    let dstBuf = device.makeBuffer(length: totalBytes, options: .storageModeShared)!

    let cb = queue.makeCommandBuffer()!
    let fence = device.makeFence()!

    // Map the sparse tile, then signal the fence
    let rse = cb.makeResourceStateCommandEncoder()!
    rse.updateTextureMapping(
        tex,
        mode: .map,
        region: MTLRegionMake2D(0, 0, width, height),
        mipLevel: 0,
        slice: 0)
    rse.update(fence) // ← capture all work so far
    rse.endEncoding()

    let ce = cb.makeComputeCommandEncoder()!
    ce.waitForFence(fence)
    ce.setComputePipelineState(pipeline)
    ce.setTexture(tex, index: 0)
    let threadsPerTG = MTLSize(width: 8, height: 8, depth: 1)
    let tgCount = MTLSize(width: (width + 7) / 8, height: (height + 7) / 8, depth: 1)
    ce.dispatchThreadgroups(tgCount, threadsPerThreadgroup: threadsPerTG)
    ce.updateFence(fence)
    ce.endEncoding()

    // Blit texture into shared buffer
    let blit = cb.makeBlitCommandEncoder()!
    blit.waitForFence(fence)
    blit.copy(
        from: tex,
        sourceSlice: 0,
        sourceLevel: 0,
        sourceOrigin: MTLOrigin(x: 0, y: 0, z: 0),
        sourceSize: MTLSize(width: width, height: height, depth: 1),
        to: dstBuf,
        destinationOffset: 0,
        destinationBytesPerRow: rowStride,
        destinationBytesPerImage: totalBytes)
    blit.endEncoding()

    cb.commit()
    cb.waitUntilCompleted()
    assert(cb.error == nil, "GPU error: \(String(describing: cb.error))")

    // ── Verify a few texels
    let out = dstBuf.contents().bindMemory(to: UInt32.self, capacity: width * height)
    print("first three texels:", out[0], out[1], out[width]) // indices 0, 1, 64
    assert(out[0] == 4 && out[1] == 4 && out[width] == 4)
}

Metal shader:

#include <metal_stdlib>
using namespace metal;

kernel void addOne(texture2d<uint, access::write> tex [[texture(0)]],
                   uint2 gid [[thread_position_in_grid]])
{
    tex.write(4, gid);
}
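As a hedged aside (not a confirmed diagnosis): sparse heaps are only available on certain GPU families, so a guard like the following, a sketch reusing `device` from the sample above, can rule out silent unsupported-feature behavior:

// Sketch: check sparse-texture support before creating the sparse heap.
// Apple documents sparse textures as an Apple-GPU-family feature; verify the
// exact family requirement (apple6 here is an assumption) against the docs.
guard device.supportsFamily(.apple6) else {
    fatalError("Sparse textures are not supported on this device")
}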
1 reply · 0 boosts · 135 views · May ’25

RealityKit fails with EXC_BAD_ACCESS at CMClockGetAnchorTime in the simulator
Starting with iOS 18.0 beta 1, I've noticed that RealityKit frequently crashes in the simulator when an app launches and presents an ARView. I was able to create a small sample app with repro steps that demonstrates the issue, and I've submitted feedback: FB16144085. I've included a crash log with the feedback. If possible, I'd appreciate it if an Apple engineer could investigate and suggest a workaround. It's awkward to be restricted to the iOS 17 simulator, which does not exhibit this behavior. Please let me know if there's anything I can do to help. Thank you.
1 reply · 0 boosts · 684 views · Apr ’25

NSScreen's maximumExtendedDynamicRangeColorComponentValue does not seem to provide the proper value after sleep/wake on third party HDR displays even when there is EDR content on screen in macOS Tahoe
The maximumExtendedDynamicRangeColorComponentValue should provide some value between 1.0 and maximumPotentialExtendedDynamicRangeColorComponentValue, depending on the available EDR headroom, whenever there is any content on-screen that uses EDR. This works fine in most scenarios, but in macOS 26 Tahoe (including 26.2) it seemingly breaks down when a third-party external display is in HDR mode and the Mac goes to sleep and wakes up. After wake, the third-party external display's NSScreen object only ever provides a value of 1.0 (even though, when the SDR peak brightness is changed using the brightness slider, didChangeScreenParametersNotification fires and the system should provide a properly updated headroom value). This makes dynamic tone mapping that adapts to actual screen brightness impossible. Everything works fine in Sequoia. In Tahoe, the user needs to turn off HDR, go through a sleep/wake cycle, and turn HDR back on to fix this, which is obviously not a sustainable workaround.
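For context, a minimal sketch of the observation pattern the post relies on; which NSScreen to query is app-specific, and NSScreen.main here is an assumption:

import AppKit

// Sketch: re-read the EDR headroom whenever screen parameters change.
let observer = NotificationCenter.default.addObserver(
    forName: NSApplication.didChangeScreenParametersNotification,
    object: nil,
    queue: .main
) { _ in
    guard let screen = NSScreen.main else { return }
    let headroom = screen.maximumExtendedDynamicRangeColorComponentValue
    let potential = screen.maximumPotentialExtendedDynamicRangeColorComponentValue
    print("EDR headroom now \(headroom) (potential \(potential))")
}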
1 reply · 0 boosts · 283 views · Dec ’25

Value of type 'SCRecordingOutput' has no member 'delegate'
Hello, I am trying to capture a screen recording (output.mp4) using ScreenCaptureKit, along with the mouse positions during the recording (mouse.json). The recording and the mouse positions (tracked from mouse-movement events only) need to be perfectly synced in order to add effects in post editing. I started off by calling await stream?.startCapture() and, after that, starting my mouse-tracking function:

try await captureEngine.startCapture(configuration: config, filter: filter, recordingOutput: recordingOutput)
let captureStartTime = Date()
mouseTracker?.startTracking(with: captureStartTime)

But every time I tested, there was a clear inconsistency in sync between the recorded video and the recorded mouse positions. What I need to know is when exactly the recording "actually" started, so that I can start the mouse capture at that same moment. I tried using the delegates for this, but haven't been able to set them up properly.

import Foundation
import AVFAudio
import ScreenCaptureKit
import OSLog
import Combine

class CaptureEngine: NSObject, @unchecked Sendable {
    private let logger = Logger()
    private(set) var stream: SCStream?
    private var streamOutput: CaptureEngineStreamOutput?
    private var recordingOutput: SCRecordingOutput?

    private let videoSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.VideoSampleBufferQueue")
    private let audioSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.AudioSampleBufferQueue")
    private let micSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.MicSampleBufferQueue")

    func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
        // Create the stream output delegate.
        let streamOutput = CaptureEngineStreamOutput()
        self.streamOutput = streamOutput
        do {
            stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)
            try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)
            self.recordingOutput = recordingOutput
            recordingOutput.delegate = self // ← this is the line that does not compile
            try stream?.addRecordingOutput(recordingOutput)
            try await stream?.startCapture()
        } catch {
            logger.error("Failed to start capture: \(error.localizedDescription)")
            throw error
        }
    }

    func stopCapture() async throws {
        do {
            try await stream?.stopCapture()
        } catch {
            logger.error("Failed to stop capture: \(error.localizedDescription)")
            throw error
        }
    }

    func update(configuration: SCStreamConfiguration, filter: SCContentFilter) async {
        do {
            try await stream?.updateConfiguration(configuration)
            try await stream?.updateContentFilter(filter)
        } catch {
            logger.error("Failed to update the stream session: \(String(describing: error))")
        }
    }

    func stopRecordingOutputForStream(_ recordingOutput: SCRecordingOutput) throws {
        try self.stream?.removeRecordingOutput(recordingOutput)
    }
}

// MARK: - SCRecordingOutputDelegate
extension CaptureEngine: SCRecordingOutputDelegate {
    func recordingOutputDidStartRecording(_ recordingOutput: SCRecordingOutput) {
        let startTime = Date()
        logger.info("Recording output did start recording \(startTime)")
    }

    func recordingOutputDidFinishRecording(_ recordingOutput: SCRecordingOutput) {
        logger.info("Recording output did finish recording")
    }

    func recordingOutput(_ recordingOutput: SCRecordingOutput, didFailWithError error: any Error) {
        logger.error("Recording output failed with error: \(error.localizedDescription)")
    }
}

private class CaptureEngineStreamOutput: NSObject, SCStreamOutput, SCStreamDelegate {
    private let logger = Logger()

    override init() {
        super.init()
    }

    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of outputType: SCStreamOutputType) {
        guard sampleBuffer.isValid else { return }
        switch outputType {
        case .screen: break
        case .audio: break
        case .microphone: break
        @unknown default:
            logger.error("Encountered unknown stream output type")
        }
    }

    func stream(_ stream: SCStream, didStopWithError error: Error) {
        logger.error("Stream stopped with error: \(error.localizedDescription)")
    }
}

I am getting the error Value of type 'SCRecordingOutput' has no member 'delegate', even though I am targeting macOS 15+ (macOS 26, actually) and macOS only. What is the best way to achieve the desired result? Is there any other or better way to do it?
1 reply · 0 boosts · 287 views · Oct ’25

SceneKit - different behavior when debugging
Hello, I'm currently working on my first SceneKit game and have encountered an issue related to moving an SCNNode using a UIPanGestureRecognizer. When I deploy the game to my iPhone via Xcode in debug mode, all interactions are smooth. However, when I stop the debugging session and run the game directly from the device (outside of Xcode), the SCNNode movement behaves inconsistently: it sometimes works smoothly and sometimes becomes choppy. The SCNNode movement is controlled using a UIPanGestureRecognizer. Do you have any ideas what might be causing the issue?
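For reference, a minimal sketch of the kind of pan-driven movement described; `node` (the dragged SCNNode) and the gesture wiring are assumptions:

// Sketch: translate a node in screen space by projecting its position,
// offsetting by the pan translation, and unprojecting back into the scene.
@objc func handlePan(_ gesture: UIPanGestureRecognizer) {
    guard let scnView = gesture.view as? SCNView else { return }
    let translation = gesture.translation(in: scnView)
    let projected = scnView.projectPoint(node.position) // `node` is assumed
    let moved = SCNVector3(projected.x + Float(translation.x),
                           projected.y + Float(translation.y),
                           projected.z)
    node.position = scnView.unprojectPoint(moved)
    gesture.setTranslation(.zero, in: scnView)
}

One knob worth checking when behavior differs between debug and standalone runs is the view's rendering cadence (for example SCNView.preferredFramesPerSecond), though nothing here confirms that as the cause.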
1 reply · 0 boosts · 361 views · May ’25

Does OLED burn-in affect iPad Pro displays, and should I implement a screen saver in my always-on app?
Hi everyone, I’m developing an iPad app that will be running continuously with the screen always on — similar to a restaurant ordering system. I understand that some of the newer iPad Pro models are equipped with OLED displays. I'm concerned about the potential risk of screen burn-in due to static UI elements being displayed for extended periods. Does burn-in occur on the OLED iPad Pro models under such usage? Would it be advisable to implement a screen saver or periodically animate/change parts of the UI to prevent this? Any insights or best practices would be greatly appreciated. Thank you!
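Independent of whether current iPad Pro panels are actually susceptible, a common mitigation for always-on UIs is a periodic one-point pixel shift. A minimal sketch, with the interval and offsets chosen arbitrarily:

import UIKit

// Sketch: nudge the root view by ~1 pt on a timer so static elements don't
// occupy the exact same pixels indefinitely.
final class PixelShifter {
    private var timer: Timer?
    private let offsets: [CGPoint] = [.zero, CGPoint(x: 1, y: 0),
                                      CGPoint(x: 1, y: 1), CGPoint(x: 0, y: 1)]
    private var index = 0

    func start(on view: UIView, every interval: TimeInterval = 300) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self, weak view] _ in
            guard let self, let view else { return }
            self.index = (self.index + 1) % self.offsets.count
            let offset = self.offsets[self.index]
            UIView.animate(withDuration: 0.5) {
                view.transform = CGAffineTransform(translationX: offset.x, y: offset.y)
            }
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}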
1 reply · 0 boosts · 139 views · Jun ’25

Metal fails to create PSO on AMD based GPUs
Hello, shaders in our application are written in HLSL, and we rely on Metal Shader Converter to convert DXIL to Metal IR. We ran into an issue that causes Metal pipeline state creation to fail when a vertex stage-in function is used on AMD GPUs. Here's the error reported by Metal in the Xcode output:

Compiler failed with XPC_ERROR_CONNECTION_INTERRUPTED
XPC_ERROR_CONNECTION_INTERRUPTED
MTLCompiler: Compilation failed with XPC_ERROR_CONNECTION_INTERRUPTED on 4 try. This error suggests an unexpected interruption in the connection. Possible reasons: a crash in the compiler service, termination by the OS due to resource constraints (e.g., jetsam), a timeout in the service, or an issue with IPC. Verify system stability and check the logs for more details.
Compiler failed with XPC_ERROR_CONNECTION_INVALID
XPC_ERROR_CONNECTION_INVALID
MTLCompiler: Compiler encountered XPC_ERROR_CONNECTION_INVALID: failed to check-in, peer may have been unloaded: mach_error=10000003 (is the OS shutting down or process jetsammed?)
Compilation failed due to an interrupted connection: XPC_ERROR_CONNECTION_INTERRUPTED. This error occurred after multiple retries.

This seems to indicate an internal compiler error. I have a minimal repro here: https://github.com/kcloudy0717/metal_pso_fail/tree/main; simply follow the instructions in the README.
1 reply · 0 boosts · 190 views · Sep ’25

Gestures not working correctly when setting the fov orientation to .horizontal
Hi there, I've discovered an issue with gesture handling in RealityKit when setting the camera’s fieldOfViewOrientation to horizontal. For instance, if I render a simple box at the center of the view with a collision shape that exactly matches its dimensions, the actual hit area behaves as if it's smaller than the box. Additionally, when attempting to drag the box away from the center, the hit area appears misaligned—offset slightly towards the center. Since the default fieldOfViewOrientation is vertical and everything works as expected in that mode, it seems that the gesture system might be assuming a vertical FOV. Given that the API allows setting it to horizontal, perhaps gestures should function correctly regardless of the orientation? Thank you!
1 reply · 0 boosts · 495 views · Mar ’25

[Bug] iPadOS 26: 4-Finger Fast Tap/Swipe Gesture Not Detected (Multitouch Issue)
Hi everyone, I'm experiencing an issue with iPadOS 26 regarding multi-touch gesture detection. When performing a quick four-finger gesture (tap or swipe), the system often fails to recognize the input. This especially affects multi-touch apps such as rhythm games with difficult levels.

Steps to Reproduce:
Place four fingers on the screen.
Perform a quick tap or a quick horizontal swipe (like the one used to switch apps).
Observe whether the gesture is ignored or detected inconsistently.

Expected Behavior: 4-finger multitouch gestures should be recognized regardless of gesture speed, just like on previous iPadOS versions.

Actual Behavior: Gestures fail to be detected when executed quickly; the same gestures performed more slowly are still recognized. In rhythm games this results in missed notes.

You can check out my posts on Twitter/X and Facebook: [https://x.com/kokona_fwa/status/1978131164104728949?s=61] Facebook: [https://m.facebook.com/groups/idipad/permalink/24438964899058806/?]
1 reply · 1 boost · 1.2k views · Oct ’25

Inquiry About Game Center Integration Analytics
Dear Apple Developer Support, I hope this message finds you well. I am a game developer looking to integrate Game Center into our game. Before proceeding, I would like to understand the analytical capabilities available post-integration. Specifically, I am interested in tracking detailed metrics such as: The number of new player downloads driven by Game Center features (e.g., friend challenges, leaderboards, or achievements). Data on user engagement and conversions originating from Game Center interactions. I have explored App Store Connect’s App Analytics but would appreciate clarification on whether Game Center-specific data (e.g., referrals from challenges or social features) is available to measure growth and optimize our strategies. If such data is accessible, could you guide me on how to view it? If not, are there alternative methods or plans to incorporate this in the future? Thank you for your time and assistance. I look forward to your response. Best regards, Bella
1 reply · 0 boosts · 978 views · Sep ’25

Metal triangle strips uniform opacity.
I have this drawing app that I have been working on for the past few years in my free time. I recently rebuilt the app in Metal to build out other brushes and improve performance; I need to render tens of thousands of lines in real time. I'm running into an issue trying to create a uniform opacity per path. I have a solution, but I don't love it, as this is a realtime app and the solution could have some bottlenecks.

If I just generate a triangle strip from touch points and do my best to smooth, resample, and handle miters, I will always get some overlaps. (Screenshot attached.)

To create a uniform opacity, I render to an offscreen texture with blending disabled. I then pre-multiply the color and draw that texture to a composite texture with blending on (I do this per path). This works, but gets tricky when you introduce a textured brush: the edges of the texture in the fragment shader cut out the line. (Screenshot attached.)

Solution: I discard fragments below a threshold:

fragment float4 fragment_line(VertexOut in [[stage_in]],
                              texture2d<float> texture [[ texture(0) ]])
{
    constexpr sampler s(coord::normalized, address::mirrored_repeat, filter::linear);
    float2 texCoord = in.texCoord;
    float4 texColor = texture.sample(s, texCoord);
    if (texColor.a < 0.01)
        discard_fragment(); // may be slow (from what I read)
    return in.color * texColor;
}

Better, but still not perfect.

Question: I'm looking for better ways to create a uniform opacity per path. I tried .max blending, but that prevents blending with other paths. Any tips or ideas are much appreciated. If this is too detailed a question, feel free to just ignore it.
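For context, a sketch of the blend state such a composite pass typically uses when the offscreen texture holds premultiplied alpha; the pixel format and surrounding descriptor setup are assumptions:

// Sketch: pipeline blending for compositing a premultiplied-alpha texture.
let pipelineDescriptor = MTLRenderPipelineDescriptor()
pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
pipelineDescriptor.colorAttachments[0].isBlendingEnabled = true
pipelineDescriptor.colorAttachments[0].rgbBlendOperation = .add
pipelineDescriptor.colorAttachments[0].alphaBlendOperation = .add
pipelineDescriptor.colorAttachments[0].sourceRGBBlendFactor = .one // color already premultiplied
pipelineDescriptor.colorAttachments[0].sourceAlphaBlendFactor = .one
pipelineDescriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
pipelineDescriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha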
1 reply · 0 boosts · 115 views · Mar ’25

Metal: Intersection results unstable when reusing Instance Acceleration Structures
Hi all, I'm encountering an issue with Metal ray tracing on my M5 MacBook Pro regarding instance acceleration structures (IAS): intersection tests suddenly stop working after a certain point in the sampling loop.

Situation
I implemented an offline GPU path tracer that runs the same kernel multiple times per pixel (sampleCount) using metal::raytracing. Intersection tests are performed using an IAS. Since this is an offline path tracer, the geometry inside the IAS never changes across samples (no transforms or updates). As sampleCount increases, there comes a point where the number of intersections drops to zero and remains zero for all subsequent samples. Here's a code sketch:

let sampleCount: UInt16 = 1024
for sampleIndex: UInt16 in 0..<sampleCount {
    // ...
    do {
        let commandBuffer = commandQueue.makeCommandBuffer()
        // Dispatch the intersection kernel.
        await commandBuffer.completed()
    }
    do {
        let commandBuffer = commandQueue.makeCommandBuffer()
        // Use the intersection test results from the previous command buffer.
        await commandBuffer.completed()
    }
    // ...
}

kernel void intersectAlongRay(
    const metal::uint32_t threadIndex [[thread_position_in_grid]],
    // ...
    const metal::raytracing::instance_acceleration_structure accelerationStructure [[buffer(2)]],
    // ...
) {
    // ...
    const auto result = intersector.intersect(ray, accelerationStructure);
    switch (result.type) {
        case metal::raytracing::intersection_type::triangle: {
            // Write intersection result to device buffers.
            break;
        }
        default:
            break;
    }
}

Observations
Encoding both the intersection kernel and the subsequent result usage in the same command buffer does not resolve the problem.
Switching from the IAS to a primitive acceleration structure (PAS) fixes the problem.
Rebuilding the IAS for each sample also resolves the issue.
Intersections produce inconsistent results even though the IAS and rays are identical — Image 1 shows a hit, while Image 2 shows a miss.

Questions
Am I misusing the IAS in some way?
Could this be a Metal bug?
Any guidance or confirmation would be greatly appreciated.
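One thing worth double-checking (an educated guess, not a confirmed diagnosis): when intersecting against an instance acceleration structure, the primitive acceleration structures it references are indirect resources and must be made resident on each encoder. A sketch, where `instanceStructure`, `primitiveStructures`, and `intersectPipeline` stand in for the poster's objects:

// Sketch: bind the IAS and mark its child structures resident; the IAS does
// not keep the primitive acceleration structures resident by itself.
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(intersectPipeline)
encoder.setAccelerationStructure(instanceStructure, bufferIndex: 2)
for primitive in primitiveStructures {
    encoder.useResource(primitive, usage: .read)
}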
1 reply · 1 boost · 344 views · Dec ’25

Issues with installing Game Porting Toolkit 2.1
I am trying to install the Game Porting Toolkit 2.1 according to the README file provided with the toolkit. When I run the following command:

WINEPREFIX=~/my-game-prefix `brew --prefix game-porting-toolkit`/bin/wine64 winecfg

I get an error message:

zsh: no such file or directory: /usr/local/opt/game-porting-toolkit/bin/wine64

I don't know how to resolve this. When I type in the command which brew, I get the path /usr/local/bin/brew. What am I doing wrong?
1 reply · 0 boosts · 367 views · Apr ’25

Monthly recurring leaderboard and placement achievement?
I would like a monthly recurring leaderboard, but the most days one can set for a recurring leaderboard is 30, and some months have 31 days (or 28/29). This is dumb. I guess I have to manually reset a classic leaderboard each month to get this result? Additionally, once a leaderboard closes and is about to reset (I also have daily recurring leaderboards), I'd like to grant the top placers a corresponding achievement, but I don't see any way of doing this. I believe I can do all these things on PlayFab, but it'll take a bit more work and will eventually cost money. Anyone have advice?
1 reply · 0 boosts · 201 views · Sep ’25

RealityKit VideoMaterial renders pink on iOS 18
Our app is live, and it appears that since the iOS 18 update the VideoMaterial renders a pink/purple color instead of the video (picture attached); the audio renders properly. We found that it occurs on older devices: iPhone 11 and iPhone SE 2020. I've found this thread by Andy Jazz on Stack Overflow:

Steps to Reproduce:
Create a plane for the video screen.
Apply a VideoMaterial using AVPlayerItem.
Anchor the model entity to an ARImageAnchor.

Expected Outcome: The video should play as a material on the plane in RealityKit.
Actual Outcome: On iOS 18, the plane appears pink, indicating the VideoMaterial isn't applied.

What I've Tried:
- Verified the video URL is correct.
- Checked that the AVPlayerItem and VideoMaterial are initialised correctly.
- Ensured the AVPlayer is playing the video.

I also tried different formats (mov/mp4/m4v) and verified that the video's status is readyToPlay. Any suggestions?
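For reference, a minimal sketch of the typical setup being described; `videoURL` and the `plane` ModelEntity are assumptions:

import RealityKit
import AVFoundation

// Sketch: back a VideoMaterial with an AVPlayer and apply it to a plane entity.
let player = AVPlayer(url: videoURL) // `videoURL` is assumed
let material = VideoMaterial(avPlayer: player)
plane.model?.materials = [material]
player.play()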
1 reply · 0 boosts · 217 views · Jun ’25

GKLeaderboard.LoadLeaderboards() return empty when testing on Xcode with local gamekit file (Unity)
var allLeaderboards = await GKLeaderboard.LoadLeaderboards();
Log(allLeaderboards.Count); // returns 0

What am I missing?

What doesn't work:
await GKGameActivityDefinition.LoadGameActivityDefinitions() → count = 0
await GKLeaderboard.LoadLeaderboards() (no args) → 0 leaderboards
await GKLeaderboard.LoadLeaderboards("MY ID") → returns 0
GKGameActivity.SetScoreOnLeaderboard(leaderboard, score, context) returns an error, since my leaderboard is null.
Activities and leaderboards are defined in GameCenterResources.gamekit in Xcode.
Achievements that I add locally in the .gamekit file do not appear at runtime either; only the App Store Connect live ones show.

What works:
In Xcode → Debug → GameKit → Manage Game Progress, I can submit new scores with the plus button and see the notification on my device.
await GKLocalPlayer.Authenticate() succeeds.
await GKAchievement.LoadAchievements() returns the list of achievements configured in App Store Connect, but not any new local achievements created in Xcode in GameCenterResources.gamekit.

Environment:
Device/OS: iPhone on iOS 26.0 beta (Game Center sandbox)
Xcode: 26.0 beta 6
Unity: 2022.3.21
Apple GameKit Unity plugin: 2025-beta1 (GameKit package)
Signing: Game Center capability enabled; using a development provisioning profile
GameKit resources: GameCenterResources.gamekit in project (Target: Unity-iPhone), appears under Build Phases → Copy Bundle Resources
1 reply · 0 boosts · 768 views · Sep ’25