Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic
How can I get pixel coordinates in the fragment tile function?
In this video, tile fragment shading is recommended for image processing. In the example shown, the Unpack function takes two arguments, one of which is RasterizerData. As I understand it, this is the data passed to us from the previous (vertex) stage of the graphics pipeline. However, the properties of MTLTileRenderPipelineDescriptor do not include an option for specifying a vertex function. Therefore, this render pass uses a mix of commands: first, a draw command is executed to obtain UV coordinates, and then threads are dispatched. My question is: without using a draw command, only a dispatch, how can I get pixel coordinates in the fragment tile function? For the kernel tile function, everything is clear.

typedef struct {
    float4 OPTexture       [[ color(0) ]];
    float4 IntermediateTex [[ color(1) ]];
} FragmentIO;

fragment FragmentIO Unpack(RasterizerData in [[ stage_in ]],
                           texture2d<float, access::sample> srcImageTexture [[ texture(0) ]])
{
    FragmentIO out;
    // ...
    // Run necessary per-pixel operations
    out.OPTexture = ...;       // assign computed value
    out.IntermediateTex = ...; // assign computed value
    return out;
}
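One approach I'm considering (an untested sketch; fragCoord is my own parameter name): a fragment function can take a float4 parameter with the [[position]] attribute, which carries the fragment's window-space pixel coordinates without any data from a vertex stage.

fragment FragmentIO Unpack(RasterizerData in [[ stage_in ]],
                           float4 fragCoord [[ position ]], // hypothetical extra parameter
                           texture2d<float, access::sample> srcImageTexture [[ texture(0) ]])
{
    // fragCoord.xy is the pixel position in window coordinates,
    // so no UVs need to be produced by a vertex function.
    uint2 pixel = uint2(fragCoord.xy);
    FragmentIO out;
    out.OPTexture = srcImageTexture.read(pixel);
    out.IntermediateTex = out.OPTexture;
    return out;
}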
1 reply · 0 boosts · 190 views · Mar ’25
Race conditions when changing CAMetalLayer.drawableSize?
Is the pseudocode below thread-safe? Imagine that the main thread sets the CAMetalLayer's drawableSize to a new size while the rendering thread is in the middle of rendering into an existing MTLDrawable that still has the old size. Is the change of metalLayer.drawableSize thread-safe in the sense that I can present an old MTLDrawable whose resolution differs from the current value of metalLayer.drawableSize? I assume that setting the drawableSize property informs Metal that the next MTLDrawable vended by the CAMetalLayer should have the new size, right? Is it valid to assume that "metalLayer.drawableSize = newSize" and "metalLayer.nextDrawable()" are internally synchronized, so that metalLayer.nextDrawable() can never produce, say, an MTLDrawable with the old width but the new height (or a completely invalid resolution due to a race condition)?

func onWindowResized(newSize: CGSize) { // Called on the main thread
    metalLayer.drawableSize = newSize
}

func onVsync(drawable: MTLDrawable) { // Called on a background rendering thread
    renderer.renderInto(drawable: drawable)
}
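In case no such guarantee exists, a defensive sketch I've been considering (ResizeCoordinator, pendingSize, and the lock discipline are all my own, not a documented pattern): the main thread only records the requested size, and the render thread applies it immediately before calling nextDrawable(), so the property write and the drawable request are ordered on a single thread.

import Foundation
import QuartzCore

final class ResizeCoordinator {
    private let lock = NSLock()
    private var pendingSize: CGSize?

    // Main thread: remember the requested size instead of touching the layer.
    func requestResize(to newSize: CGSize) {
        lock.lock(); pendingSize = newSize; lock.unlock()
    }

    // Render thread: apply any pending size right before nextDrawable(),
    // so drawableSize and nextDrawable() can never interleave.
    func nextDrawable(from metalLayer: CAMetalLayer) -> CAMetalDrawable? {
        lock.lock()
        let size = pendingSize
        pendingSize = nil
        lock.unlock()
        if let size { metalLayer.drawableSize = size }
        return metalLayer.nextDrawable()
    }
}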
1 reply · 1 boost · 522 views · Dec ’25
Compute kernel fails to compile when calling texture.read()
If I compile a compute kernel with a call to texture.read(), it fails with the following error:

Error Domain=AGXMetalG13X Code=3 "Encountered unlowered function call to air.get_read_sampler" UserInfo={NSLocalizedDescription=Encountered unlowered function call to air.get_read_sampler}

This error occurs on both macOS and iOS 26 Beta 5, but not when running in a simulator or in a playground. It does not occur on a macOS Sequoia VM. It occurs whether I use the old Metal 3 or the new Metal 4 compilation method. A workaround would be to use a sampler, but according to the feature tables, all platforms support reading from textures of all formats. Below is a minimal example which produces the error:

let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let computeFunction = library.makeFunction(name: "compute_test")!
do {
    let pipeline = try device.makeComputePipelineState(function: computeFunction)
    debugPrint(pipeline)
} catch {
    debugPrint("Metal 3 failed with error:\n\(error)")
}

#import <metal_stdlib>
using namespace metal;

kernel void compute_test(uint2 gid [[thread_position_in_grid]],
                         texture2d<float, access::read> in [[texture(0)]],
                         texture2d<float, access::write> out [[texture(1)]])
{
    out.write(in.read(gid), gid);
}

I filed feedback FB19530049.
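For completeness, the sampler workaround mentioned above would look roughly like this (a sketch; the nearest-filter, pixel-coordinate sampler is meant to approximate read(), assuming the dispatch grid matches the source texture size):

kernel void compute_test_sampled(uint2 gid [[thread_position_in_grid]],
                                 texture2d<float, access::sample> src [[texture(0)]],
                                 texture2d<float, access::write> dst [[texture(1)]])
{
    // Pixel-addressed, nearest-filtered sampling mimics an unfiltered read().
    constexpr sampler s(coord::pixel, filter::nearest);
    dst.write(src.sample(s, float2(gid) + 0.5), gid);
}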
1 reply · 0 boosts · 216 views · Aug ’25
moveCharacter reports collision with itself
I'm running into an issue with collisions between two entities that have a character controller component. In the collision handler for moveCharacter, the collision has both hitEntity and characterEntity set to the same object: the entity that was moved with moveCharacter(). The example below configures three objects:

- a stationary sphere with a character controller
- a falling sphere with a character controller
- a stationary cube with a collision component

If the falling sphere hits the stationary sphere, the collision handler reports both hitEntity and characterEntity to be the falling sphere. I would expect hitEntity to be the stationary sphere and characterEntity to be the falling sphere. If the falling sphere hits the cube with a collision component, the hitEntity is the cube and the characterEntity is the falling sphere, as expected. Is this the expected behavior? The entities act as expected visually; however, if I want the spheres to react differently depending on which character they collided with, I am not getting the expected results. For example: if a player-controlled character collides with an NPC, exchange resources with the NPC; if the player collides with an enemy, take damage. (See the sketch after the sample.)

import SwiftUI
import RealityKit

struct ContentView: View {
    @State var root: Entity = Entity()
    @State var stationary: Entity = createCharacter(named: "stationary", radius: 0.05, color: .blue)
    @State var falling: Entity = createCharacter(named: "falling", radius: 0.05, color: .red)
    @State var collisionCube: Entity = createCollisionCube(named: "cube", size: 0.1, color: .green)

    // relative to root
    @State var fallFrom: SIMD3<Float> = [0, 0.5, 0]

    var body: some View {
        RealityView { content in
            content.add(root)
            root.position = [0, -0.5, 0.0]
            root.addChild(stationary)
            stationary.position = [0, 0.05, 0]
            root.addChild(falling)
            falling.position = fallFrom
            root.addChild(collisionCube)
            collisionCube.position = [0.2, 0, 0]
            collisionCube.components.set(InputTargetComponent())
        }
        .gesture(SpatialTapGesture().targetedToAnyEntity().onEnded { tap in
            let tapPosition = tap.entity.position(relativeTo: root)
            falling.components.remove(FallComponent.self)
            falling.teleportCharacter(to: tapPosition + fallFrom, relativeTo: root)
        })
        .toolbar {
            ToolbarItemGroup(placement: .bottomOrnament) {
                HStack {
                    Button("Drop") {
                        falling.components.set(FallComponent(speed: 0.4))
                    }
                    Button("Reset") {
                        falling.components.remove(FallComponent.self)
                        falling.teleportCharacter(to: fallFrom, relativeTo: root)
                    }
                }
            }
        }
    }
}

@MainActor
func createCharacter(named name: String, radius: Float, color: UIColor) -> Entity {
    let character = ModelEntity(mesh: .generateSphere(radius: radius),
                                materials: [SimpleMaterial(color: color, isMetallic: false)])
    character.name = name
    character.components.set(CharacterControllerComponent(radius: radius, height: radius))
    return character
}

@MainActor
func createCollisionCube(named name: String, size: Float, color: UIColor) -> Entity {
    let cube = ModelEntity(mesh: .generateBox(size: size),
                           materials: [SimpleMaterial(color: color, isMetallic: false)])
    cube.name = name
    cube.generateCollisionShapes(recursive: true)
    return cube
}

struct FallComponent: Component {
    let speed: Float
}

struct FallSystem: System {
    static let predicate: QueryPredicate<Entity> = .has(FallComponent.self) && .has(CharacterControllerComponent.self)
    static let query: EntityQuery = .init(where: predicate)
    let down: SIMD3<Float> = [0, -1, 0]

    init(scene: RealityKit.Scene) { }

    func update(context: SceneUpdateContext) {
        let deltaTime = Float(context.deltaTime)
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            let speed = entity.components[FallComponent.self]?.speed ?? 0.5
            entity.moveCharacter(by: down * speed * deltaTime, deltaTime: deltaTime, relativeTo: nil) { collision in
                if collision.hitEntity == collision.characterEntity {
                    print("hit entity has collided with itself")
                }
                print("\(collision.characterEntity.name) collided with \(collision.hitEntity.name)")
            }
        }
    }
}

#Preview(windowStyle: .volumetric) {
    ContentView()
}
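If hitEntity were reported as I expect, the gameplay branching described above would look roughly like this (a sketch; NPCComponent and EnemyComponent are hypothetical marker components, not part of the sample above):

struct NPCComponent: Component {}
struct EnemyComponent: Component {}

func handleCollision(_ collision: CharacterControllerComponent.Collision) {
    let other = collision.hitEntity
    if other.components.has(NPCComponent.self) {
        // player touched an NPC: exchange resources
    } else if other.components.has(EnemyComponent.self) {
        // player touched an enemy: take damage
    }
}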
1 reply · 0 boosts · 178 views · Aug ’25
Game Center fetchSavedGames sometimes returns an empty list of games, although it works correctly on subsequent tries
I have implemented Game Center for authentication and for saving the player's game data. Both authentication and saving work correctly all the time, but there is a problem with fetching and loading the data. The game works like this:

1. At startup, I start the authentication.
2. After the player successfully logs in, I start loading the player's data by calling the fetchSavedGames method.
3. If game data exists for the player, I receive a list of SavedGame objects containing the player's data.

The problem is that after I uninstall the game and install it again, sometimes the SavedGame list is empty (step 3). But if I reopen the game without uninstalling it, this process works fine. Here's the complete code of the Game Center implementation:

import GameKit
import UIKit

class GameCenterHandler {
    public func signIn() {
        GKLocalPlayer.local.authenticateHandler = { viewController, error in
            if let viewController = viewController {
                // Present the Game Center sign-in UI from the app's root view controller.
                // (The original snippet presented viewController from itself, which is a bug.)
                UIApplication.shared.connectedScenes
                    .compactMap { ($0 as? UIWindowScene)?.keyWindow }
                    .first?.rootViewController?
                    .present(viewController, animated: false)
                return
            }
            if error != nil {
                // Player could not be authenticated.
                // Disable Game Center in the game.
                return
            }
            // Auth successful
            self.load(filename: "TestFileName")
        }
    }

    public func save(filename: String, data: String) {
        if GKLocalPlayer.local.isAuthenticated {
            GKLocalPlayer.local.saveGameData(Data(data.utf8), withName: filename) { savedGame, error in
                if savedGame != nil {
                    // Data saved successfully
                }
                if error != nil {
                    // Error in saving game data!
                }
            }
        } else {
            // Error in saving game data! User is not authenticated
        }
    }

    public func load(filename: String) {
        if GKLocalPlayer.local.isAuthenticated {
            GKLocalPlayer.local.fetchSavedGames { games, error in
                if let game = games?.first(where: { $0.name == filename }) {
                    game.loadData { data, error in
                        if data != nil {
                            // Data loaded successfully
                        }
                        if error != nil {
                            // Error in loading game data!
                        }
                    }
                } else {
                    // Error in loading game data! Filename not found
                }
            }
        } else {
            // Error in loading game data! User is not authenticated
        }
    }
}

I have also added the Game Center and iCloud capabilities in Xcode. In the iCloud section, I selected iCloud Documents and added a container. I found a similar question here, but it doesn't make things clearer.
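Since the empty result seems transient, the mitigation I'm experimenting with is a plain retry inside GameCenterHandler (my own sketch; the attempt count and the 2-second delay are arbitrary):

private func loadWithRetry(filename: String, attempts: Int = 3) {
    GKLocalPlayer.local.fetchSavedGames { games, error in
        if let game = games?.first(where: { $0.name == filename }) {
            game.loadData { data, error in
                // Handle loaded data / error as in load(filename:).
            }
        } else if attempts > 1 {
            // Empty list right after reinstall: wait and try again.
            DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
                self.loadWithRetry(filename: filename, attempts: attempts - 1)
            }
        } else {
            // Still empty: treat as "no saved game exists".
        }
    }
}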
1 reply · 0 boosts · 1.1k views · Dec ’25
virtual game controller + SwiftUI warning
Hi, I've just moved my SpriteKit-based game from UIView to SwiftUI + SpriteView, and I'm getting this message:

Adding 'GCControllerView' as a subview of UIHostingController.view is not supported and may result in a broken view hierarchy. Add your view above UIHostingController.view in a common superview or insert it into your SwiftUI content in a UIViewRepresentable instead.

Here's how I'm doing it:

struct ContentView: View {
    @State var alreadyStarted = false
    let initialScene = GKScene(fileNamed: "StartScene")!.rootNode as! SKScene

    var body: some View {
        ZStack {
            SpriteView(scene: initialScene,
                       transition: .crossFade(withDuration: 1),
                       isPaused: false,
                       preferredFramesPerSecond: 60)
                .edgesIgnoringSafeArea(.all)
                .onAppear {
                    if !self.alreadyStarted {
                        self.alreadyStarted.toggle()
                        initialScene.scaleMode = .aspectFit
                    }
                }
            VirtualControllerView()
                .onAppear {
                    let virtualController = BTTSUtilities.shared.makeVirtualController()
                    BTTSSharedData.shared.virtualGameController = virtualController
                    BTTSSharedData.shared.virtualGameController?.connect()
                }
                .onDisappear {
                    BTTSSharedData.shared.virtualGameController?.disconnect()
                }
        }
    }
}

struct VirtualControllerView: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        let result = PassthroughView()
        return result
    }

    func updateUIView(_ uiView: UIView, context: Context) { }
}

class PassthroughView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        for subview in subviews.reversed() {
            let convertedPoint = convert(point, to: subview)
            if let hitView = subview.hitTest(convertedPoint, with: event) {
                return hitView
            }
        }
        return nil
    }
}
1 reply · 0 boosts · 489 views · Sep ’25
GKMatch rule-based matching. Can't match more than 3 people.
Matchmaking rules: https://developer.apple.com/documentation/gamekit/matchmaking-rules?language=objc
App Store Connect API rules: https://developer.apple.com/documentation/appstoreconnectapi/rules?language=objc

Environment:
- Unity 6000.2.2f1
- Xcode 16.1
- iOS 26
- 3 iPhones

App Store Connect API rules:

"type": "gameCenterMatchmakingRuleSets",
"id": "f6a88caf-85db-42bf-xxxxxxxxxxxxxxxxxxxx",
"attributes": {
    "referenceName": "co.mygame.RuleSets.GvERandom34",
    "ruleLanguageVersion": 1,
    "minPlayers": 3,
    "maxPlayers": 4
},

"type": "gameCenterMatchmakingRules",
"id": "6afa68ce-4d2c-496f-xxxxxxxxxxxxxxxxxxxx",
"attributes": {
    "referenceName": "GameVersion",
    "description": "Check Game Version. GvERandom34",
    "type": "COMPATIBLE",
    "expression": "requests[0].properties.gameVersion == requests[1].properties.gameVersion",
    "weight": null
},

"type": "gameCenterMatchmakingQueues",
"id": "7fb645ef-4eca-4510-xxxxxxxxxxxxxxxxxxxx",
"attributes": {
    "referenceName": "co.mygame.que.GvERandom34",
    "classicMatchmakingBundleIds": []
},

Objective-C execution code, called with queueName = "co.mygame.que.GvERandom34", keyStr = "gameVersion " and valueStr = "1.0":

- (void)MatchQueueParamStr1Start:(NSString *)queueName keyStr:(NSString *)keyStr valueStr:(NSString *)valueStr
{
    // Note: @available cannot be compared with == NO; use an else branch instead.
    if (@available(iOS 17.2, tvOS 17.2, macOS 14.2, visionOS 1.1, *)) {
        // Supported; continue below.
    } else {
        DBGLOG(@"MatchQueueParamStr1Start Not supported.");
        return;
    }

    self->_matchMakingFlag = YES;
    self->_matchFinishFlag = NO;
    self->_myMatch = nil;

    GKMatchRequest *req = [[GKMatchRequest alloc] init];
    if (@available(iOS 17.2, tvOS 17.2, macOS 14.2, visionOS 1.1, *)) {
        req.queueName = queueName;
        req.properties = @{keyStr: valueStr};
    }

    [[GKMatchmaker sharedMatchmaker] findMatchForRequest:req
                                   withCompletionHandler:^(GKMatch *match, NSError *error) {
        if (error) {
            [self SetupErrorInfo:error descriptionText:@"findMatchForRequest"];
        } else if (match) {
            self->_myMatch = match;
            self->_myMatch.delegate = self;
        }
        self->_matchMakingFlag = NO;
        self->_matchFinishFlag = YES;
    }];
}

I'm trying to match with three devices. Matching doesn't work; it times out after 5 minutes. What's the problem?
1 reply · 0 boosts · 273 views · Nov ’25
Distortion Artifacts on visionOS When Rendering Opaque/Alpha Clipped Foliage in URP (Unity 6.0, Metal)
I'm running into a persistent visual issue while deploying a floral corridor scene to Apple Vision Pro using Unity 6.0 with URP and Metal. The issue only appears on the Vision Pro device; everything looks fine in the Unity Editor.

Issue description: When the frame rate drops to around 60–70 FPS, noticeable distortion artifacts appear around the edges of foliage models. It seems like the background meshes (behind the plants) get warped and leak through the edges of the foliage. Although this is most visible around the leaves, even solid objects like standard URP wall or box models show distorted edges when the issue occurs. All the foliage uses Opaque or Alpha Clipping materials.

Things I've tried:
- Changing the foliage materials to Transparent mode: the distortion around edges disappears, but using Transparent for a large number of foliage assets is not ideal for performance or sorting complexity.
- Reducing the number of foliage objects: with only a few plants in the scene and the frame rate staying around 100 FPS, the distortion disappears. However, this isn't a practical solution for a full environment.

Possible cause? I came across this note in the Unity documentation: "Ensure depth-buffer for each pixel is non-zero - on visionOS, the depth buffer is used for reprojection. To ensure visual effects like skyboxes and shaders are displayed beautifully, ensure that some value is written to the depth for each pixel." Could this be related to the issue? Is it possible that Alpha Clipping with low pixel coverage leads to some pixels not writing to the depth buffer, which then causes problems during Vision Pro's reprojection or foveated rendering? However, even when I disable Alpha Clipping entirely, the distortion issue still persists, so it may not be caused solely by clipping.

Project setup:
- Unity 6.0 (URP)
- Depth Texture: Enabled
- Metal graphics backend
- Running on real Vision Pro hardware (not the simulator)

Any advice on how to avoid these distortion issues on Vision Pro would be greatly appreciated. Thanks!
1 reply · 0 boosts · 149 views · Jul ’25
Trouble with MDLMesh.newBox()
I'm trying to build an MDLMesh and then add normals:

let mdlMesh = MDLMesh.newBox(withDimensions: SIMD3<Float>(1, 1, 1),
                             segments: SIMD3<UInt32>(2, 2, 2),
                             geometryType: MDLGeometryType.triangles,
                             inwardNormals: false,
                             allocator: allocator)
mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0)

When I render the mesh, some normals are (0, 0, 0). I don't know if the problem is in the mesh or in the conversion to MTKMesh. Is there a way to examine an MDLMesh with the geometry viewer? When I look at the variable values for my mdlMesh in the debugger, the summary is not too useful for tracking down the normals. What's the best way to find out where the normals are getting broken?
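One way to inspect the normals on the CPU, independent of the Metal conversion, is ModelIO's vertexAttributeData accessor. A debugging sketch (dumpZeroNormals is my own helper, not a fix):

import ModelIO

func dumpZeroNormals(of mesh: MDLMesh) {
    // Ask ModelIO for a packed float3 view of the normal attribute.
    guard let normals = mesh.vertexAttributeData(forAttributeNamed: MDLVertexAttributeNormal,
                                                 as: .float3) else {
        print("mesh has no normal attribute")
        return
    }
    var zeroCount = 0
    for i in 0..<mesh.vertexCount {
        let p = (normals.dataStart + i * normals.stride).assumingMemoryBound(to: Float.self)
        if p[0] == 0, p[1] == 0, p[2] == 0 { zeroCount += 1 }
    }
    print("\(zeroCount) of \(mesh.vertexCount) normals are zero")
}

Running this right after addNormals(withAttributeNamed:creaseThreshold:) should tell you whether the zeros originate in the MDLMesh itself or later, in the MTKMesh conversion.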
1 reply · 0 boosts · 166 views · May ’25
PhotogrammetrySession fails with internal errors 4011 / 4012 when using iOS Object Capture (Area Mode) images
Hi all, I'm running into an issue when trying to reconstruct a 3D model using PhotogrammetrySession on macOS from a set of images captured via the iOS Object Capture sample app, specifically in Area mode. When I attempt to create the model from these images (using the raw Images/ folder exported directly from the capture session), I get the following errors:

ERROR cv3dapi.pg: Internal error codes (2): 4011 4012
WARN cv3dapi.pg: Internal warning codes (1): 4507
Output error with code = -15
requestError: CoreOC.PhotogrammetrySession.Error.processError

I use the "Images" directory exported directly from Object Capture with my iPhone 12 Pro Max (which has lidar) set to "area mode" in the Object Capture app. Here is example metadata from one HEIC image in the sequence:

heif-info Images/00044.869568833.HEIC
MIME type: image/heic
main brand: heic
compatible brands: mif1, MiHE, MiPr, miaf, MiHB, heic
image: 3024x4032 (id=49), primary
    tiles: 6x8, tile size: 512x512
    colorspace: YCbCr, 4:2:0
    bit depth: 8
    thumbnail: 240x320
    color profile: nclx
    alpha channel: no
    depth channel: yes
        size: 192x256
        bits per pixel: 8
        z-near: 1.173828
        z-far: 2.552734
        d-min: undefined
        d-max: undefined
        representation: uniform Z
metadata:
    Exif: 960 bytes
    uri /tag:apple.com,2023:ObjectCapture#CameraTrackingState: 4 bytes
    uri /tag:apple.com,2023:ObjectCapture#CameraCalibrationData: 1015 bytes
    uri /tag:apple.com,2023:ObjectCapture#ObjectTransform: 48 bytes
    uri /tag:apple.com,2023:ObjectCapture#ObjectBoundingBox: 48 bytes
    uri /tag:apple.com,2023:ObjectCapture#RawFeaturePoints: 832 bytes
    uri /tag:apple.com,2023:ObjectCapture#PointCloudData: 23984 bytes
    uri /tag:apple.com,2023:ObjectCapture#BundleVersion: 5 bytes
    uri /tag:apple.com,2023:ObjectCapture#SegmentID: 4 bytes
    uri /tag:apple.com,2024:ObjectCapture#SessionUUID: 36 bytes
    uri /tag:apple.com,2024:ObjectCapture#CaptureMode: 4 bytes
    uri /tag:apple.com,2023:ObjectCapture#Feedback: 4 bytes
    uri /tag:apple.com,2023:ObjectCapture#WideToDepthCameraTransform: 48 bytes
    uri /tag:apple.com,2023:ObjectCapture#TemporalDepthPointClouds: 864026 bytes
transformations:
    angle (ccw): 270
region annotations: none
properties:
    camera intrinsic matrix:
        focal length: 2813.695557; 2813.695557
        principal point: 1522.338502; 2002.843018
        skew: 0.000000
    camera extrinsic matrix:
        rotation matrix:
            -0.695  0.344 -0.632
             0.007 -0.875 -0.483
            -0.719 -0.340  0.606

Questions:
- What do internal error codes 4011 and 4012 refer to?
- Is there something specific about Area mode captures that requires preprocessing before they're compatible with PhotogrammetrySession?
- Has anyone successfully reconstructed a model from an Area mode session using the stock Apple tools?

NOTE: I can provide the folder of images for debugging if that would help!
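For reference, the reconstruction side is essentially the stock pipeline; a minimal sketch of what I'm running as a command-line main.swift (the paths are placeholders):

import Foundation
import RealityKit // PhotogrammetrySession (macOS)

let inputFolder = URL(fileURLWithPath: "Images/", isDirectory: true) // raw Object Capture output
let outputFile = URL(fileURLWithPath: "model.usdz")

let session = try PhotogrammetrySession(input: inputFolder,
                                        configuration: PhotogrammetrySession.Configuration())
try session.process(requests: [.modelFile(url: outputFile, detail: .medium)])

// Drain outputs; the errors above surface via .requestError.
for try await output in session.outputs {
    switch output {
    case .requestError(let request, let error):
        print("requestError for \(request): \(error)")
    case .processingComplete:
        print("done")
    default:
        break
    }
}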
1 reply · 2 boosts · 1k views · Jul ’25
How to configure RealityKit entities for animations on a modular character?
I am currently using RealityKit (perspective camera) to render a character in my SwiftUI app. The character has customization such as clothing items and hair, and all objects are properly weighted to the rig. The model is set up in Blender like so: groups of objects that will be swapped (e.g., Shoes -> Shoes objects) and an armature. I then export it to usdc with all objects active. This is the resulting entity hierarchy, viewed in Reality Composer Pro (screenshot not reproduced here). My problem is that when I export with the Armature Modifier applied to the objects, so that animations get exported, the ModelComponent gets flattened into the armature, and swapping entities is no longer as simple as removing the entity with the corresponding name. What's the best practice here? Should animation be exported separately and then applied to the skeleton? If so, how is that achieved? I'm not really sure how to proceed.
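On the "export animation separately" idea, the pattern I've seen suggested looks roughly like this (an untested sketch; it assumes the animation-only USD uses the same joint paths as the character's skeleton, and "character_idle" is a placeholder file name):

import RealityKit

func applyIdleAnimation(to character: Entity) async throws {
    // Load a USD containing only the armature + animation, no meshes.
    let animationSource = try await Entity(named: "character_idle")
    guard let clip = animationSource.availableAnimations.first else { return }
    // Playing the clip on the character relies on matching joint names.
    character.playAnimation(clip.repeat(), transitionDuration: 0.3, startsPaused: false)
}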
1 reply · 0 boosts · 103 views · May ’25
GKLeaderboard.loadLeaderboards returns empty array
After authenticating the user, I'm loading my Game Center leaderboards like this:

let leaderboards = try await GKLeaderboard.loadLeaderboards(IDs: [leaderboardID])

This works fine, but there are times when it just returns an empty array. When I encounter this situation, the array remains empty for several hours when retrying, but then at some point it suddenly starts working again. Is this a known issue? Or am I hitting some kind of quota, maybe (as I call it quite often while developing my game)?

Edit: My leaderboards are grouped in sets, if that makes any difference here.
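Since the leaderboards live in sets, one variant I've been meaning to try is loading them through the set instead (a sketch, untested; "mySetID" is a placeholder for my leaderboard set's identifier):

let sets = try await GKLeaderboardSet.loadLeaderboardSets()
if let set = sets.first(where: { $0.identifier == "mySetID" }) {
    // Load the leaderboards that belong to this set rather than by ID.
    let leaderboards = try await set.loadLeaderboards()
    print(leaderboards.map(\.baseLeaderboardID))
}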
1 reply · 0 boosts · 539 views · Dec ’25
Game Rejected as Spam
My app is being rejected, and all I'm being told is that it is spam. I've tried improving various aspects of the game, but I just receive the same copy-and-paste rejection message each time. I have no idea if I'm moving in the right direction or which part of my game needs to be changed or improved. Is there a game quality benchmark document or some other resource I can use to better understand why my game is being rejected and how to bring it to a level that meets Apple's standards?
1 reply · 0 boosts · 118 views · Apr ’25
Improving person segmentation and occlusion quality in RealityKit
I'm building an app that uses RealityKit and specifically ARConfiguration.FrameSemantics.personSegmentationWithDepth. The goal is to insert an AR object into the scene behind a person, and an additional AR object in front of the person, while being as photorealistic as possible. Through testing, I've noticed that many times the edges of the person segmentation mask are not well matched to the actual person, and parts of the person are transparent, with the AR object bleeding through. It's sort of like a "bad green screen" effect, which I'd expect to see a little bit, but not to this extent. I've been testing on iPhone 16, iPhone 14 Pro, iPad Pro 12.9-inch (6th generation), and iPhone 12 Pro, with similar results across all devices. I'm wondering what else I can do to improve this, whether code changes, platform (like different iPhone models), or environment (like lighting, distance, etc.). Attaching some example screen grabs and a minimal reproducible code sample. Appreciate any insights!

import ARKit
import SwiftUI
import RealityKit

struct RealityViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
        arView.renderOptions.insert(.disableMotionBlur)
        arView.renderOptions.insert(.disableDepthOfField)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }

        arView.session.run(configuration)
        arView.session.delegate = context.coordinator
        context.coordinator.arView = arView
        return arView // note: the original snippet was missing this return
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, ARSessionDelegate {
        var parent: RealityViewContainer
        weak var arView: ARView? // added so the delegate can reach the view
        var floorAnchor: ARPlaneAnchor?

        init(_ parent: RealityViewContainer) {
            self.parent = parent
        }

        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            if let arView, floorAnchor == nil {
                for anchor in anchors {
                    if let horizontalPlaneAnchor = anchor as? ARPlaneAnchor,
                       horizontalPlaneAnchor.alignment == .horizontal,
                       horizontalPlaneAnchor.transform.columns.3.y < arView.cameraTransform.translation.y { // filter out ceiling
                        floorAnchor = horizontalPlaneAnchor
                        let backgroundEntity = BackgroundEntity()
                        let anchorEntity = AnchorEntity(anchor: horizontalPlaneAnchor)
                        anchorEntity.addChild(backgroundEntity)
                        let foregroundEntity = ForegroundEntity()
                        backgroundEntity.addChild(foregroundEntity)
                        arView.scene.addAnchor(anchorEntity)
                        arView.installGestures([.rotation, .translation], for: backgroundEntity)
                        break // Stop after adding the first horizontal plane (floor)
                    }
                }
            }
        }
    }
}
1 reply · 0 boosts · 121 views · May ’25
SKScene editor canvas gone
I've recently run into an issue in Xcode where the .sks editor's preview canvas just vanishes for every project on my computer. I don't think it is an issue with my .sks files, because everything works as expected on another computer with the same files, and when it happens, it happens for ALL .sks files in all projects. There used to be menu items to toggle the canvas and its settings, but those are now gone for me in .sks files (they show up for Swift files that have previews, however). Any idea what is going on here? How do I get the canvas back? I literally cannot get any work done on my primary computer because of this...
1 reply · 0 boosts · 629 views · Dec ’25