It looks like our apps are sometimes drawing a lot of power, for reasons I can't pin down. sysdiagnose has this in the power log:
Never mind. Including the output of sysdiagnose trips the forum's "sensitive language" filter, and it won't tell me what is sensitive, making this a waste of my time.
ETA: OK, I can attach the file: power.log
I've gone through the energy documentation, but it seems geared toward embedded development rather than macOS, so I'm not sure how to dig into this further. The extra complication, of course, is that we have a network extension, two daemons, and a GUI app. 😄
I am using Xcode 16 and macOS 14.7.2. Previously, using Instruments with an iPhone on iOS 14.3 worked normally, but after I upgraded to iOS 18, Instruments often can't find the library.
I have to restart Instruments to restore normal operation, but the problem recurs after a period of use.
I am trying to record the requests and responses in a WKWebView, but Instruments does not seem to record them. Is this to be expected?
The web view is set to inspectable, and I am using the HTTP Traffic instrument.
All the requests the app itself makes are recorded, but neither the initial request from the web view nor its subsequent traffic is.
When I use Safari to inspect the webView, all I see is the last page (even when I start the inspector before the first request is made).
How can I see these requests?
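For reference, the web view is configured roughly like this (a minimal sketch; the URL is a placeholder):

import WebKit

let webView = WKWebView(frame: .zero)
// Required so Safari Web Inspector can attach (macOS 13.3+ / iOS 16.4+).
webView.isInspectable = true
webView.load(URLRequest(url: URL(string: "https://example.com")!))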
I made a box with MDLMesh.newBox() and added normals:
import ModelIO
import MetalKit

// Assumes `device` is an existing MTLDevice.
let allocator = MTKMeshBufferAllocator(device: device)
let mdlMesh = MDLMesh.newBox(withDimensions: SIMD3<Float>(1, 1, 1),
                             segments: SIMD3<UInt32>(2, 2, 2),
                             geometryType: .triangles,
                             inwardNormals: false,
                             allocator: allocator)
mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.25)
After I convert to MTKMesh, the normals are (0, 0, 0) for a group of vertices. I can only inspect the geometry after I convert to MTKMesh. Is there a way to use the Geometry Viewer on an MDLMesh?
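For completeness, the conversion step is roughly this (a sketch; `device` is assumed to be the same MTLDevice the allocator was created with):

import MetalKit

// Convert the Model I/O mesh into a Metal-renderable mesh.
let mtkMesh = try MTKMesh(mesh: mdlMesh, device: device)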
Hello, I have a question about Core ML.
I load a Core ML model in the project and set the compute units to CPU+GPU.
When I used Instruments to analyze performance, I found a "prepare GPU request" overhead before each inference. I also checked the flame graph and found that memory was being allocated frequently.
Is this expected? Is there any way to avoid the frequent prepares?
I have tried some approaches, such as sharing the memory of the prediction input parameters, but they seem to be ineffective.
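For reference, the model is loaded roughly like this (a sketch; MyModel stands in for the generated model class, and input for its input type):

import CoreML

let config = MLModelConfiguration()
config.computeUnits = .cpuAndGPU  // CPU + GPU, no Neural Engine
let model = try MyModel(configuration: config)
let output = try model.prediction(input: input)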
Instruments: GPU Service reported error: Selected counter profile is not supported on target device
I could use the Metal System Trace before the most recent update, but now whenever I try to profile using the Metal Counter instrument, I get "[Warning] GPU Service reported error: Selected counter profile is not supported on target device." What is the issue here?
It seems as though using any initializer of SubscriptionOfferView or StoreView will create a memory leak.
This can be reproduced simply by adding either of these to your SwiftUI view (a self-contained repro sketch follows below):
SubscriptionOfferView(groupID: "yourgroupID", visibleRelationship: .all, useAppIcon: true)
or
StoreView(ids: ["monthly", "yearly"])
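For completeness, a self-contained version of the repro (the product IDs are the same placeholders as above):

import StoreKit
import SwiftUI

struct ReproView: View {
    var body: some View {
        // Presenting this view is enough to show the leaked objects in the Leaks instrument.
        StoreView(ids: ["monthly", "yearly"])
    }
}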
Tested on iOS 26 beta 2
Dear Apple Developer Support team,
I would like to request an official confirmation regarding the handling of transaction status in the App Store Server API, specifically for the GET /inApps/v1/transactions/{transactionId} endpoint.
Per our current understanding of the official documentation (Get Transaction Info), the API's behavior appears to be:
If a transaction is finalized and successfully processed by the App Store, querying this API returns HTTP 200 OK along with the transaction details.
If a transaction is still in a pending or deferred state (such as awaiting Ask to Buy approval or pending authorization), the API does not return 200 and instead responds with HTTP 404 Not Found or an appropriate error.
Could you please confirm if this behavior is accurate and officially supported?
Specifically:
Does a 200 OK response guarantee that a transaction is finalized and successfully recorded on the App Store servers?
In cases where a transaction is pending approval (e.g. Ask to Buy), is it correct that GET /transactions/{transactionId} returns 404 Not Found until the transaction is finalized? (See the sketch below.)
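For context, the server-side check we plan to build on this behavior is essentially the following sketch (JWT generation is elided; treating 404 as "not yet finalized" is exactly the assumption we are asking you to confirm):

import Foundation

func transactionIsFinalized(transactionId: String, jwt: String) async throws -> Bool {
    let url = URL(string: "https://api.storekit.itunes.apple.com/inApps/v1/transactions/\(transactionId)")!
    var request = URLRequest(url: url)
    request.setValue("Bearer \(jwt)", forHTTPHeaderField: "Authorization")
    let (_, response) = try await URLSession.shared.data(for: request)
    guard let http = response as? HTTPURLResponse else { throw URLError(.badServerResponse) }
    switch http.statusCode {
    case 200: return true   // finalized and recorded
    case 404: return false  // assumed: pending / not yet visible
    default: throw URLError(.badServerResponse)
    }
}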
We would greatly appreciate your confirmation to align our server-side logic for transaction validation accordingly.
Thank you very much for your support!
Kind regards,
cuongnx
Hello
We use Datadog Mobile Vitals in our app, and I'm trying to run some tools in Instruments for comparison. I'm not sure which instrument I should use for some of these metrics:
Slow Renders
Description: With slow-renders data, you can monitor which views take longer than 16 ms (the 60 Hz frame budget) to render.
Instruments equivalent: Hangs, including micro-hangs (?)
CPU Ticks Per Second
Description: RUM tracks CPU ticks per second for each view and the CPU utilization over the course of a session. The recommended range is <40 for good and <80 for moderate.
Instruments equivalent: CPU Profiler (?)
Frozen Frames
Description: Frames that take longer than 700 ms to render appear stuck and unresponsive in your application. These are classified as frozen frames.
Instruments equivalent: Hangs longer than 500 ms (?)
Memory Utilization
Description: The amount of physical memory used by your application, in bytes, for each view over the course of a session. The recommended range is <200 MB for good and <400 MB for moderate.
Instruments equivalent: Allocations (?)
I tried using Create ML from Xcode 26.0 beta 7 to generate a model using the "Word Tagging" template, and I received the error "Training progress unavailable - Unexpected error."
Using Create ML from Xcode 16.4, following the same documentation, I was able to build the model and use it in a test app.
I'd like to understand why Create ML from Xcode 26 no longer works.
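As a cross-check, the same kind of model can be trained from code with the CreateML framework (a sketch; the file name and column names are assumptions about the training data):

import CreateML
import Foundation

let data = try MLDataTable(contentsOf: URL(fileURLWithPath: "tags.json"))
let tagger = try MLWordTagger(trainingData: data,
                              tokenColumn: "tokens",
                              labelColumn: "labels")
try tagger.write(to: URL(fileURLWithPath: "WordTagger.mlmodel"))

If this trains cleanly, the data is fine and the failure is in the Xcode 26 Create ML app itself.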
Hello Apple team,
I am using xctrace to record an Allocations trace on iOS. For example:
xctrace record \
  --template "Allocations" \
  --launch com.example.myapp \
  --time-limit 30s \
  --output alloc.trace
After recording, I can export the results in Allocations List format (flat list of allocations) using:
xcrun xctrace export --input ./alloc.trace --xpath '/trace-toc/run/tracks/track[@name="Allocations"]/details/detail[@name="Allocations List"]' --output ./alloc.xml
This works fine and produces an XML output.
However, what I really need is to export the data in Call Tree format (as shown in the Instruments GUI). I checked xctrace export --help, but it seems that the Allocations template only supports exporting the List view, not the Call Tree breakdown.
My question is:
👉 Is there a way to export an Allocations trace in XML with Call Tree details using xctrace?
👉 If not, is there an API or recommended workflow to automate this instead of exporting manually from Instruments GUI?
Thanks in advance for your help!
I'm working on a custom instrument that displays intervals from os_signpost data. I'd like to color the intervals in the graph based on data from an accompanying aggregate. For example, color the interval red if its duration is greater than 3 standard deviations from the mean. Is this possible?
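For context, the intervals come from signpost pairs along these lines (a sketch; the subsystem and category names are placeholders):

import os.signpost

let log = OSLog(subsystem: "com.example.app", category: "Intervals")
let spid = OSSignpostID(log: log)
os_signpost(.begin, log: log, name: "Work", signpostID: spid)
// ... the work being measured ...
os_signpost(.end, log: log, name: "Work", signpostID: spid)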
I have been working on battery consumption analysis for my application, and as part of this effort, I wanted to understand how competitor apps behave under similar usage conditions.
To do this, I downloaded competitor apps from the App Store and attached them to Instruments via Xcode. I then executed a defined set of manual test scenarios to simulate real user behavior. During these tests, the iPhone was connected to a Mac and charging continuously, which meant that System Power Usage logs were not generated in Instruments.
However, I was able to capture detailed metrics such as:
Network usage
CPU load
GPU activity
Display and brightness impact
Other runtime performance characteristics
Since direct battery drain data was unavailable, I used derived analysis (with AI assistance) to estimate approximate power consumption based on the above metrics, assuming real-device (battery-powered) conditions.
According to Apple documentation, System Power Usage in Instruments is not directly tied to the device’s battery percentage. Instead, it appears to be computed using contributing factors such as CPU, network, display, and other subsystem activity. This raises a few important questions about data reliability and methodology.
Key questions:
How reliable are Instruments-based metrics (CPU, network, display, GPU) for estimating real-world battery consumption when the device is charging?
Can these metrics be safely used as a comparative baseline between competitor applications, even if absolute battery drain values are unavailable?
Is the System Power Usage instrument essentially a derived model based on subsystem activity, and if so, does it remain accurate when the device is not discharging?
From Apple’s perspective, is this a valid approach for relative power comparison, provided that:
The same device is used
OS version is identical
Test scenarios are consistent and repeatable
Based on these findings, would it be reasonable to proceed with instrumenting our own application, running the same scenarios, and drawing conclusions using relative comparisons rather than absolute battery percentages?
The intent is not to claim exact battery drain numbers, but to establish a directionally correct and repeatable comparison that can guide performance optimizations in our own application.
I would like to understand whether this methodology aligns with Apple’s recommended practices, or if there are limitations or inaccuracies I should be aware of before relying on these results for decision-making.
What is the difference between INST_ALL and Instructions (FIXED_INSTRUCTIONS)?
Also, what about CORE_ACTIVE_CYCLE vs. Cycles (FIXED_CYCLES)?
Hi everyone,
I'm developing a cross-platform mobile app (React Native), but I don't currently own a Mac.
What is the most reliable and professional way to:
Build the iOS version
Test it properly (real device / TestFlight)
Upload it to the App Store
Are cloud Mac services (like MacinCloud, AWS Mac, etc.) considered stable for production release workflows?
Is there any fully supported workflow without direct access to a physical Mac?
Would appreciate real-world experience from developers who faced the same situation.
Thanks in advance.
When I try to run dtruss on a command-line program (freshclam), I see:
$ sudo dtruss -a /usr/local/bin/freshclam 2>&1 | tee ~/tmp/dtruss.out
dtrace: system integrity protection is on, some features will not be available
dtrace: failed to execute /usr/local/bin/freshclam: DTrace cannot instrument translated processes
I did some research and found advice on how to enable DTrace by running:
csrutil enable --without dtrace
in a terminal in macOS recovery mode. When I do that, I see a warning saying this is an unsupported configuration that will allow unsigned kernel modules to be loaded. This is not what I want; I just want to run dtruss on a program while keeping all the other SIP protections in place. Why can't I just use sudo to grant the privileges DTrace needs?
All of this has me wondering whether Apple intends for developers to use dtruss/dtrace on current macOS at all.
I've discovered what appears to be a system-level memory leak when pressing any key in SwiftUI projects. This issue occurs even in a completely empty SwiftUI project with no custom code or event handlers.
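"Empty" here means essentially the default app template, along these lines:

import SwiftUI

@main
struct EmptyApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Press any key")
        }
    }
}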
When monitoring with Instruments' Leaks tool, I observe multiple memory leaks each time any key is pressed. These leaks consist primarily of:
NSExtraData objects (240 bytes each)
NSMenuItem objects (112 bytes each)
Other AppKit and Foundation objects
Has anyone else encountered this issue? How can I fix this behavior? While the leaks are small (about 5-6KB per keypress), they could potentially accumulate in applications where keyboard input is frequent.
I want to check the disassembly of a vImage function in Instruments.
But when I double-click a frame in the Time Profiler stack, I see nothing.
My Mac has an Intel chip, and my iPhone is a 14 Pro Max.
The Xcode version is 16.3.
How can I fix this? (Maybe a .dSYM is missing from the iOS device support files?)
I kept CoreLocation's startUpdatingLocation running for a full day and used a Performance trace (PowerProfiler) to track power usage during that time. The trace file was generated successfully on the iOS device, and I later transferred it to my MacBook.
However, when I tried to open the .atrc file, I received the following warning:
The document cannot be imported because of an error: File ‘/Users/jun/Downloads/PowerProfiler_25-06-16_181049_to_25-06-17_091037_001.atrc’ doesn’t contain any events.
Why is this happening? Is there a known issue with PowerProfiler in iOS 26, or am I missing something in the tracing setup?
Note: The .aar file and the extracted .atrc file are not attached here, as forum uploads do not support these formats.
Hello,
I wanted to try the new Bottleneck Analysis mode showcased in a recent Apple video, but when I select the CPU Counters template in Instruments, there's no such option, just the same old "Sample by Time/Events" choices.
I have the latest Xcode 16.4 and macOS 15.5 on an M4 Max. While Instruments shows version 16.0 in its About dialog for some reason (a bug?), it definitely comes from the Xcode 16.4 package, and its build ID (16F6) matches Xcode 16.4. I also checked on another M1 system (fully updated as well), and it's the same there.
Any clues why Bottleneck analysis is missing?
Regards,
Maxim