Hello Apple Forums,
We are developing an iOS application that connects to a custom BLE accessory and sends control commands to it.
Our system architecture is as follows:
A separate hardware device collects data and sends it to our backend server via Wi-Fi.
The backend evaluates state changes and determines when the BLE accessory should update its display.
The iOS app acts purely as a BLE command executor for this accessory.
Our goal is to:
Maintain a BLE connection with the accessory while the app is in the background.
Receive state-change events from our backend server.
Upon receiving such events, send a BLE command to the accessory to update its state.
We understand that iOS does not allow arbitrary background execution. We would like to confirm whether there is any supported mechanism, entitlement, or program that allows:
Long-running background execution for BLE control, or
Server-originated events (other than APNs) to trigger background BLE actions.
If this is not supported, we would appreciate confirmation that APNs (silent push) is the only supported way to trigger such background BLE actions, or guidance on any recommended alternative architectures.
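For reference, the APNs-based flow we have in mind looks roughly like this (a sketch only; it assumes a silent push with content-available: 1, an accessory connection kept alive via the bluetooth-central background mode, and a helper class of our own called BLECommandExecutor):

```swift
import UIKit
import CoreBluetooth

// Hypothetical helper that owns the CBCentralManager and the connected peripheral.
final class BLECommandExecutor {
    func send(command: String, completion: @escaping (Bool) -> Void) {
        // ...locate the control characteristic and call
        // peripheral.writeValue(_:for:type:), then report the outcome.
        completion(true)
    }
}

class AppDelegate: UIResponder, UIApplicationDelegate {
    let ble = BLECommandExecutor()

    // Entry point for a silent push (aps: { "content-available": 1 }).
    // Delivery is best-effort and throttled by the system.
    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        guard let command = userInfo["accessoryCommand"] as? String else {
            completionHandler(.noData)
            return
        }
        ble.send(command: command) { success in
            completionHandler(success ? .newData : .failed)
        }
    }
}
```

We are aware that silent pushes are rate-limited and not guaranteed, which is why we are asking whether any other server-originated trigger is supported.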
Thank you for your guidance.
On Mac Studio systems (no built-in camera), macOS does not initialize camera services after a normal reboot if no physical camera is present. As a result, Continuity Camera does not appear anywhere in the system.
Observed behavior
System Information → Camera reports “No video capture devices were found.”
Continuity Camera (iPhone) is completely absent from camera lists.
Plugging in any USB UVC webcam immediately initializes camera services and causes both the USB camera and the iPhone (Continuity Camera) to appear.
The USB camera can then be unplugged and Continuity Camera continues working until the next reboot.
Reproduction steps
Use a Mac Studio (no built-in camera) on recent macOS.
Ensure no USB webcam or external camera is connected.
Reboot the Mac normally.
After login, open System Information → Camera.
Expected
Camera services should initialize even when no physical camera is present, allowing Continuity Camera to be available as the primary camera.
Actual
No camera devices are present unless a physical USB camera is connected at least once after boot.
This reproduces 100% of the time on Mac Studio and appears to be a camera service bootstrap issue where Continuity Camera cannot be the first camera device.
Issue has been filed via Feedback Assistant.
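For anyone reproducing this, the same absence can be confirmed programmatically (a minimal sketch; the .continuityCamera and .external device types require macOS 14 or later):

```swift
import AVFoundation

// List every video capture device the system currently exposes.
// After a clean reboot on a Mac Studio this prints nothing, even with an
// eligible iPhone nearby; once any UVC webcam has been plugged in, both
// the webcam and the Continuity Camera iPhone appear.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .external, .continuityCamera],
    mediaType: .video,
    position: .unspecified
)
if discovery.devices.isEmpty {
    print("No video capture devices were found.")
} else {
    for device in discovery.devices {
        print("\(device.localizedName) [\(device.deviceType.rawValue)]")
    }
}
```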
My team has developed an app with a basic Matter commissioning feature using the Matter framework and the MatterSupport extension.
Our app supports both iOS and Android. However, we ran into a problem: the control certificate generated by the iOS app could not control the device on the Android side, and the control certificate generated by the Android app could not control the device on the iOS side.
The Matter library used on Android is compiled from the connectedhomeip project.
Has anyone run into the same problem? How can it be solved?
Thank you
I am developing a standard UAC 2.0 device and encountered an issue where the channel names do not update according to the iChannelNames field in the Class Specific AS Interface Descriptor when switching between different channel counts.
For example:
AS1 (6 channels) is configured with the following channel names:
ADAT 1, ADAT 2, ADAT 3, ADAT 4, HP L, HP R
AS2 (4 channels) is configured with:
ADAT 1, ADAT 2, HP L, HP R
However, when switching from AS1 (6 channels) to AS2 (4 channels), the channel names displayed in Audio MIDI Setup do not reflect the change as expected. The actual result is:
ADAT 1, ADAT 2, ADAT 3, ADAT 4
The system simply hides the last two channels; the names of the remaining channels are not updated.
Initial Topology
My original topology was as follows:
Later, I discovered that macOS uses the iChannelNames field from the Input Terminal to display channel names. Therefore, I modified the USB device descriptors and updated the topology to the following:
To distinguish the channel names for different channel counts, each Input Terminal is assigned a unique iChannelNames value.
This method worked perfectly on macOS 15. However, after updating to macOS 26, this topology no longer displays the correct channel names.
Question
On macOS 26, what is the correct method to ensure that the channel names update dynamically when switching between different audio channel configurations?
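For reference, this is how we dump the names macOS actually assigns when comparing macOS 15 and macOS 26 (a sketch; it reads kAudioObjectPropertyElementName for each input element of the default input device, and the channel count of 6 matches our AS1 configuration):

```swift
import CoreAudio
import Foundation

// Find the default input device.
var deviceID = AudioObjectID(kAudioObjectUnknown)
var size = UInt32(MemoryLayout<AudioObjectID>.size)
var address = AudioObjectPropertyAddress(
    mSelector: kAudioHardwarePropertyDefaultInputDevice,
    mScope: kAudioObjectPropertyScopeGlobal,
    mElement: kAudioObjectPropertyElementMain)
AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                           &address, 0, nil, &size, &deviceID)

// Print the name of each input channel (element 0 is the master element).
for channel in 1...6 {
    var nameAddress = AudioObjectPropertyAddress(
        mSelector: kAudioObjectPropertyElementName,
        mScope: kAudioObjectPropertyScopeInput,
        mElement: AudioObjectPropertyElement(channel))
    var name: CFString = "" as CFString
    var nameSize = UInt32(MemoryLayout<CFString>.size)
    if AudioObjectGetPropertyData(deviceID, &nameAddress, 0, nil, &nameSize, &name) == noErr {
        print("channel \(channel): \(name)")
    }
}
```

Comparing this output before and after switching alternate settings makes the stale names easy to see.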
Hi everyone,
We are currently exploring ways to implement a frictionless Wi-Fi setup for our hardware devices without requiring a dedicated third-party application. We are interested in leveraging Apple's WAC (Wireless Accessory Configuration) to sync Wi-Fi credentials directly from iOS devices. However, we have struggled to find comprehensive technical documentation or specifications regarding the WAC service. Could anyone point us to the official source for these materials?
Additionally, we have a couple of technical questions:
1. We are testing WAC provisioning and found that the Home app can discover our device and successfully get it online. However, the flow always ends with a “Failed to add accessory” message. (The discovery sketch after this list shows how we verify the accessory is advertising WAC in the first place.)
Does WAC support imply that a device should be addable via the Home app? If not, why is the Home app able to discover and start the setup for a non-HomeKit WAC device?
2. Our device is already Apple AirPlay certified. Does implementing WAC require additional standalone certification, or is it covered under the existing MFi/AirPlay certification umbrella?
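As mentioned under question 1, the way we check that the accessory is advertising the WAC service at all is a small Bonjour browser (a sketch; _mfi-config._tcp is, to the best of our knowledge, the service type unconfigured WAC accessories advertise):

```swift
import Network
import Foundation

// Browse the local network for unconfigured WAC accessories.
let browser = NWBrowser(
    for: .bonjour(type: "_mfi-config._tcp", domain: nil),
    using: NWParameters()
)
browser.browseResultsChangedHandler = { results, _ in
    for result in results {
        if case let .service(name, type, domain, _) = result.endpoint {
            print("WAC accessory: \(name) \(type) \(domain)")
        }
    }
}
browser.start(queue: .main)
RunLoop.main.run()   // keep a command-line tool alive while browsing
```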
Any insights or guidance would be greatly appreciated. Thank you!
Hello,
I am an individual developer working on a macOS application using SwiftUI and RealityKit.
I would like to understand the feasibility of face-related tracking on macOS when using an external USB camera, compared to iOS/iPadOS.
Specifically:
• Does macOS provide an ARKit Face Tracking–equivalent API (e.g., real-time facial expressions, gaze direction, depth)?
• If not, is it common to rely on Vision / AVFoundation as alternatives for:
• Facial expression coefficients
• Gaze estimation
• Depth approximation
• In an environment without dedicated sensors such as TrueDepth, is it correct to assume that accurate depth data and high-fidelity blend shape extraction are realistically difficult?
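For reference, the Vision-based fallback we are evaluating looks roughly like this (a sketch; VNDetectFaceLandmarksRequest yields 2D landmarks plus roll/yaw/pitch, but none of the blend shapes or metric depth that TrueDepth-backed ARKit face tracking provides):

```swift
import Vision

// Run face landmark detection on one frame captured from the USB camera.
func detectFace(in cgImage: CGImage) {
    let request = VNDetectFaceLandmarksRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // Head pose in radians; Vision provides no gaze ray or depth map here.
            print("roll:", face.roll ?? 0, "yaw:", face.yaw ?? 0, "pitch:", face.pitch ?? 0)
            if let landmarks = face.landmarks?.allPoints {
                print("landmark count:", landmarks.pointCount)
            }
        }
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```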
Any clarification on official limitations, recommended alternatives, or relevant documentation would be greatly appreciated.
Thank you.
Environment:
iPhone 17 / iPhone 17 Pro (Apple N1 chip)
iOS 26.x
Xcode 26
Framework: Flutter app with native iOS BLE library (CoreBluetooth)
We have a production IoT app that communicates with BLE nodes (Nordic, PIC, EnOcean peripherals) using an advertising/scanning-based protocol — not GATT connections. The app broadcasts commands via CBPeripheralManager (advertising service UUIDs) and receives responses by scanning with CBCentralManager (reading manufacturer data and service UUIDs from advertisement packets). This workflow has been reliable across all iPhone models from iPhone 8 through iPhone 16 Pro Max.
On iPhone 17 devices, we are experiencing multiple failures in this workflow.
Architecture:
Sending commands: We use CBPeripheralManager.startAdvertising(_:) with CBAdvertisementDataServiceUUIDsKey to broadcast a UUID-encoded command to nearby nodes.
Receiving responses: We use CBCentralManager.scanForPeripherals(withServices: nil, options: [CBCentralManagerScanOptionAllowDuplicatesKey: true]) and filter responses in centralManager(_:didDiscover:advertisementData:rssi:) by matching CBAdvertisementDataServiceUUIDsKey or CBAdvertisementDataManufacturerDataKey against expected UUID masks.
Communication pattern: Advertise a command → stop advertiser → start scanner → wait for matching response → process result. Typical timeout is 1.5 seconds per exchange.
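In code, one exchange is roughly the following (a condensed sketch of our native layer; the class name and timing constants are ours):

```swift
import CoreBluetooth

final class NodeExchange: NSObject, CBPeripheralManagerDelegate, CBCentralManagerDelegate {
    private var advertiser: CBPeripheralManager!
    private var scanner: CBCentralManager!

    override init() {
        super.init()
        advertiser = CBPeripheralManager(delegate: self, queue: nil)
        scanner = CBCentralManager(delegate: self, queue: nil)
    }

    // Advertise a command, then switch to scanning for the response.
    func send(command: CBUUID) {
        advertiser.startAdvertising([CBAdvertisementDataServiceUUIDsKey: [command]])
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
            self.advertiser.stopAdvertising()
            self.scanner.scanForPeripherals(
                withServices: nil,
                options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
        }
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        // Match the response against our UUID mask / manufacturer data.
        if let data = advertisementData[CBAdvertisementDataManufacturerDataKey] as? Data {
            print("candidate response: \(data as NSData)")
        }
    }

    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {}
    func centralManagerDidUpdateState(_ central: CBCentralManager) {}
}
```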
Issues observed on iPhone 17:
peripheralManagerDidStartAdvertising behaviour change
After calling CBPeripheralManager.startAdvertising(_:), the delegate callback peripheralManagerDidStartAdvertising(_:error:) either fires with errors that did not occur on previous hardware, or advertising does not appear to reach the peripheral nodes at all. The same advertising payload works immediately when tested on iPhone 15/16.
Is the N1 chip's Bluetooth 6 stack handling CBAdvertisementDataServiceUUIDsKey advertising differently? Are there new constraints on advertising payload size or format?
Scanner returning fewer/no results with withServices: nil
Our scanner uses scanForPeripherals(withServices: nil) because we need to read manufacturer data from advertisement packets and filter using a custom UUID mask. On iPhone 17, we observe significantly fewer didDiscover callbacks compared to iPhone 15/16 in the same physical environment, with the same nodes advertising.
We understand that passing service UUIDs in withServices: is recommended, but our protocol requires reading raw manufacturer data bytes that aren't associated with a single service UUID — we use mask-based matching (e.g., filter mask 11110000-0000-0000-0000-000000000000 against scan results).
Has the N1 chip changed the rate or filtering behaviour of unfiltered BLE scans? Is there a new throttling mechanism?
Background scanning stops immediately
When the app moves to background, scanning appears to stop entirely on iPhone 17 — even with bluetooth-central in UIBackgroundModes. On iPhone 16, background scanning continued (at reduced intervals) and delivered results for peripherals advertising filtered service UUIDs.
Aggressive session termination on app backgrounding
Our advertise-then-scan sequences (typically 1.5s round-trip) are being interrupted when the user briefly switches apps. The CBPeripheralManager stops advertising and the CBCentralManager stops scanning, causing timeout errors. This was not observed on previous iPhone models with the same iOS background mode configuration.
Questions for Apple:
Are there documented changes to CoreBluetooth behaviour on the N1 Bluetooth 6 chip that affect advertising-based (non-GATT) communication patterns?
Has the scan response rate for scanForPeripherals(withServices: nil) been intentionally reduced on iPhone 17?
Is CBCentralManagerOptionRestoreIdentifierKey now required for reliable background scanning on iPhone 17, or is this a known regression?
Are there new advertising payload constraints (size, format, interval) that we should be aware of for the N1 chip?
What we've tried:
Added NSBluetoothAlwaysUsageDescription (and the legacy NSBluetoothPeripheralUsageDescription) to Info.plist
Confirmed Bluetooth permissions are granted
Tested with identical BLE nodes that work on iPhone 15/16
Verified CBManagerState.poweredOn before all operations
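One further thing we are preparing to test, per question 3 above, is opting into state restoration (a sketch; the restore identifier string is ours):

```swift
import CoreBluetooth

final class RestorableScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(
            delegate: self,
            queue: nil,
            options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.node-scanner"])
    }

    // Called when the system relaunches the app to deliver Bluetooth events.
    func centralManager(_ central: CBCentralManager,
                        willRestoreState dict: [String: Any]) {
        if let services = dict[CBCentralManagerRestoredStateScanServicesKey] as? [CBUUID] {
            print("restored scan for services: \(services)")
        }
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil)
        }
    }
}
```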
Any guidance or known workarounds would be greatly appreciated. Happy to provide sysdiagnose logs or a minimal reproducible sample project.
Hi all,
I’m facing a device-specific issue in a live production iOS app distributed privately via the App Store. The app crashes immediately after login on one client’s iPhone, while the same account works fine on other devices. There’s no crash log generated in Analytics, and the app simply pops back to the Home Screen.
Environment:
App: Production app on App Store
iOS version: 26.3
Devices: Only one device exhibits the crash; other iPhones work fine
Login flow: App calls an API and writes the response to a local SQLite database immediately after login
Distribution: App Store (private distribution); users install via redemption codes.
Observations:
All users on the problematic device crash immediately after login.
The crash does not occur on any other device, including devices running the same iOS version.
The client had already uninstalled and reinstalled the app via App Store cloud download, but the crash persisted.
No crash log appears in Analytics or Xcode (process just terminates).
Device restart had not been attempted before reinstall.
App does not use Keychain tokens; local DB is only SQLite in the app sandbox.
Hypotheses so far:
Corrupted binary or cached app installation on that device
SQLite database corruption or write failure
Device-specific OS/environment issue (temp files, file locks, provisioning)
iOS watchdog silently terminating the app during post-login DB write
Language / region differences unlikely
Questions:
Is it possible for a device to retain a corrupted app binary or cached installation even after uninstall + cloud download reinstall from the App Store?
Can uninstalling, restarting the device, and reinstalling guarantee a fresh binary and sandbox?
Are there any known iOS behaviors where a local SQLite write could trigger an instant crash on one device only, without generating crash logs?
Any other suggestions for diagnosing this device-specific post-login crash in a live production environment?
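In the meantime we are instrumenting the post-login write so the next occurrence leaves a trace (a sketch using the SQLite3 C API; the table and file names are illustrative, not our real schema):

```swift
import Foundation
import SQLite3

// Open the database defensively and log every SQLite error instead of
// assuming the post-login write succeeds.
func writeLoginResponse(_ payload: String) {
    let dbURL = FileManager.default.urls(for: .applicationSupportDirectory,
                                         in: .userDomainMask)[0]
        .appendingPathComponent("login_cache.sqlite")
    var db: OpaquePointer?
    guard sqlite3_open_v2(dbURL.path, &db,
                          SQLITE_OPEN_READWRITE | SQLITE_OPEN_CREATE, nil) == SQLITE_OK else {
        NSLog("sqlite open failed: \(String(cString: sqlite3_errmsg(db)))")
        sqlite3_close(db)
        return
    }
    defer { sqlite3_close(db) }

    // A corrupt database file would surface here rather than as a crash.
    var errMsg: UnsafeMutablePointer<CChar>?
    if sqlite3_exec(db, "PRAGMA integrity_check;", nil, nil, &errMsg) != SQLITE_OK,
       let msg = errMsg {
        NSLog("integrity_check failed: \(String(cString: msg))")
        sqlite3_free(msg)
        return
    }

    var stmt: OpaquePointer?
    guard sqlite3_prepare_v2(db, "INSERT INTO login_cache (payload) VALUES (?);",
                             -1, &stmt, nil) == SQLITE_OK else {
        NSLog("prepare failed: \(String(cString: sqlite3_errmsg(db)))")
        return
    }
    defer { sqlite3_finalize(stmt) }

    let SQLITE_TRANSIENT = unsafeBitCast(-1, to: sqlite3_destructor_type.self)
    sqlite3_bind_text(stmt, 1, payload, -1, SQLITE_TRANSIENT)
    if sqlite3_step(stmt) != SQLITE_DONE {
        NSLog("insert failed: \(String(cString: sqlite3_errmsg(db)))")
    }
}
```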
Thanks in advance for any guidance — this issue is affecting a client’s live usage, and we’d like to understand the root cause and best way to resolve it safely.
Hello Apple Developer Technical Support Team,
I’m working on an iOS banking/security SDK and we’re trying to match an Android feature that reads payment cards via NFC (EMV). On Android, this is implemented using an NFC scanning screen (e.g., “NfcScanActivity”) that can read EMV data from contactless credit/debit cards.
Could you please clarify the current iOS capabilities and App Store policy around this?
On iOS, is it currently possible for a third-party App Store app to read contactless credit/debit cards using Core NFC (i.e., accessing EMV application data/AIDs from payment cards)?
If this is possible, what are the supported APIs/frameworks and any entitlement requirements (if applicable)?
If this is not possible for App Store apps, could you recommend the closest acceptable alternatives for achieving a similar user outcome? For example:
Using Apple Pay / PassKit flows for payment-related experiences
Card scanning alternatives (camera-based OCR) for capturing card details (if allowed)
Using an external certified card reader accessory (MFi) and required approach/entitlements
Any other Apple-recommended approach for “card verification / identification” without reading EMV NFC data
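To make the first question concrete, the flow we mean is sketched below (illustrative only; it presumes the EMV PPSE could be listed under the com.apple.developer.nfc.readersession.iso7816.select-identifiers Info.plist key, which is exactly the point we would like confirmed or ruled out):

```swift
import CoreNFC

// Sketch: select the EMV PPSE ("2PAY.SYS.DDF01") on a contactless card.
final class CardReader: NSObject, NFCTagReaderSessionDelegate {
    private var session: NFCTagReaderSession?

    func begin() {
        session = NFCTagReaderSession(pollingOption: .iso14443, delegate: self)
        session?.alertMessage = "Hold the card near the top of the iPhone."
        session?.begin()
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}

    func tagReaderSession(_ session: NFCTagReaderSession, didInvalidateWithError error: Error) {
        print("session ended: \(error.localizedDescription)")
    }

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let first = tags.first, case let .iso7816(tag) = first else { return }
        session.connect(to: first) { _ in
            // SELECT PPSE: CLA 00, INS A4, P1 04, P2 00, data = "2PAY.SYS.DDF01"
            let apdu = NFCISO7816APDU(instructionClass: 0x00, instructionCode: 0xA4,
                                      p1Parameter: 0x04, p2Parameter: 0x00,
                                      data: Data("2PAY.SYS.DDF01".utf8),
                                      expectedResponseLength: 256)
            tag.sendCommand(apdu: apdu) { data, sw1, sw2, _ in
                print("SW=\(String(format: "%02X%02X", sw1, sw2)), \(data.count) bytes")
                session.invalidate()
            }
        }
    }
}
```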
Our goal is not to bypass security restrictions, but to provide a compliant solution on iOS comparable to Android’s NFC-based card reading, or to adopt an Apple-approved alternative if direct EMV reading is not supported.
If helpful, I can share a brief technical summary of the Android behavior and the exact data we need to obtain (e.g., whether it’s card presence verification vs. reading specific EMV tags).
Thank you for your guidance.
Best regards,
Imran