Hello,
We have been working on FPS for a while and managed to play an encrypted asset with it, using FairPlay Streaming Server SDK version 2 (2.03).
Now we are working on "Download and Play Offline" functionality. For this I downloaded FairPlay Streaming Server SDK 3.0; in the SDK folder there is an "OfflineHLSGuide_withFPS.pdf" document beside "FairPlayStreaming_PG.pdf", which explains how offline playback works and what needs to be done.
That document ("OfflineHLSGuide_withFPS.pdf") shows a new version of the Content Key Duration TLLV, which is slightly different from the structure shown in "FairPlayStreaming_PG.pdf". For the Content Key Duration TLLV (0x47acf6a418cd091a), "FairPlayStreaming_PG" lists "Lease Duration" in bytes 16-19, while the OfflineHLSGuide document marks bytes 16-19 as "Reserved".
Where has Lease Duration gone for offline? Or should I use one Content Key Duration TLLV for non-persistent keys and the other version of the Content Key Duration TLLV for persistent (offline) keys?
Thanks
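For reference, a minimal sketch of how we currently read the field in question, assuming the offsets are counted from the start of the TLLV, i.e. after the 16-byte header (8-byte tag, 4-byte block length, 4-byte value length) bytes 16-19 are the first four value bytes; the function and parameter names are our own:
import Foundation
// Read bytes 16-19 of a Content Key Duration TLLV (tag 0x47acf6a418cd091a).
// This field is "Lease Duration" per FairPlayStreaming_PG and "Reserved"
// per OfflineHLSGuide_withFPS. Integers are big-endian.
func leaseDuration(fromContentKeyDurationTLLV tllv: Data) -> UInt32? {
    let bytes = [UInt8](tllv)
    guard bytes.count >= 20 else { return nil }
    return bytes[16..<20].reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
}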
Hello,
I am currently developing a live streaming application using AVPlayer to play LL-HLS (Low-Latency HLS) content.
During our testing phase, we consistently encountered the following error in the logs:
CoreMediaErrorDomain Code=-15517
The challenge we are facing is that the error description is quite vague. It only provides cryptic messages such as "Key not found" or "No value information," which makes it extremely difficult to identify the root cause or perform a deep-dive analysis.
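For context, we surface these entries from the player item's error log; a minimal sketch of our collection code (the wrapper function is our own):
import AVFoundation
// Print every error log event as new entries arrive; the caller retains the token.
func observeErrorLog(of item: AVPlayerItem) -> NSObjectProtocol {
    return NotificationCenter.default.addObserver(forName: .AVPlayerItemNewErrorLogEntry,
                                                  object: item,
                                                  queue: .main) { _ in
        for event in item.errorLog()?.events ?? [] {
            // -15517 appears here with domain CoreMediaErrorDomain.
            print("domain: \(event.errorDomain), code: \(event.errorStatusCode), " +
                  "comment: \(event.errorComment ?? "none"), uri: \(event.uri ?? "none")")
        }
    }
}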
I have searched through the official Apple Developer documentation and technical notes, but I couldn’t find any specific reference to what Code -15517 signifies in the context of LL-HLS or CoreMedia.
Regarding this issue, I have the following questions:
What is the specific meaning of this error code (-15517)? Does it relate to missing tags in the HLS manifest, or is it an internal state issue within the AVPlayer stack?
Specifically, I would like to know if this is a critical error that disrupts playback, or if it is just a warning that can be safely ignored.
Is there any additional logging or debugging tool you would recommend to further investigate "Key not found" issues in LL-HLS?
Any insights or guidance from the community or Apple engineers would be greatly appreciated.
Thank you in advance for your help.
Hi,
Has anyone been able to protect the audio part of FairPlay protected content from being captured as part of screen recording on Safari/iOS (PWA and/or online web app)?
We have tried many things but could not prevent the audio from being recorded.
Same app and content on Safari/Mac does not allow audio to be recorded. Any tips?
Hello,
I am reviewing the sample code in FairPlay Streaming SDK 26 and found what I believe is a mistake.
It appears in the server code, in both the Swift and Rust versions.
There is an if statement that compares "ProtocolVersionUsed" (spcData.versionUsed) against the SPCVersion1 constant. However, "ProtocolVersionUsed" and the SPC version are different things, so shouldn't it be using a different constant value?
[createContentKeyPayload.swift]
// Fallback to version 1 if content can have encrypted slice headers, which need to be decrypted separately. Slice headers are not encrypted when using CBCS.
if serverCtx.spcContainer.spcData.versionUsed == base_constants.SPCVersion.v1.rawValue &&
[createContentKeyPayload.rs]
// Fallback to version 1 if content can have encrypted slice headers, which need to be decrypted separately. Slice headers are not encrypted when using CBCS.
if (serverCtx.spcContainer.spcData.versionUsed == SPCVersion::v1 as u32) &&
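For clarity, this is roughly the check we would have expected instead; note that the ProtocolVersion.v1 constant below is hypothetical, not an actual SDK identifier:
// Hypothetical: compare the protocol-version field against a protocol-version constant.
if serverCtx.spcContainer.spcData.versionUsed == base_constants.ProtocolVersion.v1.rawValue &&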
Thank you.
Hello,
I’m using a valid certificate bundle generated with SDK 26 (combined RSA‑1024 + RSA‑2048).
However, all my devices currently still generate SPC v2 during playback, including my iPhone 16 under iOS 26.2.
Apple staff mentioned that future iOS versions will send SPC v3 when using an SDK 26 certificate bundle.
Could you please clarify:
Which iOS/macOS versions will first support SPC v3?
Are there any additional client‑side requirements (Safari version, playback APIs, headers, etc.) to trigger SPC v3?
Is there any way to test SPC v3 today, e.g., using beta builds?
Thank you!
Hello,
I am developing a custom player SDK based on AVPlayer.
While testing LL-HLS streams, I intermittently encounter the following error: Error Domain=CoreMediaErrorDomain Code=-12880
Since I cannot find documentation for this specific code, could you please clarify its meaning?
Specifically, I would like to know if this is a critical error that disrupts playback, or if it is just a warning that can be safely ignored.
Any insights would be appreciated.
Thank you.
According to the documentation (https://developer.apple.com/documentation/avfoundation/avcontentkeyrequest/originatingrecipient?changes=_3&language=objc), starting with iOS 18.4 I can get the AVContentKeyRecipient from an AVContentKeyRequest. But when I try to get it, I get a crash. What could be the issue?
I want to note that I add the asset to the AVContentKeySession using the addContentKeyRecipient method (https://developer.apple.com/documentation/avfoundation/avcontentkeysession/addcontentkeyrecipient(_:)?changes=_3&language=objc).
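For reference, a minimal sketch of our setup and the access that crashes (the delegate wiring and all names other than the AVFoundation APIs are our own):
import AVFoundation
final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        if #available(iOS 18.4, *) {
            // Reading this property is where we observe the crash.
            print("recipient:", String(describing: keyRequest.originatingRecipient))
        }
        // ... normal key request handling continues here ...
    }
}
let keyDelegate = KeyDelegate()
let session = AVContentKeySession(keySystem: .fairPlayStreaming)
let asset = AVURLAsset(url: URL(string: "https://example.com/stream.m3u8")!)
session.setDelegate(keyDelegate, queue: DispatchQueue(label: "cks.queue"))
session.addContentKeyRecipient(asset)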
Hi
We’re updating our KSM to support SPC v2/v3 and currently operate with both legacy SDK4 credentials (ASK + 1024 cert) and SDK26 credentials (certificate bundle + provisioning data + 1024/2048 keys).
Our client apps run across a wide range of iOS/tvOS versions, so we want to follow Apple’s recommended client strategy for certificate selection. The docs describe SHA‑1 vs SHA‑256 in the SPC header, but do not specify which OS versions should use SDK4 vs SDK26 credentials.
Could you clarify:
Is there an official minimum iOS/tvOS version where you recommend SDK26 credentials for client apps?
For older OS versions (e.g. iOS 15), is SDK4 still the recommended choice for client apps?
Are there any official migration guidelines for client apps moving from SDK4 to SDK26 credentials?
Thanks in advance.
Hello,
I have a problem generating a 2048-bit FairPlay Streaming certificate.
I tried generating an SDK v26.x certificate in two ways.
(1) Use existing certificate
(2) Create new certificate
However, in both cases, Apple gives me a certificate bundle containing a 1024-bit certificate.
(fps_certificate.bin)
I uploaded a 2048-bit CSR when creating the certificate.
Just to note, I created an SDK v4.x certificate a few years ago.
Has anyone bumped into the same issue?
Or am I missing something?
Hello,
I have implemented Low-Latency Frame Interpolation using the VTFrameProcessor framework, based on the sample code from https://developer.apple.com/kr/videos/play/wwdc2025/300. It is currently working well for both LIVE and VOD streams.
However, I have a few questions regarding the lifecycle management and synchronization of this feature:
1. Common Questions (Applicable to both Frame Interpolation & Super Resolution)
1.1 Dynamic Toggling
Do you recommend enabling/disabling these features dynamically during playback?
Or is it better practice to configure them only during the initial setup/preparation phase?
If dynamic toggling is supported, are there any recommended patterns for managing VTFrameProcessor session lifecycle (e.g., startSession / endSession timing)?
1.2 Synchronization Method
I am currently using CADisplayLink to fetch frames from AVPlayerItemVideoOutput and perform processing (a sketch of this loop follows these questions).
Is CADisplayLink the recommended approach for real-time frame acquisition with VTFrameProcessor?
If the feature needs to be toggled on/off during active playback, are there any concerns or alternative approaches you would recommend?
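For reference, a minimal sketch of our current acquisition loop, with the processing step stubbed out (class and method names are our own):
import AVFoundation
import QuartzCore
final class FrameTap {
    private let output = AVPlayerItemVideoOutput(pixelBufferAttributes: nil)
    private var displayLink: CADisplayLink?
    func attach(to item: AVPlayerItem) {
        item.add(output)
        let link = CADisplayLink(target: self, selector: #selector(tick))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }
    @objc private func tick(_ link: CADisplayLink) {
        // Ask the output for the frame that should be on screen now.
        let hostTime = link.timestamp + link.duration
        let itemTime = output.itemTime(forHostTime: hostTime)
        guard output.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime,
                                                       itemTimeForDisplay: nil) else { return }
        process(pixelBuffer)
    }
    private func process(_ buffer: CVPixelBuffer) { /* hand off to the VTFrameProcessor session */ }
    func detach() {
        displayLink?.invalidate()
        displayLink = nil
    }
}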
1.3 Supported Resolution/Quality Range
What are the minimum and maximum video resolutions supported for each feature?
Are there any aspect ratio restrictions (e.g., does it support 1:1 square videos)?
Is there a recommended resolution range for optimal performance and quality?
2. Frame Interpolation Specific Questions
2.1 LIVE Stream Support
Is Low-Latency Frame Interpolation suitable for LIVE streaming scenarios where latency is critical?
Are there any special considerations for LIVE vs VOD?
3. Super Resolution Specific Questions
3.1 Adaptive Bitrate (ABR) Stream Support
In ABR (HLS/DASH) streams, the video resolution can change dynamically during playback.
Is VTLowLatencySuperResolutionScaler compatible with ABR streams where resolution changes mid-playback?
If resolution changes occur, should I recreate the VTLowLatencySuperResolutionScalerConfiguration and restart the session, or does the API handle this automatically?
3.2 Small/Square Resolution Issue
I observed that 144x144 (1:1 square) videos fail with error:
"VTFrameProcessorErrorDomain Code=-19730: processWithSourceFrame within VCPFrameSuperResolutionProcessor failed"
However, 480x270 (16:9) videos work correctly.
minimumDimensions reports 96x96, but 144x144 still fails. Is there an undocumented restriction on aspect ratio or a practical minimum resolution?
3.3 Scale Factor Selection
supportedScaleFactors returns [2.0, 4.0] for most resolutions.
Is there a recommended scale factor for balancing quality and performance?
Are there scenarios where 4.0x should be avoided?
The documentation on this specific topic seems limited, so I would appreciate any insights or advice.
Thank you.
I am trying to build the server for testing on Linux (AlmaLinux 9 VM):
NAME="AlmaLinux"
VERSION="9.7 (Moss Jungle Cat)"
ID="almalinux"
ID_LIKE="rhel centos fedora"
VERSION_ID="9.7"
PLATFORM_ID="platform:el9"
PRETTY_NAME="AlmaLinux 9.7 (Moss Jungle Cat)"
ANSI_COLOR="0;34"
[azuki@AlmaDevVM ~]$ uname -m
x86_64
I have tried the following steps:
Before starting, I ensured that Swift 6 was installed, referring to https://www.swift.org/install/ for instructions.
Build the library
In Terminal, I used the following commands to compile the Swift library:
cd Development/Key_Server_Module/Swift
swift build -Xbuild-tools-swiftc -DTEST_CREDENTIALS
After building the library, I ran the test cases to ensure the library behaves as expected. All unit tests pass with the development credentials.
• Since I was using an x86_64 machine:
export LD_LIBRARY_PATH=./Sources/prebuilt/x86_64-unknown-linux-gnu/
Run all tests:
swift test -Xbuild-tools-swiftc -DTEST_CREDENTIALS --disable-swift-testing
Build the server
Build the server: Apache
Before starting, ensured the following:
a. Installed Apache HTTPD and the dev tools, using the following command:
yum install httpd httpd-devel redhat-rpm-config
b. After this, integrated the Swift library built above into the Apache server environment. Used the following command to build the server module with apxs:
• Since I was using an x86_64 machine:
apxs -i -a -c \
  -Wl,-L${PWD}/.build/x86_64-unknown-linux-gnu/debug/ \
  -Wl,-lswift_fpssdk \
  -Wl,-L${PWD}/Sources/prebuilt/x86_64-unknown-linux-gnu -lfpscrypto \
  -Wl,-R${PWD}/.build/x86_64-unknown-linux-gnu/debug \
  server_setup/mod_fps.c
c. Next, copied the dependent libraries to the Apache modules folder using these commands:
• If using an x86_64 machine:
cp Sources/prebuilt/x86_64-unknown-linux-gnu/libfpscrypto.so /usr/lib64/httpd/modules/libfpscrypto.so
cp .build/x86_64-unknown-linux-gnu/debug/libswift_fpssdk.so /usr/lib64/httpd/modules/libswift_fpssdk.so
d. Configuring Apache HTTPD
Configured Apache HTTPD by adding the module and handler to the Apache HTTPD configuration (/etc/httpd/conf/httpd.conf). Note that the apxs command in the previous step may already have added the LoadModule line automatically.
Listen 8080
LoadFile /usr/lib64/httpd/modules/libfpscrypto.so
LoadFile /usr/lib64/httpd/modules/libswift_fpssdk.so
LoadModule fps_module /usr/lib64/httpd/modules/mod_fps.so
<Location "/fps">
SetHandler fps_handler
</Location>
Copy the credentials to the Apache modules folder.
cp -r ../credentials /usr/lib64/httpd/modules/
export FPS_CERT_PATH=/usr/lib64/httpd/modules/credentials/test_certificates.json
e. Run your server
You can run the Apache HTTPD server with the configured module by using the following command:
httpd -D FOREGROUND
No issues seen up to this step.
Get SDK version
[azuki@AlmaDevVM Key_Server_Module]$ curl localhost:8080/fps/v
26.0.0
But when I try to generate a license:
[azuki@AlmaDevVM Key_Server_Module]$ curl -d ../Test_Inputs/iOS/spc_ios_hd_lease_2048.json localhost:8080/fps
{"fairplay-streaming-response":{"create-ckc":[{"id":1,"status":-42601}]}}
Can you please suggest what I might be missing here?
Hello,
I am currently developing a video player using a custom AVPlayer-based SDK and testing LL-HLS live streaming.
I encountered a specific error, CoreMediaErrorDomain -15418, during playback. I have searched through the official documentation and the forums, but I could not find any information regarding this error code.
I would like to inquire about the following:
Description & Cause: What does the error code -15418 specifically represent in the context of CoreMedia and LL-HLS?
Severity: Is this a critical error that halts playback, or is it merely a warning?
Environment Details:
iOS Version: iOS 26.2
Device: iPhone 15 Pro Max
Stream Type: LL-HLS (Low-Latency HLS)
Impact: Quality drops
Any insights or references to documentation would be greatly appreciated.
Thank you.
Hi there,
We're working on offline playback of DRM tracks. The persistent keys (also known as track licenses) for offline playback are stored locally on the device and are served from cache when a user initiates playback of a downloaded track.
Our persistent keys have a limited validity time and need to be refreshed when they expire. To prevent a situation where a persistent key expires while the user is offline, we've decided to eagerly refresh these keys one week before their expiration date. To make that happen we need to be able to obtain the expiration date of the given track license.
We've been attempting to use the makeSecureTokenForExpirationDateOfPersistableContentKey API to facilitate this process. The documentation states that this API returns a secret token representing the persistent key, which we can then exchange with our license server for the expiration date: https://developer.apple.com/documentation/avfoundation/avcontentkeysession/makesecuretokenforexpirationdate(ofpersistablecontentkey:completionhandler:)?language=objc
However, every time we call makeSecureTokenForExpirationDateOfPersistableContentKey, we receive an error with code -46250. We haven't been able to find any public references or documentation for this specific error code, which is preventing us from troubleshooting the issue. We are conducting our tests on a physical device, as the simulator does not support FairPlay playback. We don't use the dual-expiry approach.
Is our understanding of how to obtain the expiration timestamp correct? Are we using the makeSecureTokenForExpirationDateOfPersistableContentKey API as it was intended? What does the -46250 error code mean, and what steps should we take to fix our FairPlay implementation to make this work?
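For reference, a minimal sketch of how we invoke the API (the wrapper function and the server exchange are our own):
import AVFoundation
// keyData is the persistable content key blob we stored at download time.
func fetchExpirationToken(session: AVContentKeySession, keyData: Data) {
    session.makeSecureToken(forExpirationDateOfPersistableContentKey: keyData) { token, error in
        if let error = error {
            // This is where we consistently receive code -46250.
            print("secure token error: \(error)")
            return
        }
        guard let token = token else { return }
        // Next step: exchange `token` with our license server for the expiration date.
        print("secure token ready (\(token.count) bytes)")
    }
}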
Thanks in advance for your assistance.
Hello, our application is unable to output FairPlay-protected content over HDMI to a TV via the official Lightning HDMI AV Adapter. Checking the console log for mediaplaybackd, we found that a CoreMediaErrorDomain Code=-19156 is raised, but we are unable to find out what this error code means.
default 11:18:15.121584+0800 mediaplaybackd keyboss ckb_customURLReadCallback: 0x7fa62f800 60/0 customURLReqID 4 isComplete 1 err -19156 error <private> (0) dokeyCallbacksExist 0
default 11:18:15.121670+0800 mediaplaybackd keyboss ckb_processErrorForRequest: 0x7fa62f800 60/0 handler 4 err 0
default 11:18:15.121752+0800 mediaplaybackd <<<< FigCustomURLHandling >>>> curll_cancelRequestOnQueue: 0x7fa031360: requestID: 4
default 11:18:15.121932+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 reqFin err Error Domain=CoreMediaErrorDomain Code=-19156 (-19156) dokeyCallbacksExist 0
default 11:18:15.122025+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 retry
default 11:18:15.123195+0800 mediaplaybackd <<<< FigCPECryptorPKD >>>> PostKeyRequestErrorOccurred: 0x7fab7be80 029592C2-093D-400D-B57F-7AB06CC292D1 key request error: Error Domain=CoreMediaErrorDomain Code=-19160 (-19160)
The ASk is used by the KSM to derive the dASk, which is then used to decrypt the SK...R1.
If the only thing we give the client is the certificate, how does it encrypt the SK...R1 so that the server is able to process it?
It would be nice to know how it works in general, because I've been getting questions about it and can't provide a helpful answer.
Thanks in advance.
Hello Apple team and developer community,
I am preparing a visionOS app for a fair environment, where we want to automatically stream the current experience to a nearby monitor via AirPlay, without requiring guests or staff to manually interact with the Control Center or AirPlay pickers all the time.
The goal is to provide a smooth, frictionless setup so attendees can focus on the demo, not the configuration.
Feature Request:
A supported API or method to programmatically start/stop AirPlay video streaming (mirroring or external playback) from within a visionOS app, allowing the current experience to be instantly displayed on an external monitor or Apple TV for the audience.
Context & Rationale:
In a trade fair or exhibition setting, rapid guest turnaround and minimal staff intervention are crucial. Having to manually guide each visitor through AirPlay setup is impractical.
As I understand it, AVRoutePickerView can be used for this on iOS/macOS, but it is not available on visionOS. Enabling similar automated streaming on visionOS would make the device far more suitable for live demos and public showcases.
Questions:
Are there any supported workarounds or best practices for enabling automated screen streaming or AirPlay initiation on visionOS in public demo environments that I missed?
Is Apple considering adding programmatic AirPlay control or accessibility features to support such use cases in future visionOS releases?
Thank you for considering this request! If there are recommended patterns, entitlements, or accessibility solutions we could explore for trade fair scenarios, your guidance would be greatly appreciated.
Best regards,
Julian Zürn - IPI, HS Kempten
I want to develop an app for real-time spatial video streaming from one Apple Vision Pro to another, with playback on the receiving device, like MV-HEVC. Is this possible? If so, how can it be done?
Hi,
I'm trying to create a FairPlay Streaming Certificate for the SDK 26.x version.
Worth mentioning that we already have two certificates (1024-bit and 2048-bit), and we are only given the option to reuse our previous 1024-bit certificate (which we do not want, because we want a 2048-bit cert).
Our main issue is that when I upload a new CSR file, the "Continue" button stays gray and we cannot move forward in the process.
The CSR file has been created with this command:
openssl req -out csr_2048.csr -new -newkey rsa:2048 \
  -keyout priv_key_2048.pem \
  -subj /CN=SubjectName/OU=OrganizationalUnit/O=Organization/C=US
Some help will be appreciated.
Thanks in advance
Best,
We're troubleshooting ScreenCaptureKit (SCK) issues. They occur in a relatively small share of sessions, but the lack of context, and of any way to advise the customer on how to make the behavior more predictable and reliable, is problematic.
Generally, there are two distinct issues, which may or may not have the same root cause:
Failure to establish an SCK session. This usually manifests within the app as the SCShareableContent.getWithCompletionHandler call either never invoking the completion handler or taking a prohibitively long time (we usually give it 3-10 sec before giving up; see the sketch after the log note below). In the system log it may look like this:
(log omitted - suspecting it triggers the content filter)
Note the 6-second delay to completion of fetchShareableContentWithOption (normally it's a 30-40 ms operation).
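For reference, this is roughly how we guard the call (the 10-second budget and all names are ours):
import Foundation
import ScreenCaptureKit
// Fetch shareable content, but fail after a deadline instead of waiting forever.
func fetchShareableContent(timeout: TimeInterval = 10,
                           completion: @escaping (SCShareableContent?, Error?) -> Void) {
    let lock = NSLock()
    var finished = false
    func finish(_ content: SCShareableContent?, _ error: Error?) {
        lock.lock(); defer { lock.unlock() }
        guard !finished else { return }
        finished = true
        completion(content, error)
    }
    SCShareableContent.getWithCompletionHandler { content, error in
        finish(content, error)
    }
    DispatchQueue.global().asyncAfter(deadline: .now() + timeout) {
        // The completion handler never fired; give up.
        finish(nil, NSError(domain: "app.capture", code: -1,
                            userInfo: [NSLocalizedDescriptionKey: "getWithCompletionHandler timed out"]))
    }
}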
Sometimes we'd see the stream established, but some minutes (or even hours) into the recording we'd stop receiving frames.
Both scenarios are likely to occur when disk space is low, with a reliable repro of problem #2 below 8 GB of free space (in that case, we've seen replayd silently dropping the session without ever notifying the client ... improving the API could go a long way there). However, among recent occurrences, while most machines had less than 100 GB available, we've seen it on machines with as much as 500 GB free.
Unfortunately, it's almost never reproducible in a dev environment, so we have to rely on diagnostics we can collect in the field -- which have revealed nothing obvious yet.
I'd like to better understand the root cause of both scenarios and/or what specific framework behaviors can cause them.
Hi,
I understand that AVPlayer/AVFoundation doesn’t natively play MPEG-DASH manifests (.mpd) today, while HLS is supported and widely documented by Apple.
I’m not asking for roadmap commitments, but I’d like to understand whether there is any publicly documented rationale for not supporting DASH/MPD in AVFoundation (e.g., technical constraints, platform integration, DRM ecosystem, power/performance considerations, etc.).
Questions:
Is there any Apple statement / documentation explaining why DASH (MPD) isn’t supported in AVFoundation?
Is Apple’s recommended approach still “provide HLS for Apple clients” (potentially sharing CMAF segments and generating separate manifests)?
If there’s no public rationale, is filing Feedback Assistant the best channel for requesting MPD playback support?
Thanks!