Things I did:
created an Intents Extension target
added "Supported Intents" to both my main app target and the intent extension, with "INAddTasksIntent" and "INCreateNoteIntent"
created the AppIntentVocabulary.plist in my main app target
created the intent handlers in the Intents Extension target:
class AddTaskIntentHandler: INExtension, INAddTasksIntentHandling {
    func resolveTaskTitles(for intent: INAddTasksIntent) async -> [INSpeakableStringResolutionResult] {
        if let taskTitles = intent.taskTitles {
            return taskTitles.map { INSpeakableStringResolutionResult.success(with: $0) }
        } else {
            return [INSpeakableStringResolutionResult.needsValue()]
        }
    }

    func handle(intent: INAddTasksIntent) async -> INAddTasksIntentResponse {
        // my code to handle this... (produces `tasksCreated`)
        let response = INAddTasksIntentResponse(code: .success, userActivity: nil)
        response.addedTasks = tasksCreated.map {
            INTask(
                title: INSpeakableString(spokenPhrase: $0.name),
                status: .notCompleted,
                taskType: .completable,
                spatialEventTrigger: nil,
                temporalEventTrigger: intent.temporalEventTrigger,
                createdDateComponents: DateHelper.localCalendar().dateComponents([.year, .month, .day, .minute, .hour], from: Date.now),
                modifiedDateComponents: nil,
                identifier: $0.id
            )
        }
        return response
    }
}
class AddItemIntentHandler: INExtension, INCreateNoteIntentHandling {
    func resolveTitle(for intent: INCreateNoteIntent) async -> INSpeakableStringResolutionResult {
        if let title = intent.title {
            return INSpeakableStringResolutionResult.success(with: title)
        } else {
            return INSpeakableStringResolutionResult.needsValue()
        }
    }

    func resolveGroupName(for intent: INCreateNoteIntent) async -> INSpeakableStringResolutionResult {
        if let groupName = intent.groupName {
            return INSpeakableStringResolutionResult.success(with: groupName)
        } else {
            return INSpeakableStringResolutionResult.needsValue()
        }
    }

    func handle(intent: INCreateNoteIntent) async -> INCreateNoteIntentResponse {
        do {
            // my code for handling this... (throws; produces `itemName`, `itemNote`, `list`, `newItem`)
            let response = INCreateNoteIntentResponse(code: .success, userActivity: nil)
            response.createdNote = INNote(
                title: INSpeakableString(spokenPhrase: itemName),
                contents: itemNote.map { [INTextNoteContent(text: $0)] } ?? [],
                groupName: INSpeakableString(spokenPhrase: list.name),
                createdDateComponents: DateHelper.localCalendar().dateComponents([.day, .month, .year, .hour, .minute], from: Date.now),
                modifiedDateComponents: nil,
                identifier: newItem.id
            )
            return response
        } catch {
            return INCreateNoteIntentResponse(code: .failure, userActivity: nil)
        }
    }
}
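Note that an Intents Extension has a single principal class, so with two handler classes something has to route each incoming intent to the right one. A minimal sketch of that routing, assuming one INExtension subclass is set as the extension's principal class (in which case the two handlers above wouldn't each need to subclass INExtension themselves):

class IntentHandler: INExtension {
    override func handler(for intent: INIntent) -> Any? {
        // Route each supported intent to its dedicated handler.
        switch intent {
        case is INAddTasksIntent:
            return AddTaskIntentHandler()
        case is INCreateNoteIntent:
            return AddItemIntentHandler()
        default:
            return self
        }
    }
}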
uninstalled my app
restarted my physical device and simulator
Yet, when I say "Remind me to buy dog food in Index" (Index is the name of my app), as shown in the examples for INAddTasksIntent, Siri says that a list named "Index" doesn't exist in Apple's Reminders app, instead of routing the request to my app.
Am I missing something?
SiriKit: Handle requests for your app's services from users using Siri or Maps. Posts under the SiriKit tag (30 posts):
We're in the process of migrating our app's custom intents from the older SiriKit Custom Intents framework to App Intents. The migration has been straightforward for our app-specific actions, and we appreciate the improved discoverability and Apple Intelligence integration that App Intents provides.
However, we also implement SiriKit domain intents for calling and messaging:
INStartCallIntent / INStartCallIntentHandling
INSendMessageIntent / INSendMessageIntentHandling
These require us to maintain an Intents Extension to handle contact resolution and the actual call/message operations.
Our questions:
Is there a planned App Intents equivalent for these SiriKit domains (calling, messaging), or is the Intents Extension approach still the recommended path?
If we want to support phrases like "Call [contact] on [AppName]" or "Send a message to [contact] on [AppName]" with Apple Intelligence integration, is there any way to achieve this with App Intents today?
Are there any WWDC sessions or documentation we may have missed that addresses the migration path for SiriKit domain intents?
What we've reviewed:
"Migrate custom intents to App Intents" Tech Talk
"Bring your app's core features to users with App Intents" (WWDC24)
App Intents documentation
These resources clearly explain custom intent migration but don't seem to address the system domain intents.
Our current understanding:
Based on our research, it appears SiriKit domain intents should remain on the older framework, while custom intents should migrate to App Intents. We'd like to confirm this is correct and understand if there's a future direction we should be planning for.
Thank you!
I'm implementing App Intents for my tasks app, which supports recurrence rules for tasks. I see that when creating a to-do in Reminders via Siri, it allows setting a recurrence rule via natural language.
Is there a built-in way to receive that recurrence rule as a @Parameter in my AppIntent? If not, is it possible to receive the full user-dictated text in the AppIntent's perform() method so that I can use some ML model to convert the text to an EKRecurrenceRule or similar?
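As far as I know there is no built-in recurrence-rule parameter type and no supported way to receive the raw dictated text, so a common fallback is a free-form String parameter you parse yourself. A minimal sketch of that shape — the parseRecurrence helper is hypothetical and stands in for whatever model or grammar does the conversion:

import AppIntents
import EventKit

struct AddTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Task"

    @Parameter(title: "Title")
    var taskTitle: String

    // Assumption: Siri fills this with the spoken repeat phrase, e.g. "every 2 weeks".
    @Parameter(title: "Repeats")
    var repeatRule: String?

    func perform() async throws -> some IntentResult {
        let rule = repeatRule.flatMap(parseRecurrence)
        // ... create the task, attaching `rule` if one was recognized ...
        _ = rule
        return .result()
    }
}

// Hypothetical text-to-rule converter; a real one might call an ML model.
func parseRecurrence(_ text: String) -> EKRecurrenceRule? {
    if text.lowercased().contains("day") {
        return EKRecurrenceRule(recurrenceWith: .daily, interval: 1, end: nil)
    }
    return nil
}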
I am implementing an AppIntent in my application as follows:
// MARK: - SceneDelegate
class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?
    private var observer: NSObjectProtocol?

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession, options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = (scene as? UIWindowScene) else { return }
        // Set up the window
        window = UIWindow(windowScene: windowScene)
        let viewController = ViewController()
        window?.rootViewController = viewController
        window?.makeKeyAndVisible()
        setupUserDefaultsObserver()
        checkShortcutLaunch()
    }

    private func setupUserDefaultsObserver() {
        // Use NotificationCenter to receive notifications.
        // Keep the returned token so it can be removed in sceneDidDisconnect.
        observer = NotificationCenter.default.addObserver(
            forName: NSNotification.Name("ShortcutTriggered"),
            object: nil,
            queue: .main
        ) { notification in
            if let userInfo = notification.userInfo,
               let appName = userInfo["appName"] as? String {
                print("📱 Notification received - app is launched: \(appName)")
            }
        }
    }

    private func checkShortcutLaunch() {
        if let appName = UserDefaults.standard.string(forKey: "shortcutAppName") {
            print("🚀 App is opened from a Shortcut with the app name: \(appName)")
        }
    }

    func sceneDidDisconnect(_ scene: UIScene) {
        if let observer = observer {
            NotificationCenter.default.removeObserver(observer)
        }
    }
}
// MARK: - App Intent
struct StartAppIntent: AppIntent {
    static var title: LocalizedStringResource = "Start App"
    static var description = IntentDescription("Launch the application with the command")
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        UserDefaults.standard.set("appName", forKey: "shortcutAppName")
        UserDefaults.standard.set(Date(), forKey: "shortcutTimestamp")
        return .result()
    }
}
// MARK: - App Shortcuts Provider
struct StartAppShortcuts: AppShortcutsProvider { // renamed: the type can't share the AppShortcutsProvider protocol's name
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartAppIntent(),
            phrases: [
                "let start \(.applicationName)",
            ],
            shortTitle: "Start App",
            systemImageName: "play.circle.fill"
        )
    }
}
The app works fine when started from the Shortcuts app, but when started via Siri the log is not printed. I tried adding code that shows a dialog when the notification is received, and the dialog still shows, so the problem seems to be specifically that the log isn't printed when launching via Siri.
If I sleep 0.5s in the perform function, the log is printed normally:
try? await Task.sleep(nanoseconds: 500_000_000) // 0.5 seconds
I have read in other threads that when run from Siri, an intent runs completely separately and only returns its result to Siri, never entering the main app. But when openAppWhenRun is set to true, it must enter the main app, right? Is there any way to find the cause and fix this properly?
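If openAppWhenRun really does run perform() in the app process, one way to make the hand-off visible is to post the "ShortcutTriggered" notification the SceneDelegate above already observes, rather than relying only on UserDefaults written before launch. A sketch under that assumption:

@MainActor
func perform() async throws -> some IntentResult {
    UserDefaults.standard.set("appName", forKey: "shortcutAppName")
    UserDefaults.standard.set(Date(), forKey: "shortcutTimestamp")
    // Assumption: with openAppWhenRun == true this executes in the app process,
    // so the observer registered in setupUserDefaultsObserver() receives it.
    NotificationCenter.default.post(
        name: NSNotification.Name("ShortcutTriggered"),
        object: nil,
        userInfo: ["appName": "appName"]
    )
    return .result()
}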
Topic: App & System Services · SubTopic: Automation & Scripting · Tags: SiriKit, Intents, App Intents, OSLog
Hi,
I am developing a music app. We have been using the Siri media-search functionality for a while, and we recently hit a case where Siri does not provide a keyword for a search.
When the user says "Play kid songs" (in Turkish, "çocuk şarkıları çal"), I can see in the debugger that mediaSearch.mediaName is nil.
When the user says "Play kids" (in Turkish, "çocuklar çal"), a keyword is provided and we can search for and play a related song.
Normally I would suspect that Siri is somehow censoring the word "kid", but when I try the same voice search with Spotify, I get children's-song search results.
I've read the documentation and searched the web but couldn't find any similar experience.
What could be the cause? Is there an extra setting for this kind of behaviour, or a different capability that lets Spotify get a keyword out of this voice search while we don't?
Topic: App & System Services · SubTopic: Automation & Scripting · Tags: Siri and Voice, SiriKit, App Intents
Hello,
I’m developing a third-party VoIP app called Heyno and trying to support Siri-initiated calls so they behave like WhatsApp / FaceTime, especially from the lock screen.
Target behavior
From the locked device, the user says:
“Hey Siri, call <contact> using Heyno”
Expected result:
• System CallKit audio-call UI appears.
• No “continue in ” sheet, no forced unlock or foregrounding.
• Our app handles the VoIP leg in the background via CXProviderDelegate.
WhatsApp already does this with:
“Hey Siri, call <contact> on WhatsApp”
I’m trying to reproduce that behavior for Heyno using public APIs.
I have followed the SiriKit + CallKit VoIP docs but cannot get a clean Siri → CallKit → app flow from the lock screen without either:
Being forced into .continueInApp (unlock + foreground), or
Hitting CallKit transaction errors when starting the call from the app in response to the intent.
Current implementation
Intents extension (INStartCallIntentHandling)
• resolveContacts(for:with:) normalizes to E.164 and returns INPersonResolutionResult.success.
• resolveDestinationType → .success(.normal).
• resolveCallCapability → .success(.audioCall).
Confirm / handle currently:
func confirm(intent: INStartCallIntent,
             completion: @escaping (INStartCallIntentResponse) -> Void) {
    completion(INStartCallIntentResponse(code: .ready, userActivity: nil))
}

func handle(intent: INStartCallIntent,
            completion: @escaping (INStartCallIntentResponse) -> Void) {
    completion(INStartCallIntentResponse(code: .ready, userActivity: nil))
}
Earlier, I used .continueInApp with an NSUserActivity carrying the normalized number and metadata, but that always produced a “Continue in Heyno” sheet that requires unlock and foreground, which breaks the lock-screen Siri flow.
App target – CallKit provider
In the app I have CXProvider + CXProviderDelegate, which work correctly when calls are initiated from inside the app:
func provider(_ provider: CXProvider, perform action: CXStartCallAction) {
    let handle = action.handle.value
    // Start the VoIP / WebRTC / LiveKit / Asterisk call here
    provider.reportOutgoingCall(with: action.callUUID,
                                startedConnectingAt: Date())
    provider.reportOutgoingCall(with: action.callUUID,
                                connectedAt: Date())
    action.fulfill()
}
If I construct a CXStartCallAction and submit it via CXCallController.request(...) from the app, CallKit UI appears and our pipeline runs correctly.
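For reference, a condensed sketch of that in-app path (the controller property and phone number are illustrative):

let callController = CXCallController()

func startCall(to number: String) {
    let handle = CXHandle(type: .phoneNumber, value: number)
    let action = CXStartCallAction(call: UUID(), handle: handle)
    callController.request(CXTransaction(action: action)) { error in
        if let error = error {
            print("Start-call transaction failed: \(error)")
        }
    }
}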
What I tried and what fails
Starting CallKit from the Intents extension
Calling CXCallController.request(...) directly from handle(intent:completion:) in the extension always yields:
com.apple.CallKit.error.requesttransaction error 1 (unentitled)
The extension does not have the CallKit entitlement, and the docs say not to initiate calls from the extension, so this path seems unsupported.
Using .continueInApp + NSUserActivity
Pattern:
• handle(intent:) builds NSUserActivity (activityType = NSStringFromClass(INStartCallIntent.self), title = "Heyno Start Call", userInfo with E.164 handle, etc.).
• Returns INStartCallIntentResponse(code: .continueInApp, userActivity: activity).
• App receives the activity, then starts CallKit + VoIP.
Functionally this works, but iOS always requires unlock + foreground (“Continue in Heyno”), which is not acceptable for a Siri lock-screen call.
App group + Darwin notification (extension → app → CallKit)
Experiment:
• Extension writes the normalized number into an app-group UserDefaults.
• Extension posts a Darwin notification.
• App (if running) listens, reads the number, and initiates CXStartCallAction + VoIP.
Observed:
• Works only when the app is already running in the background; a killed app is not woken.
• In some states I see CXErrorCodeRequestTransactionError.invalidAction (error 6) if I try to issue a CXStartCallAction while CallKit is already doing something as part of the Siri flow.
• Siri sometimes replies “There was a problem with the app,” likely because CallKit rejects the transaction or sees duplicate/conflicting actions.
My understanding so far
• The Intents extension should resolve/confirm the intent but not start the call.
• The source of truth for starting a call should be:
Siri → CallKit → app’s CXProviderDelegate.provider(_:perform: CXStartCallAction)
• The app then starts the VoIP leg, reports started/connected, and fulfills.
Where I am stuck
What is not clear is how Siri is supposed to route an INStartCallIntent into CallKit for a third-party VoIP app on a locked device without using .continueInApp.
If my extension simply:
• resolves the contact,
• confirm → .ready,
• handle → .ready (no NSUserActivity, no CallKit),
I do not see a documented mechanism that causes:
“Hey Siri, call <contact> using Heyno”
on the lock screen to:
• Present a CallKit audio call bound to Heyno, and
• Deliver CXStartCallAction to my CXProviderDelegate while the app stays in the background.
Questions
For third-party VoIP apps today, is it recommended to implement INStartCallIntentHandling at all, or should we rely only on CallKit registration and Siri's built-in support for "Call <contact> with <AppName>" (no SiriKit extension)?
If an INStartCallIntentHandling extension is still the intended pattern:
• Should confirm/handle simply return .ready and never start CallKit or set NSUserActivity?
• In that case, is Siri expected to invoke CallKit on our behalf and create a CXStartCallAction targeting our provider, even when the device is locked and the app is not foreground?
Is there any supported way for a Siri-triggered third-party VoIP call to start from the lock screen via CallKit without:
• using .continueInApp (unlock + foreground), and
• starting CallKit directly from the Intents extension (unentitled)?
Is there any additional configuration, entitlement, provisioning profile flag, or Info.plist key required so that Siri can map “Call using Heyno” directly to our CallKit provider and background VoIP implementation?
Current options:
• .continueInApp + NSUserActivity → works, but always requires unlock + app UI.
• Start CallKit from the extension → fails with “unentitled” and appears unsupported.
• Extension → app-group + notification → app → CallKit → VoIP → fragile, with intermittent CXErrorCodeRequestTransactionError.invalidAction.
• Remove the extension and hope Siri/CallKit auto-routes to our provider → unclear if this is supported for third-party VoIP apps or reserved for privileged apps.
I would appreciate guidance on the intended architecture for this scenario, and whether the “Siri from lock screen → CallKit UI → background VoIP call” flow is achievable for an App Store VoIP app like Heyno using public APIs only.
My app uses App Intents. When the user says "Prüfung der Bluetooth Funktion" ("check the Bluetooth function"), the screen shows the whole phrase, but in my app I only receive "Bluetooth Funktion". This behaviour only happens in the German version; in the English version everything works well.
Can anyone help me? Why does the German version of Siri cut off my words?
When my AppShortcut phrase is:
"Go \(\.$direction) with \(.applicationName)"
then everything works correctly and the AppIntent receives the parameter. But when my phrase is:
"What is my game \(\.$direction) with \(.applicationName)"
an alert dialog pops up saying:
"Hey Siri, what is my game tomorrow with {app name}
Do you want me to use ChatGPT to answer that?"
The phrase is obviously heard correctly, and it's exactly what I specified in the AppShortcut. Why isn't it being sent to my AppIntent?
import Foundation
import AppIntents

@available(iOS 17.0, *)
enum Direction: String, CaseIterable, AppEnum {
    case today, yesterday, tomorrow, next

    static var typeDisplayRepresentation: TypeDisplayRepresentation {
        TypeDisplayRepresentation(name: "Direction")
    }

    static var caseDisplayRepresentations: [Direction: DisplayRepresentation] = [
        .today: DisplayRepresentation(title: "today", synonyms: []),
        .yesterday: DisplayRepresentation(title: "yesterday", synonyms: []),
        .tomorrow: DisplayRepresentation(title: "tomorrow", synonyms: []),
        .next: DisplayRepresentation(title: "next", synonyms: [])
    ]
}

@available(iOS 17.0, *)
struct MoveItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Move Item"

    @Parameter(title: "Direction")
    var direction: Direction

    func perform() async throws -> some IntentResult {
        // Logic to move the item in the specified direction
        print("Moving item \(direction)")
        return .result()
    }
}

@available(iOS 17.0, *)
final class MyShortcuts: AppShortcutsProvider {
    static let shortcutTileColor = ShortcutTileColor.navy

    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: MoveItemIntent(),
            phrases: [
                "Go \(\.$direction) with \(.applicationName)"
                // "What is my game \(\.$direction) with \(.applicationName)"
            ],
            shortTitle: "Test of direction parameter",
            systemImageName: "soccerball"
        )
    }
}
Topic: App & System Services · SubTopic: Automation & Scripting · Tags: Siri and Voice, SiriKit, App Intents
The name of our app is a portmanteau, and Siri is getting it slightly wrong. It's putting the emphasis on the second and fourth syllables instead of the first and third. Kind of like if someone started singing /tinˈeɪʒd muˌtent/. It's just wrong enough to be funny. It still recognizes the correct name when someone says "hey Siri".
We've already tried adding CFBundleSpokenName and INAlternativeAppNamePronunciationHint to Info.plist, but neither is changing how Siri says it. We can't put it in AppIntentVocabulary.plist because we don't know the key path for the app name.
Hello fellow developers,
I'm the founder of a FinTech startup, Cent Capital (https://cent.capital), where we are building an AI-powered financial co-pilot.
We're deeply exploring the Apple ecosystem to create a more proactive and ambient user experience. A core part of our vision is to use App Intents and the Shortcuts app to surface personalized financial insights without the user always needing to open our app. For example, suggesting a Shortcut like, "What's my spending in the 'Dining Out' category this month?" or having an App Intent proactively surface an insight like, "Your 'Subscriptions' budget is almost full."
My question for the community is about the architectural and user experience best practices for this.
How are you thinking about the balance between providing rich, actionable insights via Intents without being overly intrusive or "spammy" to the user?
What are the best practices for designing the data model that backs these App Intents for a complex domain like personal finance?
Are there specific performance or privacy considerations we should be aware of when surfacing potentially sensitive financial data through these system-level integrations?
We believe this is the future of FinTech apps on iOS and would love to hear how other developers are thinking about this challenge.
Thanks for your insights!
In my app, when invoking a Shortcut via Siri, the
application(_:continueUserActivity:restorationHandler:)
method in AppDelegate is called twice.
When I debug, both NSUserActivity objects are identical.
However, when I run the same Shortcut by tapping it manually, the method is only called once as expected.
Has anyone experienced this issue? How can I prevent Siri Shortcuts from delivering the same NSUserActivity twice?
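I haven't seen a documented cause for the double delivery. Until one turns up, a defensive sketch that drops a second, identical activity arriving within a short window (the one-second threshold is arbitrary):

// Inside AppDelegate:
private var lastActivityType: String?
private var lastActivityDate = Date.distantPast

func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
    let now = Date()
    // Ignore an identical delivery that follows almost immediately.
    if userActivity.activityType == lastActivityType,
       now.timeIntervalSince(lastActivityDate) < 1.0 {
        return true
    }
    lastActivityType = userActivity.activityType
    lastActivityDate = now
    // ... normal handling ...
    return true
}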
I'm a bit confused as to what we're supposed to be doing to support starting a workout using Siri in iOS/watchOS 26. On one hand, I see a big push to move towards App Intents and shortcuts rather than SiriKit. On the other hand, I see that some of the things I would expect to work with App Intents well... don't work. BUT - I'm also not sure it isn't just developer error on my part.
Here are some assertions that I'm hoping someone more skilled and knowledgable can correct me on:
Currently the StartWorkoutIntent only serves the Action button on the Watch Ultra. It cannot be used to register Shortcuts, nor does Siri respond to it.
I can use types conforming to AppIntent to create shortcuts, but this requires an additional permission prompt to run a shortcut when a user starts a workout with Siri.
AppIntent shortcuts require the user to say "Start a workout in <app name>" - if the user leaves out the "in <app name>" part, Siri will not prompt the user to select my app.
If I want to allow the user to simply say "Start a workout" and have Siri prompt the user for which app to use, I must currently use the older SiriKit to do so.
Are these assertions correct - or am I just implementing something incorrectly?
Using the latest Xcode 26 beta for what it is worth.
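For concreteness, the AppShortcut shape behind assertion 3 would look roughly like this (StartMyWorkoutIntent is a hypothetical stand-in); the phrases only match when the app name is spoken:

struct StartMyWorkoutIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout"

    func perform() async throws -> some IntentResult {
        // ... start the workout session ...
        return .result()
    }
}

struct WorkoutShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartMyWorkoutIntent(),
            phrases: [
                // Siri only matches these with the app name included.
                "Start a workout in \(.applicationName)",
                "Start a \(.applicationName) workout"
            ],
            shortTitle: "Start Workout",
            systemImageName: "figure.run"
        )
    }
}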
Topic: App & System Services · SubTopic: Health & Fitness · Tags: Siri and Voice, SiriKit, Intents, App Intents
Hi, experts,
I want to find the Siri response window by bundle ID and use it for checking or printing. Here is my example code:
XCUIDevice.shared.siriService.activate(voiceRecognitionText: "call mom")
let siriApp = XCUIApplication(bundleIdentifier: "***")
// Print out text from siriApp,
// expected print: "Sorry, I can't make a phone call with your iPhone."
What should I put in place of ***?
I tried "com.apple.SiriViewService", "com.apple.siri.velocity", and "com.apple.springboard", but none of them worked.
Any suggestions appreciated, thanks!
Hello,
I'm working on integrating SiriKit with my music app using INPlayMediaIntent. My app is live on TestFlight, and the Siri command is being recognized, but mediaItems is always empty in my intent handler.
Demo Project
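One thing worth noting: mediaItems is typically empty in handle(intent:) unless the handler's optional resolveMediaItems step supplies items resolved from the intent's mediaSearch. A minimal sketch of that step, with a placeholder catalog lookup:

func resolveMediaItems(for intent: INPlayMediaIntent,
                       with completion: @escaping ([INPlayMediaMediaItemResolutionResult]) -> Void) {
    let searchTerm = intent.mediaSearch?.mediaName ?? ""
    // Placeholder: look the term up in the app's own catalog.
    let item = INMediaItem(identifier: "song-1",
                           title: searchTerm,
                           type: .song,
                           artwork: nil)
    completion(INPlayMediaMediaItemResolutionResult.successes(with: [item]))
}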
Hi,
I'm developing an app which, just like the Clock app, uses multiple counters.
I want to speak Siri commands such as "Siri, count for one hour" - 'count' is the alternative app name.
My AppIntent has a parameter, and Siri understands if I say "Siri, count" and asks for the duration in a separate step. It runs fine, but I can't figure out how to run the command with the duration specified upfront, without any follow-up questions from Siri.
Clock App has this functionality, so it can be done.
    // title
    // perform()
    @Parameter(title: "Duration")
    var minutes: Measurement<UnitDuration>
}
I have a struct ShortcutsProvider: AppShortcutsProvider; its phrases accept only parameters of type AppEnum or AppEntity.
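Since phrases can't embed a Measurement parameter, one workaround sketch — assuming a fixed menu of durations is acceptable — is to expose the duration as an AppEnum and map it to minutes in perform():

import AppIntents

enum CountDuration: String, AppEnum {
    case fifteenMinutes, thirtyMinutes, oneHour

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Duration")
    static var caseDisplayRepresentations: [CountDuration: DisplayRepresentation] = [
        .fifteenMinutes: DisplayRepresentation(title: "15 minutes"),
        .thirtyMinutes: DisplayRepresentation(title: "30 minutes"),
        .oneHour: DisplayRepresentation(title: "one hour")
    ]

    var minutes: Double {
        switch self {
        case .fifteenMinutes: return 15
        case .thirtyMinutes: return 30
        case .oneHour: return 60
        }
    }
}

struct CountIntent: AppIntent {
    static var title: LocalizedStringResource = "Count"

    @Parameter(title: "Duration")
    var duration: CountDuration

    func perform() async throws -> some IntentResult {
        // ... start a counter running for duration.minutes ...
        return .result()
    }
}

struct CountShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: CountIntent(),
            phrases: [
                // "count" is an alternative app name, so this matches "count for one hour".
                "\(.applicationName) for \(\.$duration)"
            ],
            shortTitle: "Count",
            systemImageName: "timer"
        )
    }
}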
Topic: App & System Services · SubTopic: Automation & Scripting · Tags: Siri and Voice, SiriKit, App Intents
Hi,
we're having trouble implementing search through Siri voice commands.
We already did it successfully for audio playback using INPlayMediaIntentHandling.
For search, none of the available approaches works.
Both INSearchForMediaIntentHandling and ShowInAppSearchResultsIntent never open the app in the first place. We tried various commands, but e.g. "Search for …" sometimes opens the Apple Music app and sometimes shows a Google search widget. Our app is never taken into consideration for providing any results.
We implemented all steps mentioned in WWDC videos and documentation (e.g. https://developer.apple.com/documentation/appintents/making-in-app-search-actions-available-to-siri-and-apple-intelligence), but nothing seems to work.
We're mainly testing on iOS 18 currently. Any idea why this is not working?
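For reference, the basic conformance from that doc looks roughly like this (a condensed sketch; the print is a placeholder for the app's own search navigation):

import AppIntents

struct SearchInAppIntent: ShowInAppSearchResultsIntent {
    static var title: LocalizedStringResource = "Search"
    static let searchScopes: [StringSearchScope] = [.general]

    @Parameter(title: "Criteria")
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        let searchTerm = criteria.term
        // Placeholder: route the term into the app's search UI here.
        print("Searching for \(searchTerm)")
        return .result()
    }
}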
Topic: App & System Services · SubTopic: Automation & Scripting · Tags: Siri and Voice, SiriKit, Intents, App Intents
A user can use Siri to display a list of items from my app. When the user touches on an item to open the app - how do I pass that item to the main app so I know which item detail page to display? This is a SwiftUI app.
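One possible shape for this, sketched with hypothetical names: model the listed items as an AppEntity and give it an OpenIntent, whose perform() runs in the app and hands the selected item's identifier to navigation state the SwiftUI views observe:

import AppIntents
import SwiftUI

// Hypothetical navigation state observed by the SwiftUI views.
@MainActor
final class Navigator: ObservableObject {
    static let shared = Navigator()
    @Published var selectedItemID: String?
}

// Hypothetical entity representing one listed item.
struct ItemEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Item")
    static var defaultQuery = ItemQuery()

    var id: String
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct ItemQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [ItemEntity] {
        // Placeholder: look the items up in the app's store.
        identifiers.map { ItemEntity(id: $0, name: "Item \($0)") }
    }
    func suggestedEntities() async throws -> [ItemEntity] { [] }
}

// Opening an item routes its identifier to the detail page.
struct OpenItemIntent: OpenIntent {
    static var title: LocalizedStringResource = "Open Item"

    @Parameter(title: "Item")
    var target: ItemEntity

    @MainActor
    func perform() async throws -> some IntentResult {
        Navigator.shared.selectedItemID = target.id
        return .result()
    }
}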
Hi everyone,
I’m an AI engineer working on autonomous AI agents and exploring ways to integrate them into the Apple ecosystem, especially via Siri and Apple Intelligence.
I was impressed by Apple’s integration of ChatGPT and its privacy-first design, but I’m curious to know:
• Are there plans to support third-party LLMs?
• Could Siri or Apple Intelligence call external AI agents or allow extensions to plug in alternative models for reasoning, scheduling, or proactive suggestions?
I’m particularly interested in building event-driven, voice-triggered workflows where Apple Intelligence could act as a front-end for more complex autonomous systems (possibly local or cloud-based).
This kind of extensibility would open up incredible opportunities for personalized, privacy-friendly use cases — while aligning with Apple’s system architecture.
Is anything like this on the roadmap? Or is there a suggested way to prototype such integrations today?
Thanks in advance for any thoughts or pointers!
Topic: Machine Learning & AI · SubTopic: Apple Intelligence · Tags: SiriKit, Machine Learning, Apple Intelligence
I have a question about the app lifecycle when my app is launched via a Shortcut. I'm adding a INIntent to a Mac app. So my app delegate implements:
- (nullable id)application:(NSApplication *)application handlerForIntent:(INIntent *)intent
Then my custom intent handler implements the two protocol methods -confirmIntentNameHere:completion: and -handleIntentNameHere:completion:
During my testing, -applicationDidFinishLaunching: is called before the intent methods, so I can forward them to my main window controller to perform the shortcut actions, since it's already set up.
...But if this is not always the case, I can still perform them; I'd just have to move the code out of the window controller so the action can run "headless" when the intent arrives before my app has built its UI. Just wondering if this is something I should be prepared for.
Thanks in advance.
I was going to add a shortcut to an app via INIntent. I followed the WWDC video: developer.apple.com/videos/play/wwdc2021/10232/?time=986
Steps:
Created a .intentdefinition file and created an intent.
Added the intent to the .intentdefinition file and compiled the app.
Imported the generated header for the custom intent (MyIntentname.h) in the AppDelegate.
Had the AppDelegate conform to the protocol created in the generated code.
Implemented -application:handlerForIntent: to return self (the app delegate).
Ran the app.
Opened the Shortcuts app and searched for the 'shortcut' (according to the WWDC video linked above, it should show up in the actions list).
It doesn't show up in the list.
I tried moving the built application out of Debug to my Applications folder to see if that would help the Shortcuts app find it, but it didn't.
Am I missing a step / doing something wrong?
Topic: App & System Services · SubTopic: Automation & Scripting · Tags: Shortcuts, SiriKit, Intents, App Intents