My app uses App Intents. When the user says "Prüfung der Bluetooth Funktion" ("check the Bluetooth function"), the transcription on screen shows the whole phrase, but my app only receives "Bluetooth Funktion". This behavior only happens in the German version; in the English version everything works as expected.
Can anyone help me? Why does German Siri cut off part of my phrase?
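For reference, the setup in question looks roughly like the sketch below. The intent and parameter names are hypothetical, not from the original post; only the German phrase is from the report:

```swift
import AppIntents

// Hypothetical intent for illustration: a spoken phrase such as
// "Prüfung der Bluetooth Funktion" should populate `functionName`
// with the full captured text.
struct CheckFunctionIntent: AppIntent {
    static var title: LocalizedStringResource = "Prüfung"

    @Parameter(title: "Funktion")
    var functionName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The reported bug: with German Siri, `functionName` arrives
        // as "Bluetooth Funktion" instead of the full utterance.
        return .result(dialog: "Prüfung: \(functionName)")
    }
}
```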
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
Hi, everyone. I'm writing about Apple Intelligence, and I've reached the point where I have to explain App Intent Domains:
https://developer.apple.com/documentation/AppIntents/app-intent-domains
However, I noticed a note explaining that these services are not available with Siri. I tried the example provided by Apple at
https://developer.apple.com/documentation/AppIntents/making-your-app-s-functionality-available-to-siri
and I can only get the intents to work from the Shortcuts app, not from Siri.
Is this correct? Are App Intent Domains still not available with Siri?
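For context, adopting a domain schema looks roughly like the sketch below, modeled on Apple's photos-domain sample. The schema path and type names are assumptions from that sample, not verified against the current SDK:

```swift
import AppIntents

// Sketch of an intent adopting the photos search schema from the
// App Intent Domains sample (identifiers assumed from that sample;
// the @AssistantIntent macro supplies the intent conformance).
@AssistantIntent(schema: .photos.search)
struct SearchAssetsIntent {
    static let searchScopes: [StringSearchScope] = [.general]
    var criteria: StringSearchCriteria

    func perform() async throws -> some IntentResult {
        // Run the in-app search here. As described above, today this
        // can be invoked from the Shortcuts app but not via Siri.
        return .result()
    }
}
```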
Thanks
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
When the system language and the Siri language don't match (for example, the system set to English and Siri set to Chinese), Apple Intelligence may not work.
We are developing Apple Intelligence features for foreign markets and adapting them for iPhone 17 models and above. Are there any other reasons why Apple Intelligence could remain unavailable within the app, even after it has been enabled?
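One way to narrow this down at runtime is to ask the system why the on-device model is unavailable. A sketch using the Foundation Models framework (iOS 26 and later); the specific unavailability reasons listed are assumptions from the framework documentation:

```swift
import FoundationModels

// Inspect on-device model availability and report the reason when
// the model cannot be used (case names assumed; treat as a sketch).
func checkAppleIntelligenceAvailability() {
    switch SystemLanguageModel.default.availability {
    case .available:
        print("Apple Intelligence model is ready.")
    case .unavailable(.deviceNotEligible):
        print("This device does not support Apple Intelligence.")
    case .unavailable(.appleIntelligenceNotEnabled):
        print("Apple Intelligence is not enabled in Settings.")
    case .unavailable(.modelNotReady):
        print("Model assets are still downloading; try again later.")
    case .unavailable(let reason):
        print("Unavailable for another reason: \(reason)")
    }
}
```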
After running a performance test on my Core ML Qwen3 vision model, I appreciated the update where results were viewable. On Mac it mentions iOS 18, and I'm not sure if or how to change that.
That bottleneck led to rebuilding the Core ML view. I woke up and realized I had all the pieces together, and I ended up with a working Swift package demo of Clawbot.
The current issue is that I'm trying to use a 3B GGUF model to code it. I have become well aware that everything I create using the big models soon becomes the default themes/layouts for everyone else simply asking for this or that (I apologize).
So here I am asking (while looking to schedule a meeting with a developer) whether it's possible to speak with anyone about the thousands of Apple Intelligence, PCC, Xcode, and vision reports and feedback I've sent, in terms of general ways I can work more efficiently without the crashes.
I've already built a TUI for MLX, but the tools for Core ML, while they seem promising, are not intuitive; the vision format instruction was nice to see, though.
Anyway, my question is:
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence