If Apple and Google wanted to give an LLM such as Siri access to the OS, they'd make it easy.
It's the trivial part of all this. You're getting too much into the details of what is available to developers today. Instead, you should focus on what Apple and Google would do internally to make things as easy for an LLM as possible.
Up until now, the hardest part has always been understanding what exactly users want. GPT-4 is a thousand times better at this than what Siri and Google Assistant are currently doing.
Again, mapping OS APIs is the easiest part. By far.
> If Apple and Google wanted to give an LLM such as Siri access to the OS, they'd make it easy.
You keep skipping the question of how.
> It's the trivial part of all this. You're getting too much into the details of what is available to developers today.
Somehow you have this mystical, magical idea that it's just some small, insignificant little thing.
Literally this:
LLMs understand human requests
* magic *
Things happen in the OS
You, yes you, already have basically the same access as the OS developers do. You have access to tens of thousands of OS APIs and to the LLMs.
And yet we haven't seen a single implementation that does what you want.
> Again, mapping OS APIs is the easiest part. By far.
If it is, then it would be trivially easy to do it for even a small subset of those APIs, wouldn't it? Can you show me how you would do it?
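To make the dispute concrete: the "mapping" step everyone seems to agree is mechanical looks roughly like this. A sketch in Python, with invented tool names and stub functions standing in for real OS calls (real systems would use a vendor's function-calling API to get the structured output, and real OS entry points on the other end):

```python
import json

def set_alarm(hour: int, minute: int) -> str:
    # Stand-in for a real OS call (e.g. an alarm-clock intent on Android).
    return f"alarm set for {hour:02d}:{minute:02d}"

def send_message(to: str, body: str) -> str:
    # Stand-in for a messaging API the OS would expose.
    return f"message to {to}: {body}"

# This dispatch table IS the "mapping": tool name -> OS function.
# The open question in the thread is who writes and maintains the
# thousands of entries it would need, not whether the table can exist.
TOOLS = {"set_alarm": set_alarm, "send_message": send_message}

def dispatch(llm_output: str) -> str:
    """Route a structured tool call emitted by the LLM to the matching API."""
    call = json.loads(llm_output)
    fn = TOOLS[call["name"]]        # KeyError: model named a tool that doesn't exist
    return fn(**call["arguments"])  # TypeError: model passed bad/missing arguments

# What the model might emit for "wake me at 7:30":
print(dispatch('{"name": "set_alarm", "arguments": {"hour": 7, "minute": 30}}'))
```

The glue itself is indeed short; the two commented failure modes (hallucinated tools, malformed arguments) plus permissions, confirmation UX, and coverage of the actual API surface are where the claimed triviality gets tested.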