@stroughtonsmith OpenAI's API is excellent. Of course, it costs money! I'm puzzled that there are no new Apple APIs for running data through an on-device model. Hopefully that's coming when Apple Intelligence rolls out. You could also embed a small flavor of Llama, but then you incur the storage cost and miss out on Apple's optimizations.
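For reference, here's a minimal sketch of what calling OpenAI's chat completions endpoint looks like from Swift today (the model name, prompt, and key handling are placeholder assumptions, not anything from the thread):

```swift
import Foundation

// Request/response shapes for OpenAI's /v1/chat/completions endpoint.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

struct ChatResponse: Codable {
    struct Choice: Codable {
        let message: ChatMessage
    }
    let choices: [Choice]
}

// Sends a single user prompt to OpenAI and returns the assistant's reply.
// "gpt-4o-mini" is just an example model; swap in whatever you're using.
func askOpenAI(prompt: String, apiKey: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(model: "gpt-4o-mini",
                    messages: [ChatMessage(role: "user", content: prompt)])
    )

    let (data, _) = try await URLSession.shared.data(for: request)
    let response = try JSONDecoder().decode(ChatResponse.self, from: data)
    return response.choices.first?.message.content ?? ""
}
```

That's the kind of round trip (network latency, per-token billing, key management) you'd hope a first-party on-device API could replace.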