@zorn I use OpenAI’s API. For a little while I tried running models on my own server, but GPU server costs are high unless you’re processing a lot of data. Eventually I want to embed a model in the Mac app, once everyone has more RAM.