You still need to connect to Anthropic and obtain an authorization token.
The isolation here refers to the workspace. Since the CLI runs in a container, the process can only access what you have mapped inside. This helps you avoid incidents like this one: https://hackaday.com/2025/07/23/vibe-coding-goes-wrong-as-ai...
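As a rough sketch, that workspace isolation boils down to a bind mount like the following (the image name and paths here are illustrative, not the project's actual ones):

```shell
# Run the CLI in a container and bind-mount only the project directory.
# The process inside can read/write /workspace but nothing else on the host.
docker run --rm -it \
  -v "$PWD/myproject:/workspace" \
  -w /workspace \
  example/claude-cli-image
```

Anything outside the mounted directory simply doesn't exist from the container's point of view, which is what limits the blast radius of a misbehaving agent.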
Ok. Thanks for the clarification. Still a good project, and many people like to use online services.
I prefer local models. Everything I run on the local model could just as well go through an online service, no secrets here. The speed is more than acceptable for a low-end CPU+GPU.
I still use Perplexity sometimes for more complex questions.
But if it is "a completely isolated environment", why does it need to log in and get a token? That defeats the isolation.
This should work like any other model, the way we do it with Ollama: download a model and it runs strictly locally, with no network connections or tokens.
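For comparison, the fully local flow being described looks like this with Ollama (the model name is just an example):

```shell
# One-time download of the model weights.
ollama pull llama3

# After the pull, inference runs entirely on the local machine:
# no account, no token, no outbound network traffic required.
ollama run llama3 "Summarize what container isolation means."
```

That is the contrast being drawn: the container isolates the *filesystem*, but the model itself still lives behind an authenticated remote API, whereas Ollama keeps both the data and the inference local.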