Goose offers a viable alternative to Claude Code, with wide freedom in the choice of model provider. An interesting tool for avoiding technological dependence.
When it comes to AI for development, Anthropic is king. The San Francisco firm has managed to establish itself as number one in AI for coding with Claude Code, its ultra-optimized development agent used by everyone. Its two main problems? Cost and technological dependence. A Claude Max subscription, priced between 100 and 200 dollars per month, is necessary to use Claude Code for more than a few minutes a day. Likewise, many developers tend to stay confined to the Anthropic ecosystem… at the risk of never leaving it. This is the lock-in strategy.
Faced with this double problem, several alternatives to Claude Code are beginning to emerge in the open source ecosystem. One of the most recent, Goose, focuses on independence and natively supports a wide variety of providers. Even more interesting, models can be run locally, notably via Ollama.
A Desktop and CLI version
Goose was entirely developed as a side project (in January 2025) by Bradley Axen, principal engineer and tech lead of the AI & Data Platform at Block, the fintech launched by Jack Dorsey in the United States. The project is now developed by Block's teams, particularly for the company's internal needs. The tool was designed as a local, open source AI agent capable of automating complex tasks like data migrations or DevOps workflows, leveraging the MCP protocol for seamless integration of external tools. Two distinct versions were launched: Goose Desktop, a graphical version of the AI agent, and Goose CLI, its terminal equivalent.
Goose’s agent is based, like Claude Code, on a feedback loop. When a prompt is sent to the AI, Goose uses an LLM (the one chosen by default) and JSON tool calls to read and act on the machine. The agent then iterates until the task is fully completed. The entire feedback loop remains local; only the context data used by the LLM is sent to the cloud (to the model provider). The real advantage of Goose lies in its native compatibility with the main providers on the market.
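Schematically, such a tool-calling feedback loop can be sketched in a few lines of Python. This is a hedged, minimal illustration of the general pattern, not Goose's actual internals: the `fake_llm` function, tool names, and JSON shape are all invented for the example.

```python
import json

# Hypothetical local tools the agent can invoke (illustrative only).
def read_file(path):
    return f"contents of {path}"

def shell(cmd):
    return f"ran: {cmd}"

TOOLS = {"read_file": read_file, "shell": shell}

def fake_llm(history):
    """Stand-in for the remote model: emits a JSON tool call,
    then signals completion once it has seen a tool result."""
    if not any(m["role"] == "tool" for m in history):
        return json.dumps({"tool": "read_file", "args": {"path": "app.py"}})
    return json.dumps({"done": True, "answer": "task completed"})

def agent_loop(prompt, max_steps=10):
    history = [{"role": "user", "content": prompt}]
    for _ in range(max_steps):
        # Only the context (history) is sent to the provider.
        reply = json.loads(fake_llm(history))
        if reply.get("done"):
            return reply["answer"]
        # The tool call itself runs locally, on the user's machine.
        result = TOOLS[reply["tool"]](**reply["args"])
        history.append({"role": "tool", "content": result})

print(agent_loop("fix the bug in app.py"))  # → task completed
```

The key point the sketch captures: the model only ever sees text context, while file reads and shell commands execute locally and feed their results back into the next iteration.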
Claude Code vs Goose, the feature match
| | Goose | Claude Code |
|---|---|---|
| Subagents | x | x |
| Plan mode | x | x |
| Available in CLI | x | x |
| Native third-party models | x | |
| MCP | x | x |
| Skills | x | x |
| Instruction files | x | x |
| VS Code integration | | x |
| Plugins | x | x |
| Intelligent memory compaction | x | x |
Subagents, plan mode, CLI mode, MCP, instruction files… Claude Code and Goose share the vast majority of standard features for coding agents. The real difference is in the ecosystem. Claude Code has the advantage of a huge library of skills, turnkey capabilities that teach Claude specific tasks (PR review, Word document creation, React UI building, etc.). Likewise, Claude Code has a large library of third-party plugins, while Goose's is naturally smaller. The real strength of Claude Code lies in its very comfortable, complete ecosystem. Goose's rests on its LLM-agnostic positioning, an approach increasingly appreciated by IT departments, particularly in France.
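As an illustration of the instruction-file row above: Goose reads a `.goosehints` file at the project root (the rough equivalent of Claude Code's CLAUDE.md). The content below is a hypothetical example, not taken from a real project:

```text
# .goosehints — placed at the project root (example content)
This project is a Python 3.12 monorepo managed with uv.
Run the test suite with `pytest -q` before proposing any change.
Never modify files under vendor/.
```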
26 different providers supported by Goose
Goose natively supports 26 different model providers, while Claude Code, quite logically, natively supports only Anthropic's own models (without third-party workarounds). Anthropic, Amazon Bedrock, Azure OpenAI, ChatGPT Codex, Claude Code CLI, OpenAI Codex CLI, Cursor Agent, DeepSeek, Databricks, GCP Vertex AI, Gemini CLI, GitHub Copilot, Google Gemini, Groq, Inception, LiteLLM, Mistral AI, OpenAI, OpenRouter, OVHcloud, Amazon SageMaker, Snowflake, Tetrate Agent Router, Venice.ai, and xAI are notably supported by Goose. Users pay only for the tokens they actually consume from the different providers (pay as you go). Interestingly, it is possible to use Goose with your ChatGPT account without requiring an API key; OpenAI is one of the last providers to allow this practice.
With Claude Code, the economic model is simple: a fixed plan of 100 to 200 dollars per month per user. For a team with sporadic usage, Goose will mechanically be more economical; for intensive daily use, Claude Max can prove competitive. The balance point depends on actual usage: the break-even point for Claude Max at $100 is around 9 million monthly tokens with Sonnet 4.5, or 5.5 million tokens with Opus 4.5.
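The break-even arithmetic is straightforward. In the sketch below, the blended per-million-token prices are assumptions chosen to roughly match the figures above; they are not official Anthropic pricing, which varies with the input/output mix:

```python
def break_even_tokens(monthly_budget_usd, blended_price_per_million):
    """Monthly token volume at which pay-as-you-go API spend
    equals a flat monthly plan."""
    return monthly_budget_usd / blended_price_per_million * 1_000_000

# Assumed blended prices (USD per million tokens) — illustrative only.
SONNET_BLENDED = 11.0
OPUS_BLENDED = 18.0

print(round(break_even_tokens(100, SONNET_BLENDED) / 1e6, 1))  # → 9.1
print(round(break_even_tokens(100, OPUS_BLENDED) / 1e6, 1))    # → 5.6
```

Below those monthly volumes, paying per token through Goose costs less than the flat plan; above them, the subscription wins.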
To push the FinOps logic further, Goose can be used with the Ollama API. For users who can, this is the most economical way to run the AI agent: the open source model runs locally, and so does the agent. With increasingly capable local models, or with deployment on your own servers, Goose can maximize your teams' productivity at lower cost, without multiplying licenses. You also gain freedom of choice: if the next best coding model comes from Google, there is no need to migrate your agents; one click and the model is changed.
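Switching to a local model is indeed just a configuration change. A sketch of what that setup looks like, assuming an Ollama server on its default port; the configuration keys follow Goose's documented format, but the model name is an example to adapt to whatever you have pulled:

```yaml
# ~/.config/goose/config.yaml — local setup via Ollama (sketch)
GOOSE_PROVIDER: ollama
GOOSE_MODEL: qwen2.5        # example; use any model pulled with `ollama pull`
OLLAMA_HOST: localhost:11434
```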
Who is Goose or Claude Code for?
The choice between Goose and Claude Code depends on three key factors: your usage and cost model, your sovereignty constraints, and your organizational maturity. Goose is primarily aimed at small developer teams with irregular use of the agent: the pay-as-you-go model quickly becomes more economical than a fixed plan of $100 to $200 per developer per month. The tool also meets the needs of regulated sectors (banking, healthcare, defense) where local data processing via Ollama or self-hosted LLMs is required. Finally, Goose avoids proprietary lock-in: if tomorrow the best coding model comes from Google or DeepSeek rather than Anthropic, a simple configuration change is enough, without calling your entire infrastructure into question.
In large organizations with hundreds of developers, Claude Code can be justified by the desire to standardize practices across the board and avoid the multiplication of tools. For intensive daily use, the fixed plan also becomes competitive in the face of high token consumption. The turnkey ecosystem, with its extensive library of skills and plugins, offers immediate productivity that is difficult to match.
So, should we prioritize raw performance or technological independence? At a time when digital sovereignty is becoming a decision-making criterion for French IT departments, Goose offers a serious alternative that deserves attention.