If you've been looking for a way to get AI-powered coding assistance without breaking the bank, OpenCode is worth checking out. It's an open-source CLI-first coding agent that can help you write, debug, and optimize code using large language models.
The best part? You can connect it to Ollama Cloud and get started for free. And if you want to go completely free, you can even run it locally with Ollama or LM Studio. Let's walk through how to set this up.
Setting Up with Ollama Cloud
1. Create Your Ollama Account
First, sign up for a free account at Ollama Cloud. The free tier gives you enough credits to get started and test things out.
2. Get Your API Key
Once you're logged in, navigate to Settings → Keys → Add API key.
3. Configure OpenCode
- Open your terminal and run:

  opencode

- Type `/models` and hit enter.
- Then press `ctrl + a` to connect providers. Search for `Ollama Cloud` and select it.
- Enter your API key when prompted and submit.
4. Test Your Setup
- Type `/models` again to see if Ollama Cloud is listed as your active provider.
- Select a model from Ollama Cloud (e.g., `kimi-k2.5`).
- Try a simple prompt like "Write a hello world function in Python" to verify everything is working.

If everything is configured correctly, OpenCode should generate the code for you.
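If you'd like to sanity-check your API key outside of OpenCode, here is a minimal Python sketch. It assumes Ollama Cloud exposes the same `/api/chat` endpoint shape as a local Ollama server, reachable at `https://ollama.com` with a Bearer token; the base URL, endpoint, and model name are assumptions to verify against your account's documentation.

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, prompt):
    """Build an HTTP request for Ollama's /api/chat endpoint.

    Assumes Ollama Cloud accepts the same /api/chat payload as a
    local Ollama server, authenticated with a Bearer token.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one JSON reply instead of a token stream
    }
    return urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("https://ollama.com", "YOUR_API_KEY",
                         "kimi-k2.5", "Write a hello world function in Python")
# urllib.request.urlopen(req) would send it once your real key is in place.
```

If the key is valid, the response body should contain the model's reply; a 401 means the key was rejected.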
Going Completely Free: Local Setup
Want to avoid API costs altogether? You can run everything locally.
Option 1: Local Ollama
- Install Ollama on your machine from ollama.com.
- Pull a model you want to use:

  ollama pull qwen3-coder

- Launch OpenCode with the following command:

  ollama launch opencode --model qwen3-coder
That's it! Now OpenCode will use your local Ollama instance.
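Before pointing OpenCode at a local instance, it can help to confirm the Ollama server is actually listening. A small sketch, assuming Ollama's default local port of 11434:

```python
import urllib.request
import urllib.error

def ollama_is_running(base_url="http://localhost:11434", timeout=2.0):
    """Return True if a local Ollama server answers on base_url.

    Ollama's root endpoint replies with a short status message when
    the server is up; any connection error means it isn't reachable.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns False, start the server (for example with `ollama serve`) before launching OpenCode.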
Option 2: LM Studio
LM Studio is another great option that gives you a nice GUI to manage models.
- Download and install LM Studio.
- From LM Studio, download a model you like.
- Load the model in LM Studio and start the API server.
- Launch OpenCode:

  opencode

- Connect to LM Studio by typing `/models`, pressing `ctrl + a`, and selecting `LM Studio`. Enter any random string for the API key and submit.
- Verify by selecting the model you loaded in LM Studio and testing a prompt.
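As a sketch of what OpenCode is talking to behind the scenes: LM Studio's local server speaks the OpenAI-compatible chat completions API, which is why any placeholder API key works. The default port 1234 is an assumption based on LM Studio's standard configuration, and the model name is a placeholder.

```python
import json
import urllib.request

def build_lmstudio_request(model, prompt,
                           base_url="http://localhost:1234/v1"):
    """Build a request for LM Studio's OpenAI-compatible local server.

    The server exposes an OpenAI-style /chat/completions endpoint and
    does not validate the API key, so any placeholder token works.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": "Bearer not-checked",  # placeholder, ignored
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending this with `urllib.request.urlopen` while the LM Studio server is running should return a standard OpenAI-style completion object.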
Quick Tips
- Model Choice: For coding tasks, models like `qwen2.5-coder`, `deepseek-coder`, or `codellama` tend to perform better than general-purpose models.
- Context Window: Larger context windows are better for complex projects. Check the model's specs before loading it; in LM Studio, the context length is configured when you load the model.
- Local vs Cloud: A local setup is free but uses your hardware. A cloud setup is easier to manage and scales better, but may incur costs.
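To put the context-window tip in concrete terms, here is a rough sketch of ballpark token budgeting. It uses the common approximation of about 4 characters per token; real tokenizers vary by model, so treat this as an estimate only.

```python
def rough_token_estimate(text):
    """Very rough token count: ~4 characters per token on average for
    English text and code. Real tokenizers vary, so this is only a
    ballpark figure for context-window planning."""
    return max(1, len(text) // 4)

def fits_in_context(prompt_text, context_window, reply_budget=1024):
    """Check whether a prompt leaves room for reply_budget response
    tokens inside the model's context window."""
    return rough_token_estimate(prompt_text) + reply_budget <= context_window

# A ~32,000-character source file against an 8k-token window:
# 8000 estimated prompt tokens + 1024 reply tokens > 8192, so it won't fit.
```

If your project files regularly blow past the window, pick a model (or LM Studio load configuration) with a larger context length.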
Troubleshooting
If you run into connection issues:
- Make sure Ollama or LM Studio is actually running for local setups
- Check that your API key is valid for cloud setups
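The local-setup check above can be sketched as a quick diagnostic that probes the default local ports (11434 for Ollama, 1234 for LM Studio; adjust these if you changed them):

```python
import socket

# Default local API ports: Ollama listens on 11434, LM Studio on 1234.
BACKENDS = {"Ollama": ("127.0.0.1", 11434), "LM Studio": ("127.0.0.1", 1234)}

def reachable(host, port, timeout=1.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, (host, port) in BACKENDS.items():
    state = "up" if reachable(host, port) else "not reachable"
    print(f"{name} ({host}:{port}): {state}")
```

A "not reachable" result means the server isn't running (or is bound to a non-default port), which is the most common cause of connection errors in OpenCode.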
That's all there is to it! With OpenCode connected to Ollama, you now have a powerful coding assistant at your fingertips. Whether you choose the cloud option for convenience or the local option for privacy and cost savings, you're all set to start building with AI assistance.
Give it a try and see how it speeds up your development workflow!