OpenCode: The Open Alternative to Claude Code


Why OpenCode? 🤔

Previously, we covered Claude Code configuration with specialized agents and MCP servers. While Claude Code is powerful, it has a critical limitation: vendor lock-in.

Claude Code's closed strategy makes it difficult to use other LLM providers. If Anthropic discontinues the service or changes its pricing, your entire AI workflow breaks. This single point of failure is unacceptable for production environments.

OpenCode solves this problem. It's an open-source AI coding CLI that supports 75+ LLM providers while maintaining the same powerful features: agents, custom commands, and MCP server integration. You get the familiar workflow without the vendor dependency.

We've successfully integrated KIMI-K2, Qwen-Code, GLM-4.6, and DeepSeek-Chat into our workflow. This guide shows you how to do the same.

Installation 🚀

Install OpenCode via the install script:

curl -fsSL https://opencode.ai/install | bash

Or using package managers:

# npm
npm install -g opencode-ai

# Homebrew (macOS/Linux)
brew install opencode

# Arch Linux
paru -S opencode-bin

Verify installation:

opencode --version

For detailed installation options, see the official documentation.

Configuration 📝

OpenCode uses opencode.json for configuration, similar to Claude Code's settings.json. Configuration files can be placed in:

  • Global: ~/.config/opencode/opencode.json
  • Per-project: ./opencode.json (in project root)
  • Custom path: Set OPENCODE_CONFIG environment variable
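
For example, to point OpenCode at a config file outside the default locations (the path below is illustrative):

export OPENCODE_CONFIG=/path/to/custom/opencode.json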

Basic Configuration

{
  "$schema": "https://opencode.ai/config.json",
  "model": "anthropic/claude-sonnet-4-5",
  "theme": "opencode",
  "autoupdate": true
}

Configuration files are merged using a deep-merge strategy; project-specific settings override global settings.
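
For example, a global config can set the default model and theme while a project config overrides only the model (a minimal sketch using values from the examples in this guide):

Global ~/.config/opencode/opencode.json:

{
  "model": "anthropic/claude-sonnet-4-5",
  "theme": "opencode"
}

Project ./opencode.json:

{
  "model": "deepseek/deepseek-chat"
}

After the deep merge, sessions in this project use deepseek/deepseek-chat while keeping the global theme.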

Multi-Provider Setup 🌐

The key advantage of OpenCode is provider flexibility. Here's how to configure the providers we use:

1. DeepSeek

DeepSeek offers cost-effective models with strong coding capabilities.

# Connect provider
opencode
/connect
# Select DeepSeek and enter API key

Configuration in opencode.json:

{
  "provider": {
    "deepseek": {
      "models": {
        "deepseek-chat": {}
      }
    }
  }
}

Get API keys from DeepSeek Platform.

2. KIMI-K2 (Moonshot AI)

KIMI-K2 provides a long context window and strong reasoning.

/connect
# Select Moonshot AI and enter API key

Configuration:

{
  "provider": {
    "moonshot": {
      "models": {
        "moonshot-v1-128k": {}
      }
    }
  }
}

Get API keys from Moonshot AI Console.

3. Qwen-Code

Qwen-Code models are available through multiple providers (Cerebras, Hugging Face, OpenCode Zen).

Via Cerebras (recommended for speed):

/connect
# Select Cerebras and enter API key

Configuration:

{
  "provider": {
    "cerebras": {
      "models": {
        "qwen3-coder": {}
      }
    }
  }
}

Get API keys from Cerebras Console.

4. GLM-4.6 (Z.AI)

GLM-4.6 offers strong multilingual and coding capabilities.

/connect
# Select Z.AI and enter API key

Configuration:

{
  "provider": {
    "zai": {
      "models": {
        "glm-4": {}
      }
    }
  }
}

Get API keys from Z.AI API Console.

Switching Between Models

Use the /models command in the TUI to switch between configured providers:

/models

Select from all configured models across different providers. You can also set different models per agent or command.
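
For example, a trimmed sketch that pins different models to an agent and a command (full agent and command definitions are covered in the sections below):

{
  "agent": {
    "review": { "model": "deepseek/deepseek-chat" }
  },
  "command": {
    "test": { "model": "moonshot/moonshot-v1-128k" }
  }
}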

Agents System 🤖

OpenCode supports the same agent concept as Claude Code with primary agents and subagents.

Built-in Agents

  • Build - Full development mode with all tools enabled (default)
  • Plan - Read-only mode for analysis without code changes
  • General - Subagent for complex research tasks
  • Explore - Subagent for fast codebase exploration

Switch between primary agents using the Tab key during a session.

Custom Agents

Create agents in .opencode/agent/ directory:

.opencode/agent/review.md:

---
description: Code review without modifications
mode: subagent
model: deepseek/deepseek-chat
permission:
  edit: deny
  bash: ask
---

You are a code reviewer. Focus on:
- Security vulnerabilities
- Performance issues
- Best practices
- Code maintainability

Provide constructive feedback without making changes.

Or configure in opencode.json:

{
  "agent": {
    "review": {
      "description": "Code review without modifications",
      "mode": "subagent",
      "model": "deepseek/deepseek-chat",
      "tools": {
        "write": false,
        "edit": false
      }
    }
  }
}

Invoke subagents with @ mention:

@review analyze this authentication code

For detailed agent configuration, see OpenCode Agents Documentation.

Custom Commands ⚡

Create reusable commands in .opencode/command/:

.opencode/command/test.md:

---
description: Run tests with coverage
agent: build
model: moonshot/moonshot-v1-128k
---

Run the full test suite with coverage report.
Analyze failures and suggest fixes.

Or in opencode.json:

{
  "command": {
    "test": {
      "template": "Run the full test suite with coverage report.\nAnalyze failures and suggest fixes.",
      "description": "Run tests with coverage",
      "agent": "build",
      "model": "moonshot/moonshot-v1-128k"
    }
  }
}

Use commands with /:

/test

Command Features

Arguments: Use $ARGUMENTS or $1, $2, etc.

Create a React component named $ARGUMENTS with TypeScript support.
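
If that template is saved as a hypothetical command named component, invoking it substitutes the argument into $ARGUMENTS:

/component UserCard
# Expands to: Create a React component named UserCard with TypeScript support.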

Shell output: Inject command output with !`command`

Recent commits:
!`git log --oneline -10`

Review these changes.

File references: Include files with @

Review @src/components/Button.tsx for performance issues.

See OpenCode Commands Documentation for more details.

MCP Server Integration 🔌

OpenCode supports the same MCP (Model Context Protocol) servers as Claude Code. You can reuse your existing MCP configuration.

Configuration Format

{
  "mcp": {
    "all-in-mcp": {
      "type": "local",
      "command": ["pipx", "run", "all-in-mcp"],
      "environment": {
        "APAPER": "true",
        "GITHUB_REPO_MCP": "true"
      }
    }
  }
}

For academic research with all-in-mcp (covered in our Claude Code configuration guide), the setup is identical:

# Install all-in-mcp
pipx install all-in-mcp

# Configure in opencode.json
# (same as shown above)

Tools become available automatically:

  • search-iacr-papers
  • search-google-scholar-papers
  • search-cryptobib-papers
  • download-iacr-paper
  • read-pdf
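
For example, a prompt like the following would typically exercise the search and PDF tools (which tools the model actually calls can vary):

Find recent IACR papers on zero-knowledge proofs and summarize the most relevant one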

Remote MCP Servers

OpenCode also supports remote MCP servers:

{
  "mcp": {
    "context7": {
      "type": "remote",
      "url": "https://mcp.context7.com/mcp"
    }
  }
}

See OpenCode MCP Documentation for more examples.

Migration from Claude Code 🔄

If you're migrating from Claude Code, here's the mapping:

File Structure

Claude Code              →  OpenCode
.claude/                 →  .opencode/
.claude/agents/          →  .opencode/agent/
.claude/commands/        →  .opencode/command/
.claude/settings.json    →  opencode.json
.claude/.mcp.json        →  mcp section in opencode.json

Configuration Mapping

Claude Code settings.json:

{
  "permissions": {
    "allow": ["Bash(git add:*)"]
  },
  "enableAllProjectMcpServers": true
}

OpenCode opencode.json:

{
  "permission": {
    "bash": {
      "git add*": "allow",
      "*": "ask"
    }
  },
  "mcp": {
    "my-mcp-server": {
      "enabled": true
    }
  }
}

Agent Migration

Claude Code .claude/agents/blog-writing.md:

---
name: blog-writing
description: Professional blog writing
---
Transform drafts into polished posts...

OpenCode .opencode/agent/blog-writing.md:

---
description: Professional blog writing
mode: subagent
---
Transform drafts into polished posts...

Command Migration

Commands use nearly identical syntax. Simply move files from .claude/commands/ to .opencode/command/ and adjust frontmatter:

Claude Code:

---
name: test
description: Run tests
---

OpenCode:

---
description: Run tests
agent: build
---
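
A minimal migration sketch, assuming the default directory layouts shown above (frontmatter still needs to be adjusted by hand):

# Copy Claude Code agents and commands into the OpenCode layout
mkdir -p .opencode/agent .opencode/command
cp .claude/agents/*.md .opencode/agent/
cp .claude/commands/*.md .opencode/command/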

Practical Workflow Example 🌟

Here's a typical workflow using multiple providers:

# 1. Initialize project with DeepSeek (fast and cheap)
opencode
/models
# Select deepseek/deepseek-chat

# 2. Initial code generation
Add user authentication with JWT tokens

# 3. Switch to KIMI-K2 for complex reasoning
/models
# Select moonshot/moonshot-v1-128k

Analyze the security implications of this auth implementation

# 4. Use Qwen-Code for refactoring
/models
# Select cerebras/qwen3-coder

Refactor the authentication code following best practices

# 5. Final review with GLM-4.6
/models
# Select zai/glm-4

Review the complete authentication system and documentation

This multi-provider approach lets you optimize for cost, speed, and capability at each step.

Key Advantages 🎯

Provider Independence: Switch between 75+ providers without changing your workflow.

Cost Optimization: Use cheaper models for simple tasks, powerful models for complex ones.

Future-Proof: No vendor lock-in. If one provider shuts down, seamlessly switch to another.

Same Features: Agents, commands, MCP servers work identically to Claude Code.

Open Source: Community-driven development and transparency.

Flexible Configuration: JSON or markdown files, global or per-project settings.

Conclusion 🚀

OpenCode delivers the same powerful AI coding experience as Claude Code without vendor lock-in. With support for DeepSeek, KIMI-K2, Qwen-Code, GLM-4.6, and 70+ other providers, you're free to choose the best model for each task.

The migration path is straightforward: similar directory structure, compatible agent/command syntax, and identical MCP integration. Your existing workflows translate directly.

Start building a provider-agnostic AI coding workflow today:

curl -fsSL https://opencode.ai/install | bash
opencode
