OpenCode is an open-source AI coding assistant that runs in your terminal.

Install

Install the OpenCode CLI:
curl -fsSL https://opencode.ai/install.sh | bash
OpenCode requires a large context window; at least 64k tokens is recommended. See Context length for more information.
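For local models, one way to raise the context window is to set the OLLAMA_CONTEXT_LENGTH environment variable before starting the Ollama server; this is a minimal sketch, and 65536 is just one value that meets the 64k recommendation:
# Start the Ollama server with a 64k-token default context window
OLLAMA_CONTEXT_LENGTH=65536 ollama serve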

Usage with Ollama

Quick setup

ollama launch opencode
To configure without launching:
ollama launch opencode --config

Manual setup

Add a configuration block to ~/.config/opencode/opencode.json:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "qwen3-coder": {
          "name": "qwen3-coder"
        }
      }
    }
  }
}
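With this configuration in place, the model also needs to be available locally. Pulling it first is one way to make sure it is ready when OpenCode connects; qwen3-coder here matches the model name used in the config above:
# Download the model referenced in the config, then start opencode
ollama pull qwen3-coder
opencode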

Cloud Models

glm-4.7:cloud is the recommended model for use with OpenCode. Add the cloud configuration to ~/.config/opencode/opencode.json:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "glm-4.7:cloud": {
          "name": "glm-4.7:cloud"
        }
      }
    }
  }
}
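Cloud models are served from ollama.com but requested through the local server at localhost:11434, so the local Ollama instance needs access to them. Depending on your Ollama version, signing in and pulling the model first may be required; this is a sketch, not a definitive setup:
# Sign in to ollama.com and fetch the cloud model reference
ollama signin
ollama pull glm-4.7:cloud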

Connecting to ollama.com

  1. Create an API key on ollama.com and export it as OLLAMA_API_KEY (see the example after the configuration below).
  2. Update ~/.config/opencode/opencode.json to point to ollama.com:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama Cloud",
      "options": {
        "baseURL": "https://ollama.com/v1"
      },
      "models": {
        "glm-4.7:cloud": {
          "name": "glm-4.7:cloud"
        }
      }
    }
  }
}
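For step 1, the key can be exported in the shell where opencode will run, or added to your shell profile so it persists in new terminals; the value shown is a placeholder:
# Replace with the API key created on ollama.com
export OLLAMA_API_KEY=your_api_key_here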
Run opencode in a new terminal to load the new settings.