## Install
Install the Codex CLI:
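For example, via npm (this assumes Node.js is available):

```sh
# install the Codex CLI globally via npm
npm install -g @openai/codex
```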
## Usage with Ollama

Codex requires a larger context window than Ollama's default; a context window of at least 32K tokens is recommended.
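One way to raise the context window is Ollama's `OLLAMA_CONTEXT_LENGTH` environment variable, assuming a recent Ollama release that supports it:

```sh
# serve models with a 32K-token context window
OLLAMA_CONTEXT_LENGTH=32768 ollama serve
```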
To use `codex` with Ollama, use the `--oss` flag:
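```sh
# launch codex against a locally served open-weight model
codex --oss
```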
### Changing models
By default, `codex` will use the local `gpt-oss:20b` model. However, you can specify a different model with the `-m` flag:
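```sh
# example: switch to the larger gpt-oss model
codex --oss -m gpt-oss:120b
```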
## Cloud models
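Cloud models should work with the same `-m` flag, using a cloud model tag (the tag below is an example):

```sh
# example: run codex against an Ollama cloud model
codex --oss -m gpt-oss:120b-cloud
```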
### Connect to ollama.com directly
Create an API key from ollama.com and export it as `OLLAMA_API_KEY`.
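For example:

```sh
# make the key available to codex (replace with your own key)
export OLLAMA_API_KEY=your_api_key
```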
To use ollama.com directly, edit your `~/.codex/config.toml` file to point to ollama.com.
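A provider entry along the following lines should work, assuming Codex's `model_providers` configuration format; the provider id `ollama` and the model name are illustrative:

```toml
# ~/.codex/config.toml

# send requests to ollama.com, authenticating with OLLAMA_API_KEY
[model_providers.ollama]
name = "Ollama"
base_url = "https://ollama.com/v1"
env_key = "OLLAMA_API_KEY"

# use this provider and model by default (illustrative values)
model_provider = "ollama"
model = "gpt-oss:120b"
```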
Run `codex` in a new terminal to load the new settings.