Routing Codex to a local model keeps prompts on the workstation while still allowing fast switches between hosted and local backends for different coding tasks.
The Codex CLI uses --oss to enter the local routing path for a run, --local-provider to pick Ollama or LM Studio, and -m to select the exact local model identifier. To make --oss default to one backend, set oss_provider in ~/.codex/config.toml; alternatively, a saved profile can pin the concrete provider ID, such as ollama or lmstudio, together with the model.
Routing changes only which backend Codex calls; the provider still needs to be running and the model still needs to be available locally before the prompt is sent. Leaving the provider API bound beyond localhost can expose prompts to other hosts, and running Codex outside a trusted repository can still trigger the trust check before execution starts.
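Before routing a run locally, it can help to confirm the provider is actually listening. A minimal pre-flight sketch, assuming Ollama's default bind of 127.0.0.1:11434 (a non-default OLLAMA_HOST changes both this check and the exposure risk above):

```shell
# Pre-flight check: is the Ollama API reachable on localhost?
# Assumes the default port 11434; adjust if OLLAMA_HOST is set.
if curl -s -o /dev/null --max-time 2 http://127.0.0.1:11434/ 2>/dev/null; then
  echo "ollama: reachable on localhost"
else
  echo "ollama: not reachable on localhost"
fi
```

If the check fails, start the server (for example with ollama serve) before invoking Codex.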
Related: How to use local models with Codex
Related: How to fix Codex trusted directory error
$ ollama list
NAME           ID              SIZE    MODIFIED
gpt-oss:20b    17052f91a42e    13 GB   4 months ago
For LM Studio, use the model identifier shown by the local server and substitute lmstudio in the routing steps below.
Related: [DRAFT] How to list models in Ollama
Related: [DRAFT] How to download a model in LM Studio
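To see which identifiers LM Studio's local server reports, you can query its OpenAI-compatible models endpoint. A sketch assuming the default server address of 127.0.0.1:1234, with a fallback message when the server is not running:

```shell
# List model identifiers from LM Studio's local server
# (assumes the default OpenAI-compatible endpoint on port 1234).
curl -s --max-time 2 http://127.0.0.1:1234/v1/models 2>/dev/null \
  || echo "LM Studio server not reachable on 127.0.0.1:1234"
```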
$ codex exec --oss --local-provider ollama -m gpt-oss:20b "Reply with exactly OK"
OpenAI Codex v0.121.0 (research preview)
--------
model: gpt-oss:20b
provider: ollama
--------
codex
OK
Direct flags override any saved config for that run only.
oss_provider = "ollama"
oss_provider picks the backend for --oss, but it does not choose the local model. Keep passing -m unless a saved profile also pins the model.
$ codex exec --oss -m gpt-oss:20b "Reply with exactly OK"
OpenAI Codex v0.121.0 (research preview)
--------
model: gpt-oss:20b
provider: ollama
--------
codex
OK
[profiles.local_ollama]
model_provider = "ollama"
model = "gpt-oss:20b"
Saved profiles use the concrete provider ID such as ollama or lmstudio rather than the --oss alias.
$ codex exec -p local_ollama "Reply with exactly OK"
OpenAI Codex v0.121.0 (research preview)
--------
model: gpt-oss:20b
provider: ollama
--------
codex
OK
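Following the substitution rule above, an LM Studio profile has the same shape with the provider ID swapped. The model identifier here is a hypothetical example; use the one your LM Studio server actually reports:

```toml
# Saved profile routing to LM Studio instead of Ollama.
# "qwen2.5-coder-7b-instruct" is a placeholder model identifier.
[profiles.local_lmstudio]
model_provider = "lmstudio"
model = "qwen2.5-coder-7b-instruct"
```

Run it with codex exec -p local_lmstudio once the model is loaded in LM Studio.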