Merged · Size: M

Change Breakdown: Bug Fix 65% · Testing 25% · Docs 10%
#66438 · fix(codex): canonicalize the gpt-5.4-codex alias

Legacy Codex alias now canonicalized

The gpt-5.4-codex runtime alias is now mapped to gpt-5.4, so per-model configuration overrides apply whether developers use the legacy name or the canonical model ID.

The OpenAI Codex provider had a legacy model alias that was never canonicalized. When developers configured overrides or templates using the older gpt-5.4-codex identifier, those settings could silently fail to apply because the system did not treat the alias as equivalent to gpt-5.4.

This fix ensures that both the legacy alias and the canonical model ID resolve correctly, with per-model overrides honored under whichever identifier was used in the configuration. In the agent runtime, the change propagates through model resolution so that override lookups check both the alias and the canonical form. Test coverage was added for the provider and the embedded runner to prevent regressions.
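The resolution order described above can be sketched roughly as follows. All names here (`LEGACY_ALIASES`, `canonicalizeModelId`, `resolveOverrides`) are hypothetical illustrations, not the repo's actual API; the real change lives in the openai-codex provider and embedded-runner model resolution.

```typescript
// Hypothetical mapping of legacy runtime aliases to canonical model IDs.
const LEGACY_ALIASES: Record<string, string> = {
  "openai-codex/gpt-5.4-codex": "openai-codex/gpt-5.4",
};

// Resolve an alias to its canonical model ID; unknown IDs pass through.
function canonicalizeModelId(id: string): string {
  return LEGACY_ALIASES[id] ?? id;
}

// Look up per-model overrides under the identifier the config actually used
// first, then fall back to the canonical form, so both spellings work.
function resolveOverrides(
  id: string,
  overrides: Record<string, Record<string, unknown>>,
): Record<string, unknown> | undefined {
  return overrides[id] ?? overrides[canonicalizeModelId(id)];
}
```

With this shape, a config keyed by `openai-codex/gpt-5.4` is found even when the runtime was asked for the legacy `openai-codex/gpt-5.4-codex` name.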

The change lives in the OpenAI Codex provider and embedded agent runner, part of an ongoing effort to clean up model alias handling.

Original GitHub Description

Summary

  • canonicalize the legacy openai-codex/gpt-5.4-codex runtime alias to openai-codex/gpt-5.4
  • preserve both alias-specific and canonical per-model overrides
  • add focused provider and embedded-runner regression coverage
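The second bullet implies a merge rather than a pure fallback: entries keyed by the canonical ID and by the legacy alias can coexist. A minimal sketch of that behaviour, assuming (this precedence is an assumption, not stated in the PR) that alias-specific entries win on conflicts:

```typescript
type Overrides = Record<string, Record<string, unknown>>;

// Hypothetical identifiers from the PR; the merge helper name is invented.
const CANONICAL = "openai-codex/gpt-5.4";
const LEGACY = "openai-codex/gpt-5.4-codex";

// Spread canonical overrides first, then alias-specific ones, so both are
// preserved and the alias-keyed entry takes precedence where they overlap.
function mergedOverrides(overrides: Overrides): Record<string, unknown> {
  return { ...(overrides[CANONICAL] ?? {}), ...(overrides[LEGACY] ?? {}) };
}
```

Under this sketch, a config that sets `temperature` on the canonical ID and `topP` on the legacy alias ends up with both applied to the same resolved model.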

Why

The original alias-gap report is still real on current main, but the old PR carried extra no-op churn. This keeps the runtime fix narrow and covers the real override edge cases.

Supersedes #43060.

Testing

  • pnpm test:serial extensions/openai/openai-codex-provider.test.ts src/agents/pi-embedded-runner/model.test.ts
© 2026 · via Gitpulse