fix: support non-OAuth auth in login status check#71

Open

sudev-chirappat wants to merge 5 commits into openai:main from sudev-chirappat:fix/custom-provider-auth

Conversation

@sudev-chirappat sudev-chirappat commented Mar 31, 2026

Summary

  • codex login status only validates OAuth tokens, causing the plugin to reject users authenticated via OPENAI_API_KEY or a custom model_provider in config.toml (Azure, Bedrock, corporate gateways, etc.)
  • Adds two fallback checks in getCodexLoginStatus() before calling codex login status:
    1. OPENAI_API_KEY environment variable
    2. Custom model_provider in ~/.codex/config.toml (detected via grep to avoid loading secrets into memory)

Fixes #21
Fixes #58
Related: #63
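The two fallback checks described above can be sketched roughly as follows. This is an illustrative sketch only, not the plugin's actual implementation: `checkAuthFallbacks` and `parseModelProvider` are hypothetical names, and the real code reads `~/.codex/config.toml` from disk rather than taking the TOML as a parameter.

```javascript
// Hypothetical sketch of the fallback order: env var first, then a custom
// model_provider in config.toml, else fall through to the OAuth check.
function parseModelProvider(toml) {
  // Matches e.g.  model_provider = "azure"  or  model_provider = 'bedrock'
  const m = toml.match(/^\s*model_provider\s*=\s*["']([^"']+)["']/m);
  return m ? m[1] : null;
}

function checkAuthFallbacks(env, configToml) {
  // 1. OPENAI_API_KEY environment variable
  if (env.OPENAI_API_KEY) return { loggedIn: true, method: "api-key" };
  // 2. Custom (non-default) model_provider in config.toml
  const provider = parseModelProvider(configToml || "");
  if (provider && provider !== "openai") {
    return { loggedIn: true, method: `provider:${provider}` };
  }
  // Neither fallback applies: caller runs `codex login status` as before
  return null;
}
```

The default `openai` provider deliberately returns `null` so that the existing OAuth path still runs, matching the third item in the test plan.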

Test plan

  • Verify OPENAI_API_KEY env var bypasses the OAuth check
  • Verify custom model_provider in config.toml (e.g. Azure, Bedrock) bypasses the OAuth check
  • Verify default openai provider still falls through to codex login status
  • Verify /codex:review and /codex:task work end-to-end with custom provider auth

🤖 Generated with Claude Code

@sudev-chirappat sudev-chirappat requested a review from a team March 31, 2026 16:04

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: b2083a9efd

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 38a6cbed2c


sudev-chirappat and others added 3 commits April 1, 2026 10:50
`codex login status` only validates OAuth tokens, causing the plugin to
reject users who authenticate via OPENAI_API_KEY or a custom
model_provider in config.toml (Azure, Bedrock, corporate gateways).

Check these auth methods before falling through to `codex login status`.

Fixes openai#21

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…port)

Replace grep shell-out with safeReadFile to fix Windows compatibility
(grep unavailable) and handle single-quoted TOML values for
model_provider.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Project config takes precedence over global config, matching Codex CLI
behavior for provider resolution.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@sudev-chirappat (Author)

Note on remaining auth paths: The one potentially missing edge case is OPENAI_BASE_URL — if someone sets a custom base URL pointing to a non-OpenAI endpoint but doesn't set model_provider, they'd still fall through to codex login status. However, this is an unusual setup and OPENAI_API_KEY would typically also be set in that scenario, which we already catch. Happy to add a check for it if maintainers think it's worth covering.
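If maintainers do want to cover it, a check along these lines would suffice. This is purely illustrative and not part of the PR; `hasNonDefaultBaseUrl` is a hypothetical helper, and treating any non-`api.openai.com` host as a custom endpoint is an assumption about how such a check would behave.

```javascript
// Hypothetical OPENAI_BASE_URL fallback: treat any host other than the
// default OpenAI API endpoint as a custom gateway.
function hasNonDefaultBaseUrl(env) {
  const url = env.OPENAI_BASE_URL;
  if (!url) return false;
  try {
    return new URL(url).hostname !== "api.openai.com";
  } catch {
    return false; // malformed URL: don't treat it as valid auth
  }
}
```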


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 20599a2fb0


Use resolveWorkspaceRoot (git toplevel) instead of raw cwd for
project-level config lookup, and permit leading whitespace in
model_provider regex to handle indented TOML entries.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@sudev-chirappat force-pushed the fix/custom-provider-auth branch from 20599a2 to 958487b on April 1, 2026 at 02:56

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 958487b3d7


Only read project-level .codex/config.toml when the workspace is marked
as trusted in the global config, matching Codex CLI behavior.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 83a32d5f9d



```javascript
function isProjectTrusted(globalConfig, workspaceRoot) {
  // Escape regex metacharacters in the workspace path
  const escaped = workspaceRoot.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  const re = new RegExp(`\\[projects\\."${escaped}"\\][\\s\\S]*?trust_level\\s*=\\s*["']?trusted`);
  return re.test(globalConfig);
}
```

P2: Restrict trust lookup to the matched project section

The trust check regex can cross section boundaries, so an untrusted workspace can be treated as trusted if any later [projects."..."] block in ~/.codex/config.toml has trust_level = "trusted". In isProjectTrusted, \[projects\."<path>"\][\s\S]*?trust_level... is not bounded to the current table, and getCodexLoginStatus then reads project .codex/config.toml and may report loggedIn: true for custom providers even though Codex would ignore that untrusted project config.
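One way to bound the match to the current table, as the review suggests, is to capture the section body only up to the next table header. A sketch under that assumption (not necessarily how the PR resolves the comment):

```javascript
// Bounded variant: capture the [projects."<path>"] body up to the next
// "[" table header (or end of file), then look for trust_level inside it.
function isProjectTrusted(globalConfig, workspaceRoot) {
  const escaped = workspaceRoot.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  const section = new RegExp(
    `\\[projects\\."${escaped}"\\]([\\s\\S]*?)(?=\\n\\s*\\[|$)`
  );
  const m = globalConfig.match(section);
  if (!m) return false;
  return /(^|\n)\s*trust_level\s*=\s*["']?trusted/.test(m[1]);
}
```

With this bound, a `trust_level = "trusted"` in a *later* `[projects."..."]` block no longer leaks into the untrusted workspace's lookup.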


robinmordasiewicz added a commit to f5xc-salesdemos/codex-plugin-cc that referenced this pull request Apr 1, 2026
getCodexLoginStatus() only validates OAuth tokens via `codex login
status`, causing the plugin to reject users authenticated via API key
through a custom model_provider (LiteLLM, Azure, Bedrock, etc.).

Add detectApiKeyAuth() fallback that reads ~/.codex/config.toml, finds
the active model_provider's env_key, and checks if that environment
variable is set. Falls back to checking OPENAI_API_KEY directly.

Fixes #1
Upstream: openai#58
Upstream PR: openai#71

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
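The `env_key` lookup that commit describes might look roughly like this. Only the `detectApiKeyAuth` name comes from the commit message; the regexes, parameter shape, and the assumption that provider names are plain identifiers are all illustrative.

```javascript
// Sketch: find the active model_provider's env_key in config.toml and
// check whether that environment variable is set; otherwise fall back
// to OPENAI_API_KEY. Provider names are assumed to be simple identifiers
// (no regex metacharacters), so prov[1] is interpolated unescaped.
function detectApiKeyAuth(configToml, env) {
  const prov = configToml.match(/^\s*model_provider\s*=\s*["']([^"']+)["']/m);
  if (prov) {
    // Capture the [model_providers.<name>] body up to the next table header
    const section = new RegExp(
      `\\[model_providers\\.${prov[1]}\\]([\\s\\S]*?)(?=\\n\\s*\\[|$)`
    );
    const body = configToml.match(section);
    const envKey =
      body && body[1].match(/^\s*env_key\s*=\s*["']([^"']+)["']/m);
    if (envKey && env[envKey[1]]) return true;
  }
  // Fall back to the standard variable
  return Boolean(env.OPENAI_API_KEY);
}
```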
