fix(validation): add nvidia-nim to shouldSkipResponsesProbe #2505
Conversation
✨ Thanks for submitting this pull request. It fixes a bug where the `nvidia-nim` provider was missing from the `shouldSkipResponsesProbe` list, and adds it so that all supported providers are handled consistently.
`nvidia-prod` and `nvidia-nim` map to the same NVIDIA endpoint config
in `inference-config.ts` — same base URL, same credential env, same
`providerLabel: "NVIDIA Endpoints"`. Neither supports `/v1/responses`.
`nvidia-prod` was already listed in `shouldSkipResponsesProbe` but
`nvidia-nim` was missing, so sandboxes onboarded with the `nvidia-nim`
provider incorrectly ran the Responses probe against an endpoint that
does not expose it.
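
The change described above can be sketched as follows. The names `shouldSkipResponsesProbe` and the skip-list constant are assumptions based on this description; the real implementation in the repository may be structured differently.

```typescript
// Hypothetical sketch of the fix: providers whose endpoints do not
// expose /v1/responses, so the Responses probe must be skipped.
const RESPONSES_PROBE_SKIP_LIST = new Set<string>([
  "nvidia-prod",
  "nvidia-nim", // added: maps to the same NVIDIA endpoint config as nvidia-prod
]);

function shouldSkipResponsesProbe(provider: string): boolean {
  return RESPONSES_PROBE_SKIP_LIST.has(provider);
}

// Before this fix, "nvidia-nim" returned false here and the probe ran
// against an endpoint that does not support /v1/responses.
console.log(shouldSkipResponsesProbe("nvidia-nim"));
```

Since both provider IDs share one endpoint config, an alternative would be to key the skip decision off the resolved endpoint (or its `providerLabel`) rather than the provider ID, so future aliases cannot drift out of sync.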
Signed-off-by: ColinM-sys <cmcdonough@50words.com>