fix(js/plugins/compat-oai): update model extraction logic to use slice instead of pop for better path handling #3395


Open: wants to merge 1 commit into main
Conversation

@sd0ric4 commented Aug 10, 2025


While using OpenRouter as the provider, I ran into this issue: the previous extraction logic used .pop(), which keeps only the segment after the last / separator and discards the rest of the model path. That breaks providers whose model IDs themselves contain / separators.

Main Drawback: Cannot Handle Model Names with Multiple '/' Separators

const name = 'openrouter/deepseek/deepseek-chat-v3-0324:free';

// Problem with using .pop(): only the last path segment survives
name.split('/').pop(); // 'deepseek-chat-v3-0324:free'

// The middle segment 'deepseek' (the model family) is discarded,
// so the provider no longer receives the full model ID.
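The slice-based approach named in the PR title can be sketched like this (a minimal sketch; extractModelName is a hypothetical helper for illustration, not the actual plugin function):

```typescript
// Hypothetical helper illustrating the slice-based extraction:
// strip only the plugin prefix (the first segment) and keep the
// remainder of the path, including any further '/' separators.
function extractModelName(name: string): string {
  return name.split('/').slice(1).join('/');
}

// The full provider-facing model ID survives:
console.log(extractModelName('openrouter/deepseek/deepseek-chat-v3-0324:free'));
// → deepseek/deepseek-chat-v3-0324:free
```

With .pop() the same input would collapse to 'deepseek-chat-v3-0324:free', losing the 'deepseek' family segment that OpenRouter needs to resolve the model.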

Former: [screenshot of the extraction result before the change]

Now: [screenshot of the extraction result after the change]

Full usage example:

import {
  compatOaiModelRef,
  defineCompatOpenAIModel,
  openAICompatible,
} from '@genkit-ai/compat-oai';
import { genkit } from 'genkit/beta';

export const deepseekRef = compatOaiModelRef({
  name: 'openrouter/deepseek/deepseek-chat-v3-0324:free',
});

export const ai = genkit({
  plugins: [
    openAICompatible({
      name: 'openrouter',
      apiKey: process.env.OPENROUTER_API_KEY,
      baseURL: 'https://openrouter.ai/api/v1',
      initializer: async (ai, client) => {
        // Register a text model
        defineCompatOpenAIModel({
          ai,
          name: deepseekRef.name,
          client,
          modelRef: deepseekRef,
        });
      },
    }),
  ],
});

const llmResponse = await ai.generate({
  model: deepseekRef,
  prompt: 'Tell me a joke about a llama.',
  config: {
    temperature: 0.9,
  },
});

console.log('llmResponse:', JSON.stringify(llmResponse, null, 2));

console.log('llmResponse (detailed):');
console.dir(llmResponse, { depth: null, colors: true });
