Description
Basic checks
- I searched existing issues - this hasn't been reported
- I can reproduce this consistently
- This is a RubyLLM bug, not my application code
What's broken?
Google Gemini models support extended reasoning through "thinking" modes, which generate thought signatures that need to be preserved when working with tool calls. When using Gemini models through an OpenAI-compatible endpoint, these thought signatures are transmitted in the extra_content.google.thought_signature field, following the OpenAI Realtime API's extension pattern.
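For context, the assistant tool-call message coming back from the Gemini OpenAI-compatible endpoint looks roughly like the sketch below. The concrete ids and signature value are made up, and placing extra_content on the tool-call entry is an assumption based on the extension pattern described above; the point is only where extra_content.google.thought_signature shows up.
# Illustrative shape of a tool-call message returned by the endpoint (values are hypothetical).
{
  'role' => 'assistant',
  'tool_calls' => [
    {
      'id' => 'call_abc123',                    # hypothetical tool call id
      'type' => 'function',
      'function' => {
        'name' => 'weather',
        'arguments' => '{"latitude":"52.5200","longitude":"13.4050"}'
      },
      'extra_content' => {
        'google' => { 'thought_signature' => 'CsYBAVSo...' } # must be kept for the follow-up request
      }
    }
  ]
}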
How to reproduce
#!/usr/bin/env ruby
# frozen_string_literal: true
require 'bundler/setup'
require 'ruby_llm'
require 'faraday'
require 'json'
# Configure RubyLLM with your OpenAI API key
RubyLLM.configure do |config|
config.openai_api_key = ENV.fetch('OPENAI_API_KEY', nil)
config.openai_api_base = "https://generativelanguage.googleapis.com/v1beta/openai"
end
# Define a Weather tool that fetches real weather data
class Weather < RubyLLM::Tool
description 'Gets current weather for a location'
params do
string :latitude, description: 'Latitude (e.g., 52.5200)'
string :longitude, description: 'Longitude (e.g., 13.4050)'
end
def execute(latitude:, longitude:)
url = "https://api.open-meteo.com/v1/forecast?latitude=#{latitude}&longitude=#{longitude}¤t=temperature_2m,wind_speed_10m"
response = Faraday.get(url)
data = JSON.parse(response.body)
current = data['current']
"Temperature: #{current['temperature_2m']}°C, Wind Speed: #{current['wind_speed_10m']} km/h"
rescue StandardError => e
{ error: e.message }
end
end
# Create a chat with the OpenAI provider and attach the Weather tool
chat = RubyLLM.chat(model: 'google:gemini-3-pro-preview', provider: :openai, assume_model_exists: true)
.with_tool(Weather)
# Ask about the weather
puts "Asking: What's the weather in Berlin? (52.5200, 13.4050)"
puts '-' * 50
response = chat.ask("What's the weather in Berlin? (52.5200, 13.4050)")
puts response.content
Expected behavior
The OpenAI provider should preserve the thought_signature when it is present on a tool call and send it back on the follow-up request after the tool executes, so the Gemini endpoint accepts the conversation.
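A rough sketch of what the follow-up request (sent after the Weather tool runs) would need to contain for Gemini to accept it, reusing the illustrative tool-call object from the sketch above. This shows only the expected payload shape, not how RubyLLM serializes messages internally; all values are hypothetical.
# Sketch of the replayed messages array in the second request (illustrative values).
# The assistant tool-call entry must be echoed back with its thought_signature intact;
# dropping extra_content here is what triggers the error below.
{
  'model' => 'gemini-3-pro-preview',
  'messages' => [
    { 'role' => 'user', 'content' => "What's the weather in Berlin? (52.5200, 13.4050)" },
    {
      'role' => 'assistant',
      'tool_calls' => [
        {
          'id' => 'call_abc123',
          'type' => 'function',
          'function' => { 'name' => 'weather', 'arguments' => '{"latitude":"52.5200","longitude":"13.4050"}' },
          'extra_content' => { 'google' => { 'thought_signature' => 'CsYBAVSo...' } } # preserved
        }
      ]
    },
    { 'role' => 'tool', 'tool_call_id' => 'call_abc123', 'content' => 'Temperature: 8.4°C, Wind Speed: 12.3 km/h' }
  ]
}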
What actually happened
Unable to submit request because function call `weather` in the 2. content block is missing a `thought_signature`. Learn more: https://docs.cloud.google.com/vertex-ai/generative-ai/docs/thought-signatures (RubyLLM::BadRequestError)
Environment
All RubyLLM versions.