Labels: bug
Is there an existing issue for the same bug?
- I have searched existing issues and this is not a duplicate.
Bug Description
I ran into an issue while resuming a conversation from persisted state.
- Run the conversation using conversation.run()
- Whenever it encounters an error due to an API rate limit or another system issue, it stores the cost tracking at that point in base_state.json in the persistence dir.
- When I resume the conversation, the costs in the telemetry logs don't get updated, nor are they added to base_state.json.
Suspected root cause
ConversationStats.register_llm() restores metrics into the LLM via llm.restore_metrics(...).
However, restore_metrics() only updates self._metrics and does not update the Telemetry object if it has already been created.
So after restore, these two can diverge:
- llm.metrics → the restored metrics object
- llm.telemetry.metrics → the stale old metrics object
Since telemetry writes token/cost/latency into self.metrics, resumed accounting may go into the wrong object.
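One possible direction for a fix is for restore_metrics() to also rebind an already-created Telemetry to the restored metrics object. The sketch below uses minimal stand-in classes to illustrate the aliasing problem and this fix; these are NOT the real OpenHands SDK classes, just a simplified model of the suspected structure.

```python
# Stand-in classes modeling the suspected LLM/Telemetry/Metrics wiring.
# Hypothetical simplification, not the actual SDK implementation.

class Metrics:
    def __init__(self, model_name: str):
        self.model_name = model_name
        self.accumulated_cost = 0.0


class Telemetry:
    def __init__(self, metrics: Metrics):
        self.metrics = metrics  # telemetry writes token/cost data here

    def log_cost(self, cost: float) -> None:
        self.metrics.accumulated_cost += cost


class LLM:
    def __init__(self, model: str):
        self.model = model
        self._metrics = Metrics(model)
        self._telemetry = None

    @property
    def metrics(self) -> Metrics:
        return self._metrics

    @property
    def telemetry(self) -> Telemetry:
        # Telemetry is created lazily and captures the metrics object
        # that exists at creation time.
        if self._telemetry is None:
            self._telemetry = Telemetry(self._metrics)
        return self._telemetry

    def restore_metrics(self, metrics: Metrics) -> None:
        self._metrics = metrics
        # Sketch of the fix: keep an already-created Telemetry pointed
        # at the restored metrics object, so resumed accounting lands
        # in the same object that gets persisted.
        if self._telemetry is not None:
            self._telemetry.metrics = metrics


llm = LLM("gpt-4o-mini")
_ = llm.telemetry                   # force telemetry creation
restored = Metrics(llm.model)       # metrics loaded from persisted state
llm.restore_metrics(restored)
llm.telemetry.log_cost(0.5)         # cost accrued after resume
print(llm.metrics is llm.telemetry.metrics)  # True with the rebind
print(llm.metrics.accumulated_cost)          # 0.5
```

Without the rebind in restore_metrics(), the final cost would accumulate in the stale pre-restore Metrics object and never reach the restored one, matching the observed divergence.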
Expected Behavior
No response
Actual Behavior
No response
Steps to Reproduce
No response
Installation Method
uv pip install openhands-sdk
If you selected "Other", please specify
No response
SDK Version
1.11.4, main branch
Version Confirmation
- I have confirmed this bug exists on the LATEST version of OpenHands SDK
Python Version
3.12.7
Model Name (if applicable)
gpt-4o, gpt-5.1-codex-mini, GPT-5.2-chat
Operating System
Linux
Logs and Error Messages
No response
Minimal Code Sample
from openhands.sdk.llm.llm import LLM
from openhands.sdk.llm.utils.metrics import Metrics
llm = LLM(model="gpt-4o-mini", api_key="dummy")
# Force telemetry creation
_ = llm.telemetry
old_metrics = llm.metrics
restored = Metrics(model_name=llm.model)
llm.restore_metrics(restored)
print("llm.metrics is restored:", llm.metrics is restored)
print("llm.telemetry.metrics is restored:", llm.telemetry.metrics is restored)
print("llm.telemetry.metrics is llm.metrics:", llm.telemetry.metrics is llm.metrics)
# OBSERVED BEHAVIOR
# llm.metrics is restored: True
# llm.telemetry.metrics is restored: False
# llm.telemetry.metrics is llm.metrics: False
Screenshots and Additional Context
No response