Description
Checked other resources
- This is a bug, not a usage question.
- I added a clear and descriptive title that summarizes this issue.
- I used the GitHub search to find a similar question and didn't find it.
- I am sure that this is a bug in LangChain rather than my code.
- The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
- This is not related to the langchain-community package.
- I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.
Package (Required)
- langchain
- langchain-openai
- langchain-anthropic
- langchain-classic
- langchain-core
- langchain-cli
- langchain-model-profiles
- langchain-tests
- langchain-text-splitters
- langchain-chroma
- langchain-deepseek
- langchain-exa
- langchain-fireworks
- langchain-groq
- langchain-huggingface
- langchain-mistralai
- langchain-nomic
- langchain-ollama
- langchain-perplexity
- langchain-prompty
- langchain-qdrant
- langchain-xai
- Other / not sure / general
Example Code (Python)
from langchain_core.prompts import PromptTemplate
from langchain.agents import create_agent
from langchain_core.messages import HumanMessage
from langchain_openai import AzureChatOpenAI
import os
from dotenv import load_dotenv
from prompt_registry import REQUIREMENT_GATHERING_INSTRUCTION, SEARCH_HOTELS_INSTRUCTION
from uuid import uuid4
load_dotenv()
from langchain.agents.middleware import dynamic_prompt, ModelRequest
from langchain.agents.middleware import AgentState
from langchain_core.globals import set_debug
from langgraph.checkpoint.memory import InMemorySaver
from datetime import datetime
class GraphState(AgentState):
    requirements_gathered: bool = False

@dynamic_prompt
def change_system_prompt(request: ModelRequest):
    if request.state["requirements_gathered"]:
        return SEARCH_HOTELS_INSTRUCTION
    else:
        return REQUIREMENT_GATHERING_INSTRUCTION
system_prompt = PromptTemplate.from_template(REQUIREMENT_GATHERING_INSTRUCTION)
system_prompt = system_prompt.format(now=datetime.now())
llm = AzureChatOpenAI(
    api_version="2024-12-01-preview",
    azure_deployment="gpt-4o-mini",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
)
agent = create_agent(
    system_prompt=system_prompt,
    model=llm,
    tools=[],
    checkpointer=InMemorySaver(),
    state_schema=GraphState,
    middleware=[change_system_prompt],
)
user_input = input("Enter your query: ")
response = agent.invoke(
    {"messages": [HumanMessage(user_input)]}, {'configurable': {'thread_id': uuid4()}}
)
last_msg = response["messages"][-1]
print(last_msg.content)
Error Message and Stack Trace (if applicable)
Exception has occurred: KeyError (note: full exception trace is shown but execution is paused at: _run_module_as_main)
'requirements_gathered'
File "path\to\workspace\test.py", line 24, in change_system_prompt
if request.state["requirements_gathered"]:
~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
File "path\to\workspace\.venv\Lib\site-packages\langchain\agents\middleware\types.py", line 1297, in wrapped
prompt = cast("str", func(request))
~~~~^^^^^^^^^
File "path\to\workspace\.venv\Lib\site-packages\langchain\agents\factory.py", line 134, in normalized_single
result = single_handler(request, handler)
File "path\to\workspace\.venv\Lib\site-packages\langchain\agents\factory.py", line 1098, in model_node
response = wrap_model_call_handler(request, _execute_model_sync)
File "path\to\workspace\.venv\Lib\site-packages\langgraph\_internal\_runnable.py", line 400, in invoke
ret = self.func(*args, **kwargs)
File "path\to\workspace\.venv\Lib\site-packages\langgraph\_internal\_runnable.py", line 656, in invoke
input = context.run(step.invoke, input, config, **kwargs)
File "path\to\workspace\.venv\Lib\site-packages\langgraph\pregel\_retry.py", line 42, in run_with_retry
return task.proc.invoke(task.input, config)
~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^
File "path\to\workspace\.venv\Lib\site-packages\langgraph\pregel\_runner.py", line 167, in tick
t,
...<12 lines>...
File "path\to\workspace\.venv\Lib\site-packages\langgraph\pregel\main.py", line 2633, in stream
[t for t in loop.tasks.values() if not t.writes],
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<4 lines>...
File "path\to\workspace\.venv\Lib\site-packages\langgraph\pregel\main.py", line 3050, in invoke
input,
...<12 lines>...
File "path\to\workspace\test.py", line 47, in <module>
{"messages": [HumanMessage(user_input)]}, {'configurable': {'thread_id': uuid4()}}
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "C:\Users\Ayush Gupta\AppData\Roaming\uv\python\cpython-3.13.9-windows-x86_64-none\Lib\runpy.py", line 88, in _run_code
exec(code, run_globals)
File "C:\Users\Ayush Gupta\AppData\Roaming\uv\python\cpython-3.13.9-windows-x86_64-none\Lib\runpy.py", line 198, in _run_module_as_main (Current frame)
return _run_code(code, main_globals, None,
KeyError: 'requirements_gathered'
Description
I am using langchain 1.0.7. I want to change the system prompt based on the state of my graph, so I passed a custom graph state to the create_agent function via the state_schema argument.
I expected the new key 'requirements_gathered' (with its default of False) to be present in the state inside the dynamic_prompt middleware, but accessing it raises a KeyError instead.
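As a stopgap until this is resolved, the lookup can be guarded with dict-style .get() so a missing key falls back to the intended default instead of raising. The sketch below is an assumption about a workaround, not a confirmed fix; it stands in a plain dict for request.state to show the pattern, and the two instruction strings are placeholders for the real prompts from prompt_registry:

```python
# Placeholders standing in for the real prompts in prompt_registry.
SEARCH_HOTELS_INSTRUCTION = "You search for hotels..."
REQUIREMENT_GATHERING_INSTRUCTION = "You gather the user's requirements..."

def change_system_prompt(state: dict) -> str:
    # Same branching as the middleware, but .get() with a default
    # avoids the KeyError when the custom key is absent from state
    # (e.g. on the first turn, before any update wrote it).
    if state.get("requirements_gathered", False):
        return SEARCH_HOTELS_INSTRUCTION
    return REQUIREMENT_GATHERING_INSTRUCTION
```

The same .get() call should work inside the @dynamic_prompt function, since request.state is dict-like, though that only papers over the missing default rather than explaining why state_schema does not apply it.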
System Info
System Information
OS: Windows
OS Version: 10.0.26100
Python Version: 3.13.9 (main, Oct 28 2025, 12:03:59) [MSC v.1944 64 bit (AMD64)]
Package Information
langchain_core: 1.0.5
langchain: 1.0.7
langsmith: 0.4.43
langchain_mcp_adapters: 0.1.13
langchain_ollama: 1.0.0
langchain_openai: 1.0.3
langgraph_sdk: 0.2.9
Optional packages not installed
langserve
Other Dependencies
httpx: 0.28.1
jsonpatch: 1.33
langgraph: 1.0.3
mcp: 1.21.2
ollama: 0.6.1
openai: 2.8.0
opentelemetry-api: 1.38.0
opentelemetry-exporter-otlp-proto-http: 1.38.0
opentelemetry-sdk: 1.38.0
orjson: 3.11.4
packaging: 25.0
pydantic: 2.12.4
pyyaml: 6.0.3
requests: 2.32.5
requests-toolbelt: 1.0.0
rich: 14.2.0
tenacity: 9.1.2
tiktoken: 0.12.0
typing-extensions: 4.15.0
zstandard: 0.25.0