Description
This is probably a bit too early for a feature request, but it might still be worth discussing.
In some AI applications like Cursor and Claude Code (and perhaps other vibe-coding products), users can provide extra input while the agent/application is running, without interrupting the execution flow. The agent/application is not explicitly waiting for a human response (as in a standard human-in-the-loop design), but incorporates the input into its current execution in a natural way.
My questions are as follows:
(1) Are there any other AI applications that have this feature right now, and what purpose do they serve?
(2) What’s the recommended way to achieve this in Haystack? My impression is that this might be tricky in Haystack, since the execution loop is rather pre-determined (compared to a pure agent/multi-agent setup), leaving less room to step back and correct course mid-run.
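To make the idea concrete, here is a minimal, framework-agnostic sketch of the pattern I have in mind (this is not Haystack's API; the queue, the `agent_loop` function, and the step names are all hypothetical): user input arrives on a thread-safe queue, and the agent drains it non-blockingly between steps, so execution never pauses to wait for a human.

```python
import queue

# Hypothetical sketch: mid-run user guidance lands in a thread-safe
# queue instead of blocking the agent.
user_inputs: "queue.Queue[str]" = queue.Queue()


def agent_loop(steps):
    context = []
    for step in steps:
        # Non-blocking drain: fold any input that arrived mid-run
        # into the working context before executing the next step.
        while True:
            try:
                context.append(user_inputs.get_nowait())
            except queue.Empty:
                break
        context.append(f"result of {step}")
    return context


# Simulate a user sending guidance while the agent is already running.
user_inputs.put("prefer smaller diffs")
result = agent_loop(["plan", "edit", "review"])
```

In a real pipeline the producer would be a separate thread or async task handling user messages, but the core question is the same: where in Haystack's execution loop could such a between-steps drain point live?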