Feature/compaction truncation #286
Conversation
Hi @janspoerer - this one is still in draft; I was about to do some tidying up in this area, but will hold off for the moment to make any potential merge less painful. What do we need to do to undraft this?
Hi @evalstate, I just realized you also wanted to make merging less painful. Missed that earlier, sorry! Please don't worry about conflicts from your other changes; I'd be happy to resolve any merge conflicts myself.
Hey @janspoerer, I got this error when trying to test your feature:

Full stack:
Opened another PR that covers this feature: #311
Thank you very much for testing the feature! I opened another PR because this one (286) seemed hopeless. I can probably still improve the method by not counting the tokens with this new class. Please note that I've so far only implemented truncation/compaction/summarization for Anthropic and Gemini, not for the OpenAI models. I can see that you used o4; the OpenAI models would be next on my list to support.
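For context, the truncation side of what's discussed above can be sketched roughly like this. Everything here is illustrative, not the actual implementation in this PR: the function names are made up, and the ~4-characters-per-token estimate is a crude stand-in for the provider's real tokenizer (which is presumably what the new token-counting class would avoid recomputing).

```python
# Hedged sketch: truncate a chat history to fit a token budget by
# dropping the oldest messages first. All names are hypothetical;
# a real implementation would use the provider's tokenizer.

def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def truncate_history(messages: list[dict], max_tokens: int) -> list[dict]:
    """Keep the newest messages that fit within max_tokens.
    The most recent message is always kept, even if it alone
    exceeds the budget."""
    kept: list[dict] = []
    total = 0
    for msg in reversed(messages):  # walk newest -> oldest
        cost = estimate_tokens(msg["content"])
        if kept and total + cost > max_tokens:
            break  # adding this older message would bust the budget
        kept.append(msg)
        total += cost
    return list(reversed(kept))  # restore chronological order
```

Compaction/summarization would replace the dropped prefix with a model-generated summary message instead of discarding it outright, but the budget check is the same.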
I did make an ad-hoc fix for the error displayed earlier that worked for me in this branch, but I'm currently checking out the branch you mentioned. I'm trying to get it working with Gemini, but encountered an error that I can post later. I would be extremely grateful if you could implement support for OpenAI/Azure ASAP; that would be very beneficial to the project I'm working on!
No description provided.