fix: Handle "prompt is too long" from Anthropic #1137
Merged
Description
#1078 reported that context overflows were not handled, but I wasn't able to reproduce the issue using the code changes in that PR. In further testing (using @DEA's suggested test), however, I reproduced it and consistently got a "prompt is too long:" error. This change handles that error.
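The idea behind the fix can be sketched roughly as follows. This is an illustrative sketch only, assuming the agent loop wants a typed exception it can react to; the names `ContextWindowOverflowException`, `classify_model_error`, and the exact matched substring are hypothetical, not the SDK's actual identifiers.

```python
class ContextWindowOverflowException(Exception):
    """Signals that the prompt exceeded the model's context window (hypothetical name)."""


# Substring Anthropic includes in the error message when the prompt is too large.
OVERFLOW_MESSAGE = "prompt is too long"


def classify_model_error(error: Exception) -> Exception:
    """Map a raw provider error to a typed exception the agent loop can act on.

    If the provider reports that the prompt exceeded the context window,
    return a ContextWindowOverflowException so a conversation manager can
    trim the history and retry instead of surfacing a raw API error.
    """
    if OVERFLOW_MESSAGE in str(error):
        return ContextWindowOverflowException(str(error))
    return error


# Example: a raw provider error is converted into the typed exception,
# while unrelated errors pass through unchanged.
raw = RuntimeError("prompt is too long: 250000 tokens > 200000 maximum")
mapped = classify_model_error(raw)
assert isinstance(mapped, ContextWindowOverflowException)
assert classify_model_error(ValueError("rate limited")) is not mapped
```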
I also added a test case using the expected error and validated that it passes. If we swap in a different ConversationManager, the agent loop automatically trims the context (rather than propagating the error), as expected.
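The trim-and-retry behavior described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the SDK's implementation: `run_with_trimming`, `send`, and `ContextWindowOverflowException` are hypothetical names, and real conversation managers use smarter strategies than dropping the oldest message.

```python
class ContextWindowOverflowException(Exception):
    """Signals that the prompt exceeded the model's context window (hypothetical name)."""


def run_with_trimming(messages, send, max_attempts=5):
    """Retry a model call, dropping the oldest message on each context overflow."""
    for _ in range(max_attempts):
        try:
            return send(messages)
        except ContextWindowOverflowException:
            if len(messages) <= 1:
                raise  # nothing left to trim
            messages = messages[1:]  # drop the oldest message and retry
    raise ContextWindowOverflowException("could not fit prompt after trimming")


# Usage with a fake model call that overflows until the history is short enough.
def fake_send(msgs):
    if len(msgs) > 2:
        raise ContextWindowOverflowException("prompt is too long")
    return "ok"


assert run_with_trimming(["a", "b", "c", "d"], fake_send) == "ok"
```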
The DEV Community post "Best practices to handle prompts that are too long for the LLM API (e.g., Anthropic, OpenAI)?" also indicates that this error is not unique to us.
Related Issues
N/A
Documentation PR
N/A
Type of Change
Bug fix
Testing
How have you tested the change? Verify that the changes do not break functionality or introduce warnings in consuming repositories: agents-docs, agents-tools, agents-cli
hatch run prepare
Checklist
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.