This repository was archived by the owner on Mar 18, 2026. It is now read-only.

Improve chat history handling #64

@drivian

Description


Chat history is silently truncated once it exceeds max_token_limit, because the chat_history object uses LangChain's ConversationTokenBufferMemory class. The default max_token_limit in agent.yml is currently 4k, which is far too small for current LLMs. Propose to:

  • Increase the limit to, say, 20k, and change the default llm to gpt-4o
  • Log a warning when the limit is exceeded (maybe LangChain supports this now?)
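A minimal sketch of the proposed logging behavior, independent of LangChain internals. The helper `trim_history` and its arguments are hypothetical, not part of the existing codebase; the point is that any truncation should emit a warning rather than drop messages silently:

```python
import logging

logger = logging.getLogger(__name__)

def trim_history(messages, token_counts, max_token_limit=20_000):
    """Drop the oldest messages until the history fits the token budget.

    Hypothetical helper: `messages` is a list of chat turns and
    `token_counts` the token count of each turn. Truncation is logged
    instead of happening silently.
    """
    kept = list(zip(messages, token_counts))
    total = sum(token_counts)
    dropped = 0
    while kept and total > max_token_limit:
        _, n = kept.pop(0)  # discard the oldest message first
        total -= n
        dropped += 1
    if dropped:
        logger.warning(
            "Chat history exceeded max_token_limit=%d; dropped %d oldest message(s)",
            max_token_limit,
            dropped,
        )
    return [message for message, _ in kept]
```

With ConversationTokenBufferMemory itself, the equivalent check could compare the token count of the buffer before and after LangChain prunes it, and log when they differ.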
