ConversationBufferMemory Core Functions
- Conversation History Storage: Saves complete conversation records, including user inputs and system responses
- Context Maintenance: Provides complete conversation context for subsequent conversation turns
- Buffer Management: Stores the entire history by default, with no truncation; for a capped buffer, LangChain provides ConversationBufferWindowMemory, which keeps only the most recent k turns
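The three functions above can be illustrated with a minimal plain-Python sketch (this is not the LangChain implementation; the class name and `max_turns` parameter are hypothetical, and the optional cap mirrors the windowed variant rather than ConversationBufferMemory itself):

```python
class BufferMemory:
    """Toy sketch of a conversation buffer memory."""

    def __init__(self, max_turns=None):
        self.history = []           # conversation history storage: (role, text) pairs
        self.max_turns = max_turns  # None = keep everything, like ConversationBufferMemory

    def save_context(self, user_input, output):
        # Store both sides of one conversation turn.
        self.history.append(("human", user_input))
        self.history.append(("ai", output))
        if self.max_turns is not None:
            # Buffer management: keep only the most recent turns (windowed variant).
            self.history = self.history[-2 * self.max_turns:]

    def load_memory_variables(self):
        # Context maintenance: hand the stored context to the next turn.
        return {"history": self.history}


memory = BufferMemory(max_turns=2)
memory.save_context("hi", "hello")
memory.save_context("how are you?", "fine")
memory.save_context("bye", "goodbye")
print(len(memory.load_memory_variables()["history"]))  # 4: only the last 2 turns remain
```

With `max_turns=None` the buffer grows without bound, which is exactly the trade-off ConversationBufferMemory makes: complete context at the cost of an ever-longer prompt.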
Application Scenarios
- Chatbots: Maintain conversation context so the bot can understand and respond coherently across multiple turns
- Customer Service Systems: Save complete conversation history between customers and agents
- Diagnostic Systems: Record complete interaction processes of symptom descriptions and diagnostic suggestions
- Teaching Systems: Save Q&A history between students and teaching systems
Code Example
from langchain.memory import ConversationBufferMemory
memory = ConversationBufferMemory(return_messages=True)
memory.save_context({"input": "hi im wzk"}, {"output": "Hello Wzk!"})
# Subsequent conversation turns can read the saved context:
print(memory.load_memory_variables({}))  # {'history': [HumanMessage(...), AIMessage(...)]}
Running Results
- First time asking “What is my name?”: The model doesn’t know
- After saving the memory and asking again: The model answers “Your name is ‘wzk’”
This demonstrates how ConversationBufferMemory carries conversation context across turns in an LLM application; note the buffer lives in memory for the current session and is not persisted to disk.
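The before/after behavior can be simulated without an LLM: whether the model can answer "What is my name?" depends entirely on whether the saved history is injected into the prompt. A stdlib-only sketch, where `answer` is a hypothetical stub standing in for the LLM call:

```python
def answer(question, history):
    # Stub LLM: it only "knows" the name if the saved history is in the prompt.
    prompt = "\n".join(f"{role}: {text}" for role, text in history)
    if "what is my name" in question.lower():
        if "wzk" in prompt.lower():
            return "Your name is 'wzk'"
        return "I don't know your name."
    return "..."


history = []
print(answer("What is my name?", history))  # first ask: the model doesn't know

# Save the earlier exchange, as ConversationBufferMemory would:
history += [("human", "hi im wzk"), ("ai", "Hello Wzk!")]
print(answer("What is my name?", history))  # now: "Your name is 'wzk'"
```

The only difference between the two calls is the history passed in, which is precisely what the memory object contributes on each turn.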