🎯 What You'll Learn Today
LangGraph Tutorial: Dynamic Conversation Summarization - Unit 1.2 Exercise 4
This tutorial is also available in Google Colab here or for download here
Joint Initiative: This tutorial is part of a collaboration between AI Product Engineer and the Nebius Academy.
This tutorial demonstrates how to implement dynamic conversation summarization in LangGraph, enabling efficient context management through automated summary generation: the system maintains conversation context by condensing the message history into a running summary.
Key Concepts Covered
- Dynamic Summary Generation
- Message History Processing
- State-Based Summary Management
- Graph Integration of Summarization Logic
#!pip install langchain-core
#!pip install langgraph

from typing import Annotated, TypedDict

from langchain_core.messages import BaseMessage, HumanMessage
from langgraph.graph import START, StateGraph
from langgraph.graph.message import add_messages
Step 1: State Definition with Summary Support
We define a state structure that combines message management with summary tracking capabilities.
class State(TypedDict):
    """Advanced state container with summary management capabilities.

    This implementation demonstrates three key features:
    1. Message storage with LangGraph annotations
    2. Dynamic summary tracking
    3. Configurable history window

    Attributes:
        messages: Annotated list of conversation messages
        summary: Running summary of conversation context
        window_size: Maximum messages to maintain before summarization

    Note:
        The summary field is crucial for maintaining context when
        older messages are pruned from the window.
    """

    messages: Annotated[list[BaseMessage], add_messages]
    summary: str
    window_size: int
Why This Matters
Dynamic summarization is essential because it:
- Enables long-running conversations while keeping memory usage bounded
- Preserves crucial context even after older messages are pruned
- Supports more informed agent responses
- Reduces token usage in LLM calls
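To make the memory benefit concrete, here is a minimal standalone sketch (plain strings stand in for LangChain message objects, and `prune_with_summary` is a hypothetical helper, not part of the tutorial's API) showing how a running summary preserves context that pruning would otherwise discard:

```python
def prune_with_summary(
    messages: list[str], summary: str, window_size: int
) -> tuple[list[str], str]:
    """Keep only the newest `window_size` messages; fold the rest into the summary."""
    if len(messages) <= window_size:
        return messages, summary
    pruned, kept = messages[:-window_size], messages[-window_size:]
    addition = " -> ".join(pruned)
    summary = f"{summary} -> {addition}" if summary else addition
    return kept, summary


messages = ["Hello", "How are you", "Fine thanks", "Goodbye"]
kept, summary = prune_with_summary(messages, "", window_size=2)
print(kept)     # ['Fine thanks', 'Goodbye']
print(summary)  # Hello -> How are you
```

Only two messages remain in the active window, yet the pruned turns survive as summary text that can still be fed to the model.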
Step 2: Summary Generation Implementation
We implement the core summarization logic for conversation tracking.
def summary_generation(state: State) -> State:
    """Generate and maintain dynamic conversation summaries.

    This function implements several advanced concepts:
    1. Threshold-based summary generation
    2. Context preservation through summarization
    3. State-aware summary updates

    The summarization process follows this flow:
    1. Check the message threshold
    2. Process the message history
    3. Generate a contextual summary
    4. Update the state with the new summary

    Args:
        state: Current conversation state with messages and existing summary

    Returns:
        State: Updated state with the new summary

    Example:
        >>> messages = [
        ...     HumanMessage(content="Hello"),
        ...     HumanMessage(content="How are you"),
        ...     HumanMessage(content="Goodbye"),
        ... ]
        >>> state = {"messages": messages, "summary": "", "window_size": 3}
        >>> result = summary_generation(state)
        >>> print(result["summary"])
        Conversation summary: Hello -> How are you -> Goodbye
    """
    # Summarize once the history reaches the configured window size.
    if len(state["messages"]) >= state["window_size"]:
        messages_text = " -> ".join(m.content for m in state["messages"])
        state["summary"] = f"Conversation summary: {messages_text}"
    return state
Debug Tips
- Summary Generation Issues:
- Log message contents before summarization
- Verify summary format consistency
- Check threshold conditions
- State Management:
- Monitor summary field updates
- Validate message processing
- Check state preservation
- Common Errors:
- AttributeError: Verify message object structure
- String formatting issues: Check message content types
- State mutation problems: Verify proper state handling
Step 3: Graph Integration
We integrate our summarization logic into the LangGraph structure.
# Initialize the graph with summary support
graph = StateGraph(State)

# Add the summarizer node
graph.add_node("summarizer", summary_generation)

# Configure the entry point
graph.add_edge(START, "summarizer")
Why This Matters
Proper graph integration ensures:
- Automatic summary updates
- Consistent state management
- Scalable conversation processing
Key Takeaways
- Summary Management:
- Use thresholds for efficient processing
- Maintain context through summarization
- Integrate with message windowing
- State Handling:
- Update summaries atomically
- Preserve message context
- Maintain state consistency
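The "update summaries atomically" takeaway can be sketched as follows: instead of mutating the input state in place, build and return a new dict, so a partially applied update never leaks back to the caller. (`summarize_atomically` is a hypothetical variant for illustration; plain dicts and strings stand in for the LangGraph state and message objects.)

```python
def summarize_atomically(state: dict) -> dict:
    """Return a new state with the summary set; never mutate the input."""
    messages = state["messages"]
    if len(messages) < state["window_size"]:
        return state  # below threshold: hand back the original state unchanged
    summary = "Conversation summary: " + " -> ".join(messages)
    return {**state, "summary": summary}  # fresh dict; input is left untouched


before = {"messages": ["Hello", "How are you", "Goodbye"],
          "summary": "", "window_size": 3}
after = summarize_atomically(before)
print(before["summary"])  # '' — the original state is untouched
print(after["summary"])   # Conversation summary: Hello -> How are you -> Goodbye
```

This style also plays well with LangGraph's reducer-based state updates, where nodes conventionally return partial updates rather than mutated inputs.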
Common Pitfalls
- Missing message validation
- Inefficient summary generation
- Poor summary format design
- Incomplete state updates
Next Steps
- Add intelligent summary compression
- Implement topic extraction
- Add summary-based routing
- Integrate with LLM for better summaries
Example Usage
messages = [
    HumanMessage(content="Hello"),
    HumanMessage(content="How are you"),
    HumanMessage(content="Goodbye"),
]
initial_state = {"messages": messages, "summary": "", "window_size": 3}
result = summary_generation(initial_state)
print(f"Generated Summary: {result['summary']}")
Variations and Extensions
- Enhanced Summarization:
- Use an LLM for semantic summarization
- Implement topic-based summary clustering
- Example use case: complex conversation tracking
- Conditional Summarization:
- Topic-based summary triggers
- Importance-based summarization
- Example scenario: adaptive conversation management
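A conditional trigger like the ones listed above might look like this minimal sketch, where summarization fires either on message count or on the presence of (hypothetical, illustrative) topic keywords — plain strings again stand in for message objects:

```python
TOPIC_KEYWORDS = {"order", "refund", "invoice"}  # illustrative trigger topics


def should_summarize(messages: list[str], window_size: int) -> bool:
    """Fire on message count, or early when an important topic appears."""
    if len(messages) >= window_size:
        return True
    return any(word.lower() in TOPIC_KEYWORDS
               for m in messages for word in m.split())


print(should_summarize(["Hello"], window_size=3))                     # False
print(should_summarize(["Where is my refund"], window_size=3))        # True
print(should_summarize(["Hi", "Hey", "Hello there"], window_size=3))  # True
```

In a real system the keyword check would likely be replaced by an LLM-based importance or topic classifier, but the gating structure stays the same.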
Expected Output
Generated Summary: Conversation summary: Hello -> How are you -> Goodbye