LLM session management with Redis
About
When you’re building with LLMs, memory matters. Here, Ricardo Ferreira shows you how to give your AI app a brain—by storing and reusing conversation history with LangChain and Redis. See how to connect chat memory to an OpenAI-powered LLM so your app can pick up right where it left off.
Key topics
- Build chat memory using LangChain and Redis
- Reuse past messages to make LLM responses smarter and more contextual
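As a rough idea of the pattern covered in the session, here is a minimal sketch (not Ricardo's exact code) that wires a Redis-backed chat history into a LangChain chain with an OpenAI model. The model name, Redis URL, and session ID are assumptions, and package layouts vary across LangChain versions.

```python
# Minimal sketch: Redis-backed chat memory for a LangChain + OpenAI chain.
# Assumes langchain-openai and langchain-community are installed, a local
# Redis at redis://localhost:6379, and OPENAI_API_KEY in the environment.
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_community.chat_message_histories import RedisChatMessageHistory

# Prompt with a placeholder where the stored conversation history is injected.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder(variable_name="history"),
    ("human", "{input}"),
])

llm = ChatOpenAI(model="gpt-4o-mini")  # hypothetical model choice
chain = prompt | llm

# Each session_id maps to its own message list in Redis, so a returning
# user picks the conversation up right where it left off.
def get_history(session_id: str) -> RedisChatMessageHistory:
    return RedisChatMessageHistory(session_id, url="redis://localhost:6379/0")

chat = RunnableWithMessageHistory(
    chain,
    get_history,
    input_messages_key="input",
    history_messages_key="history",
)

# Messages are persisted in Redis after each call and replayed on the next one.
config = {"configurable": {"session_id": "user-42"}}
chat.invoke({"input": "My name is Ricardo."}, config=config)
chat.invoke({"input": "What is my name?"}, config=config)  # history supplies the answer
```

Because the history lives in Redis rather than in process memory, any instance of the app can serve the next turn of the conversation, and a TTL can be set on the stored messages if sessions should expire.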
Speakers

Ricardo Ferreira
Principal Developer Advocate