
Meet Redis LangCache: Semantic caching for AI

About

Stop asking the same questions. Start building faster, cheaper AI apps.
Redis’ Jen Agarwal and Amit Lamba, CEO of Mangoes.ai, introduce Redis LangCache, a fully managed semantic cache that stores LLM responses and reuses them for semantically similar queries, cutting token costs by up to 70%, delivering responses 15x faster, and simplifying scaling for chatbots, agents, and retrieval-based apps.
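
For intuition, here is a minimal sketch of the semantic-caching pattern the talk describes: embed each query, reuse the stored response when a new query is close enough to a previous one, and only call the LLM on a miss. This is not the LangCache API; embed(), call_llm(), and the similarity threshold are illustrative stand-ins, and the toy bag-of-words embedding just keeps the sketch self-contained where a real cache would use a proper embedding model and a vector store such as Redis.

```python
import math
from collections import Counter

SIMILARITY_THRESHOLD = 0.8  # tune per workload; higher = stricter matching

# In-memory cache of (query "embedding", cached LLM response) pairs.
# LangCache itself is a managed service; this list just illustrates the idea.
_cache: list[tuple[Counter, str]] = []

def embed(text: str) -> Counter:
    # Toy bag-of-words vector so the sketch runs without a model;
    # a real semantic cache would use a sentence-embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(count * b[token] for token, count in a.items())
    norm_a = math.sqrt(sum(c * c for c in a.values()))
    norm_b = math.sqrt(sum(c * c for c in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def call_llm(query: str) -> str:
    # Hypothetical stand-in for a real LLM call (the expensive path).
    return f"LLM answer for: {query}"

def cached_answer(query: str) -> str:
    q_vec = embed(query)
    # Cache hit: a prior query is semantically close enough, so reuse
    # its stored response and spend no new tokens.
    for vec, response in _cache:
        if cosine(q_vec, vec) >= SIMILARITY_THRESHOLD:
            return response
    # Cache miss: pay for one LLM call, then store it for future reuse.
    response = call_llm(query)
    _cache.append((q_vec, response))
    return response

if __name__ == "__main__":
    print(cached_answer("what is semantic caching"))       # miss: calls the LLM
    print(cached_answer("what is semantic caching then"))  # hit: reuses the answer
```

In practice the threshold is the key knob: set it too low and the cache serves mismatched answers to merely similar questions; set it too high and the hit rate, and the token savings, shrink.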

52 minutes
Key topics
  1. Reduce token usage by up to 70%
  2. Deliver responses 15x faster
  3. Improve user experience without added complexity
Speakers
Jen Agarwal
Product Leader, Redis

Amit Lamba
CEO, Mangoes.ai

Latest content

Redis Released 2025
Banking with ICICI: Delivering a fast user experience at scale
16 minutes
Redis Released 2025
E-commerce & AI panel: See how CP Axtra & Purplle accelerated their sales
32 minutes
Redis Released 2025
How Entain counts on Redis for the biggest gaming events in the world
28 minutes

Get started with Redis today

Speak with a Redis expert to learn more about enterprise-grade Redis.