
Software Engineering Daily Redis and AI Agent Memory with Andrew Brookins
Aug 26, 2025

In this engaging discussion, Andrew Brookins, a Principal Applied AI Engineer at Redis, shares insights into the challenges of building AI agents. He explains how large language models' statelessness affects continuity and the critical role of memory management. Topics include the significance of fast data retrieval in AI systems, advancements in Redis like vector search and semantic caching, and the comparison between hybrid search and vector-only methods. Andrew also touches on the complexities of maintaining relevant memory and the development of effective world models for dynamic environments.
AI Snips
Memory Abstractions Are Complicated By DB Quirks
- Standard memory APIs could help, but databases have different query quirks that complicate swapping providers.
- Differences in filtering and faceting make pluggable memory nontrivial across vendors.
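The point above can be sketched in code. This is a minimal, hypothetical illustration (the `MemoryStore` interface and both backends are invented for this example, not an API from the episode): a shared search interface hides each vendor's filter quirks, and each backend has to translate the generic filter dict into its own semantics.

```python
from abc import ABC, abstractmethod

class MemoryStore(ABC):
    """Hypothetical standard memory API; each backend translates generic filters."""

    @abstractmethod
    def search(self, query: str, filters: dict) -> list[str]: ...

class TagFilterStore(MemoryStore):
    """Sketch of a backend whose native filtering is tag/facet based
    (e.g. Redis-style '@topic:{memory}' expressions)."""

    def __init__(self, docs: list[tuple[str, dict]]):
        self.docs = docs  # (text, metadata) pairs

    def search(self, query: str, filters: dict) -> list[str]:
        # Translate the generic equality filters into tag matches.
        return [text for text, meta in self.docs
                if query in text
                and all(meta.get(k) == v for k, v in filters.items())]

class SqlFilterStore(MemoryStore):
    """Sketch of a backend that would instead build a WHERE clause."""

    def __init__(self, rows: list[dict]):
        self.rows = rows

    def search(self, query: str, filters: dict) -> list[str]:
        # A real backend would emit SQL like:
        #   WHERE topic = 'memory' AND text LIKE '%query%'
        # Here we just evaluate the same predicate in Python.
        return [row["text"] for row in self.rows
                if query in row["text"]
                and all(row.get(k) == v for k, v in filters.items())]
```

Because each backend owns its own filter translation, an agent can swap providers behind `MemoryStore` — but only for the filter shapes every backend supports, which is exactly why pluggable memory is nontrivial.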
Treat Long-Term Memory As Time-Aware
- Long-term memory includes both stable facts and time-bound episodic items, which requires time-aware retrieval.
- Agents must store timestamps and rank recent episodic facts higher when constructing context.
Store Metadata And Schemas For Retrieval
- Store structured metadata like timestamps and source alongside text to enable ranked retrieval.
- Use schemas and fields so queries can boost recency, relevance, or other facets dynamically.
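The two bullets above can be combined into a small sketch: a record schema that keeps timestamp and source alongside the text, plus a scoring function whose weights can be shifted at query time to boost recency or relevance. The schema fields and weight defaults here are illustrative assumptions, not a documented Redis structure:

```python
from dataclasses import dataclass

@dataclass
class MemoryRecord:
    """Structured metadata stored alongside the raw text."""
    text: str
    source: str      # e.g. which conversation or tool produced this memory
    ts: float        # epoch seconds, enables recency boosting
    relevance: float # semantic score filled in at query time

def boosted_score(rec: MemoryRecord, now: float,
                  w_relevance: float = 0.7, w_recency: float = 0.3,
                  half_life_s: float = 86_400.0) -> float:
    """Weighted blend of relevance and recency; callers adjust the
    weights per query to boost whichever facet matters right now."""
    recency = 0.5 ** (max(now - rec.ts, 0.0) / half_life_s)
    return w_relevance * rec.relevance + w_recency * recency
```

A query that cares mostly about freshness might pass `w_relevance=0.2, w_recency=0.8`, while a lookup of stable facts would invert the weights — the same stored records serve both, because the metadata was captured at write time.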

