
What's New In Data
From Data Pipelines to Agentic Applications: Deploying LLM Apps That Actually Work
May 1, 2025

Spencer Cook, Lead Solutions Architect at Databricks, helps financial services firms apply cloud solutions to real-world challenges. He discusses the journey from data hype to practical AI, focusing on real-time data pipelines and vector search. Spencer explains how to minimize AI hallucinations while maximizing business value through clean data governance. The conversation highlights innovations like Retrieval-Augmented Generation (RAG) for improving AI accuracy, and the critical role of data management in effective LLM deployment and customer experience.
AI Snips
Avoid LLM Hallucinations
- Certify AI answers with human validation to catch hallucinations before they reach users.
- Track and augment training data using LLMOps techniques to improve accuracy across domains.
Build Reliable AI Data Pipelines
- Build reliable, low-latency data layers like feature stores and vector stores to power AI applications.
- Apply the same established data engineering practices to document and multimedia data as to structured data.
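The vector-store layer mentioned above can be sketched as a toy in-memory similarity search. This is an illustrative sketch only: the documents, embeddings, and function names are made-up placeholders, not anything from the episode, and a production system would use an embedding model and a managed vector store rather than hand-written vectors.

```python
import numpy as np

# Toy in-memory "vector store": maps document text to a made-up embedding.
# In a real pipeline, embeddings would come from an embedding model and
# live in a dedicated vector store serving low-latency lookups.
docs = {
    "refund policy": np.array([0.9, 0.1, 0.0]),
    "loan application steps": np.array([0.1, 0.8, 0.3]),
    "branch opening hours": np.array([0.0, 0.2, 0.9]),
}

def top_match(query_vec: np.ndarray) -> str:
    """Return the document whose embedding is most similar to the query
    vector, by cosine similarity."""
    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(docs, key=lambda d: cosine(query_vec, docs[d]))

# A query embedding close to "loan application steps".
print(top_match(np.array([0.2, 0.9, 0.2])))  # → loan application steps
```

In a RAG application, the retrieved document would then be passed to the LLM as grounding context, which is the mechanism for reducing hallucinations discussed in the episode.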
Agents Unify BI and AI
- Agents combine data from documents and analytical reporting to answer complex queries.
- Once-separate BI and AI systems are merging, creating richer AI application ecosystems.
