
This Day in AI Podcast EP60: Rabbit r1 Launch Party, LAMs, Microsoft's Phi-3, Hume AI EVI API, Llama3 Updates & Groq Speed
Apr 24, 2024
Explore the launch of Rabbit r1 & the potential of LAMs. Discover Microsoft's Phi-3 innovation & its use cases. Dive into the world of Llama3 on Groq & updates on GPT-4. Uncover Hume AI's EVI API & Meta's integration with Llama 3.
AI Snips
LAMs Are A Practical Middle Ground
- Large action models (LAMs) can automate repetitive browser tasks and increase productivity across industries.
- They sit between simple macros and full autonomous agents, unlocking immediate workplace gains.
Run Phi-3 Mini Locally For Speed And Privacy
- Try Phi-3 mini for fast local inference and easy fine-tuning on commodity hardware.
- Use it for classification, automation, and embedding an LLM into device apps to avoid constant API costs.
Chris Ran Phi-3 Locally And Fine-Tuned It Fast
- Chris ran Phi-3 locally on an M3 Mac and praised its speed and immediate usefulness.
- He fine-tuned it on a large email dataset quickly and highlighted rapid iteration times.
