
Daniel Guetta on the Guts of AI, Agentic AI & Why LLMs Hallucinate | The Real Eisman Playbook Ep 46


LLM internals: embeddings and transformer attention

Daniel walks through an intuitive example of word embeddings, explaining how training positions words in vector space and how the attention mechanism handles context.
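The two ideas in this segment can be sketched in a few lines of NumPy. This is a minimal illustration under invented toy data, not the example from the episode: the three word vectors are made up, and the `attention` function is plain scaled dot-product attention with self-attention over the stacked embeddings.

```python
import numpy as np

# Toy "embeddings": each word is a small dense vector. After training,
# words used in similar contexts end up near each other in this space.
# These three vectors are invented purely for illustration.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(a, b):
    """Similarity of two embeddings, ignoring their lengths."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def attention(Q, K, V):
    """Scaled dot-product attention: each query mixes the value vectors,
    weighted by how similar its query is to each key."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax per row
    return weights @ V, weights

# Self-attention over the three toy word vectors: "king" attends more
# to the nearby "queen" than to the distant "apple".
X = np.stack([emb["king"], emb["queen"], emb["apple"]])
out, w = attention(X, X, X)
```

Here `cosine` captures the embedding intuition (related words score near 1, unrelated words near 0), while the attention weights show how each position blends context from the others.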

Play episode from 16:21
