Tokenized

AI Transformer Model Author Illia Polosukhin: Building the Intention Economy

Mar 19, 2026
Illia Polosukhin, co‑founder of NEAR Protocol and coauthor of the transformer paper, blends AI research and blockchain building. He discusses agentic commerce as intent flows, scaling payments and coordination with NEAR, and securing AI agents through IronClaw. Short takes cover protocol aggregation, agent custody challenges, and the evolution of secure OSes for AI.
INSIGHT

Transformers Enable Parallel Context Reasoning

  • Transformers process entire contexts in parallel instead of reading word-by-word, enabling massive parallelism and fast answers.
  • Illia explained that the idea grew out of scaling search at Google and led to the "Attention Is All You Need" paper, whose architecture powers modern LLMs like the GPT models.
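The parallel-context point above can be sketched in code. This is a minimal NumPy illustration of single-head scaled dot-product attention (not from the episode); the function name and toy dimensions are illustrative. The key property is that every token's context-aware representation is computed in a few batched matrix multiplies, with no sequential word-by-word scan.

```python
import numpy as np

def scaled_dot_product_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over a whole sequence at once.

    Every position attends to every other position via batched matrix
    multiplies -- the parallelism the transformer architecture enables,
    in contrast to recurrent models that read one token at a time.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # project all positions in parallel
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)             # all pairwise similarities at once
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                          # weighted mix over all positions

# Toy example: a 4-token sequence with model and head dimension 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = scaled_dot_product_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one context-aware vector per token, computed together
```

Because the whole sequence is handled as one matrix operation, the work maps naturally onto GPUs, which is what made training large models on huge corpora practical.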
ANECDOTE

Near Started As An AI Coding Project

  • Near began as Near.AI, a project to teach machines to write code, with prototypes such as generating mobile UIs from spoken or sketched input.
  • The operational pain of paying student annotators around the world pushed the team toward crypto and, eventually, toward building NEAR Protocol.
ADVICE

Use Crypto For Global Micropayments

  • Use crypto for low-value, high-overhead global payments because traditional rails and wires are too costly and slow.
  • Illia cited paying students roughly 15 cents per task, where wire fees or Stripe minimums made fiat rails impractical.