
Last Week in AI #189 - Chat.com, FrontierMath, Relaxed Transformers, Trump & AI
Nov 17, 2024 The podcast dives into the latest developments in AI, including OpenAI's new predicted outputs feature, which reduces latency by letting GPT-4o reuse predicted output text. It covers OpenAI's acquisition of Chat.com and X's plan to offer a free tier of its Grok AI chatbot. The discussion also highlights Saudi Arabia's ambitious $100 billion AI initiative and the potential implications of a Trump administration for AI policy. Finally, it examines techniques like Relaxed Recursive Transformers for shrinking language models, along with cybersecurity applications of AI.
AI Snips
FrontierMath Challenge
- FrontierMath, a new benchmark of research-level math problems, challenges even advanced LLMs.
- Leading models such as GPT-4o and Gemini 1.5 Pro solve under 2% of its problems, indicating the benchmark's difficulty.
Hunyuan-Large vs. LLaMA
- Tencent's Hunyuan-Large, an MoE model with 52B active parameters, challenges LLaMA's performance.
- Its surprisingly strong results on a relatively small compute budget raise questions.
Optimizing Small Models
- Relaxed Recursive Transformers improve small-model performance by reusing (sharing) a block of layers across depth, with LoRA adapters relaxing the strict weight tying.
- The technique cuts parameter memory, making it valuable for hardware with limited memory.
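The core idea can be sketched in a few lines: one shared layer weight is applied repeatedly, and each recursion step gets its own small low-rank (LoRA) delta so the steps are no longer strictly identical. This is a minimal NumPy illustration with made-up shapes and names, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, loops = 8, 2, 3  # hidden size, LoRA rank, recursion depth (illustrative)

# One shared ("recursive") layer weight, reused at every loop.
W_shared = rng.normal(scale=0.1, size=(d, d))

# Per-loop low-rank LoRA factors relax the strict weight tying:
# the effective weight at loop i is W_shared + B[i] @ A[i].
A = [rng.normal(scale=0.1, size=(r, d)) for _ in range(loops)]
B = [np.zeros((d, r)) for _ in range(loops)]  # zero-init: start exactly tied

def forward(x):
    """Apply the shared layer `loops` times, each loop with its own LoRA delta."""
    for A_i, B_i in zip(A, B):
        x = np.tanh(x @ (W_shared + B_i @ A_i).T)
    return x

x = rng.normal(size=(1, d))
y = forward(x)

# Memory saving: one shared d*d matrix plus small per-loop deltas,
# instead of `loops` independent d*d matrices.
tied_params = W_shared.size + sum(a.size + b.size for a, b in zip(A, B))
full_params = loops * d * d
```

With zero-initialized `B`, the model is exactly a weight-tied recursive transformer; training the LoRA factors then lets each depth step specialize at a small parameter cost (here 160 parameters versus 192 for untied layers, a gap that grows quickly at realistic hidden sizes).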



