
Elixir Mentor Thomas Millar on Machine Learning & Elixir
Feb 5, 2024
Thomas Millar, a software engineer specializing in machine learning and data engineering and creator of instructor_ex, talks about bridging Elixir with AI. He covers structured prompting and mapping LLM outputs to Ecto schemas. The conversation explores running models in Elixir, streaming partial JSON hydration, and practical projects for getting started.
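The structured-prompting approach discussed here maps an LLM's JSON output onto an Ecto embedded schema. A minimal sketch in the style of the instructor_ex README (the model name, field definitions, and input text are illustrative):

```elixir
defmodule SpamPrediction do
  use Ecto.Schema
  use Instructor

  # The schema doubles as the response "type signature" sent to the LLM.
  @llm_doc "Classify whether the input text is spam."
  @primary_key false
  embedded_schema do
    field(:class, Ecto.Enum, values: [:spam, :not_spam])
    field(:score, :float)
  end
end

# Instructor validates and casts the model's JSON reply into the struct.
{:ok, %SpamPrediction{} = prediction} =
  Instructor.chat_completion(
    model: "gpt-4o-mini",
    response_model: SpamPrediction,
    messages: [
      %{role: "user", content: "Classify: 'You won a free cruise, click here!'"}
    ]
  )
```

The streaming "partial JSON hydration" mentioned above is exposed (per the library's docs) by passing `stream: true` with `response_model: {:partial, SpamPrediction}`, which yields progressively filled-in structs as tokens arrive.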
AI Snips
Elixir Lets You Run And Stream Models Efficiently
- Running models inside the BEAM (via Bumblebee/Nx) can cut marginal per-token costs and simplify operations compared with paying per token for an external API.
- Elixir's concurrency and streaming primitives let you wire Whisper, LLMs, and agents together for live, composable pipelines.
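As a sketch of the in-BEAM approach, Bumblebee can load Whisper from Hugging Face and serve it through Nx.Serving, which batches requests from concurrent caller processes (the model name, compiler option, and file path below are illustrative):

```elixir
# Load Whisper weights and supporting artifacts from Hugging Face.
{:ok, model_info} = Bumblebee.load_model({:hf, "openai/whisper-tiny"})
{:ok, featurizer} = Bumblebee.load_featurizer({:hf, "openai/whisper-tiny"})
{:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, "openai/whisper-tiny"})
{:ok, generation_config} = Bumblebee.load_generation_config({:hf, "openai/whisper-tiny"})

# Build a serving that runs transcription, batching concurrent requests.
serving =
  Bumblebee.Audio.speech_to_text_whisper(
    model_info,
    featurizer,
    tokenizer,
    generation_config,
    defn_options: [compiler: EXLA]
  )

# Each caller process gets its own result; the BEAM handles the concurrency.
Nx.Serving.run(serving, {:file, "clip.wav"})
```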
Learn Instructor By Studying Docs And Cookbooks
- Read the Instructor documentation and cookbooks first to learn practical patterns and livebook examples.
- Follow the Python Instructor docs, then adapt the analogous examples in instructor_ex to transfer ideas quickly.
Stable Low-Level API Keeps Instructor Future-Proof
- Instructor purposely stays low-level and avoids fully wrapping OpenAI so its API remains stable as models evolve.
- The library adds a single parameter for pre- and post-processing, letting it track provider changes without breaking users.
