
Latent Space: The AI Engineer Podcast Emergency Pod: OpenAI's new Functions API, 75% Price Drop, 4x Context Length (w/ Alex Volkov, Simon Willison, Riley Goodside, Joshua Lochner, Stefania Druga, Eric Elliott, Mayo Oshin et al)
Jun 14, 2023

In this engaging discussion, AI expert Alex Volkov, prompt injection specialist Simon Willison, and software engineer Riley Goodside dissect OpenAI's transformative Functions API. They dive into the significant 75% price drop and the increase in context length, exploring the implications for developers. Eric Elliott shares prompting techniques to enhance accuracy, while the panel addresses security concerns related to prompt injection. The conversation is rich with insights on coding efficiency, the future of AI tools, and the evolution of user interactions with these technologies.
AI Snips
Client-Side vs. OpenAI Embedding
- Client-side embedding offers advantages for sensitive data and latency but lacks the power of OpenAI's large models.
- OpenAI's cheaper embeddings accelerate the vector database space, while client-side remains relevant for smaller projects.
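The core client-side operation in that tradeoff is computing similarity between embedding vectors locally, with no API call. A minimal sketch in pure Python (the vectors here are hypothetical placeholders, not real model output; real models emit hundreds or thousands of dimensions):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (||a|| * ||b||)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 4-dimensional embeddings for a query and a document.
query_vec = [0.1, 0.3, 0.5, 0.2]
doc_vec = [0.2, 0.1, 0.4, 0.3]

print(round(cosine_similarity(query_vec, doc_vec), 3))  # → 0.906
```

Whether the vectors come from a small in-browser model or from OpenAI's embeddings endpoint, this ranking step is the same; the choice the snip describes is only about where the vectors are produced.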
Function Selection and Control
- OpenAI's function calling allows specifying functions and types, enabling the model to choose or be forced to use specific functions.
- This raises questions about how many functions can be supplied, how conflicts between overlapping functions are resolved, and how to benchmark function selection.
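As a sketch of what that API surface looks like, a function is described with a JSON Schema and the `function_call` field either lets the model choose (`"auto"`) or forces a specific function. The schema shape follows OpenAI's June 2023 chat-completions format; `get_weather` is a made-up example function:

```python
import json

# A function definition in the JSON Schema format the
# function-calling API expects (June 2023 chat-completions style).
get_weather = {
    "name": "get_weather",  # hypothetical example function
    "description": "Get the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["city"],
    },
}

request = {
    "model": "gpt-3.5-turbo-0613",
    "messages": [{"role": "user", "content": "What's the weather in Lagos?"}],
    "functions": [get_weather],
    # "auto" lets the model pick; naming a function forces that call.
    "function_call": {"name": "get_weather"},
}

print(json.dumps(request, indent=2))
```

The response then contains a `function_call` message with JSON arguments for the host application to execute, rather than plain text.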
Designing Effective Functions
- Design functions with broad capabilities based on languages like SQL or JavaScript.
- This lets the model perform complex tasks with minimal prompting instructions.
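One way to read that advice: instead of defining many narrow functions, expose a single broad one whose argument is a full query in an expressive language. A sketch using SQL (the `execute_sql` name, schema, and demo database are illustrative, not from the episode):

```python
import sqlite3

# A single broad function: the model writes SQL, the host executes it.
execute_sql = {
    "name": "execute_sql",  # illustrative name
    "description": "Run a read-only SQL query against the analytics DB.",
    "parameters": {
        "type": "object",
        "properties": {
            "query": {"type": "string",
                      "description": "A SQLite SELECT statement"},
        },
        "required": ["query"],
    },
}

def run_query(query: str):
    # Host-side handler for the function call, with an in-memory demo DB.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")
    conn.executemany("INSERT INTO users VALUES (?, ?)",
                     [("ada", 36), ("grace", 45)])
    return conn.execute(query).fetchall()

# As if the model's function_call had returned this query argument:
print(run_query("SELECT name FROM users WHERE age > 40"))  # → [('grace',)]
```

One broad function like this can cover queries that would otherwise need a separate narrow function each; the flip side is that executing model-written SQL is exactly where the prompt-injection concerns raised in the episode apply, so the handler should enforce read-only access.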


