The Future of Search in the Era of Large Language Models // Saahil Jain // MLOps Podcast #150

The Challenges of Using GPUs to Train Models

I think it comes down ultimately to the user experience, so what can we do to really trace back from there? One thing we have noticed is that GPUs are quite expensive, so in general, when we can use CPUs, we do, and CPU utilization is important. If you're not careful, if you don't account for user queries that are really long, you might end up spiking CPU utilization on those really long queries, because your model ends up not batching them correctly or iterating through all of it. It's a great question, but at a high level I would say, "GPUs make a lot of sense."
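The point about long queries spiking CPU utilization can be made concrete with a length-aware batching sketch: truncate outlier queries to a fixed token budget and group similarly sized queries together, so one very long query cannot dominate the cost of a whole batch. All names and limits below are illustrative assumptions, not anything from the episode.

```python
# Hypothetical sketch of length-aware batching. Truncating outliers and
# sorting by length keeps per-batch CPU cost predictable; constants are
# illustrative, not from the episode.

from typing import List

MAX_TOKENS = 128  # assumed cap on query length
BATCH_SIZE = 8    # assumed queries per batch

def tokenize(query: str) -> List[str]:
    # Stand-in whitespace tokenizer; a real system would use a model tokenizer.
    return query.split()

def make_batches(queries: List[str]) -> List[List[List[str]]]:
    # 1) Truncate overly long queries to a fixed token budget.
    tokenized = [tokenize(q)[:MAX_TOKENS] for q in queries]
    # 2) Sort by length so each batch holds similarly sized queries,
    #    which keeps padding and per-batch work predictable.
    tokenized.sort(key=len)
    # 3) Chunk into fixed-size batches.
    return [tokenized[i:i + BATCH_SIZE]
            for i in range(0, len(tokenized), BATCH_SIZE)]

batches = make_batches(["short query", "a " * 500 + "very long query"])
longest = max(len(q) for batch in batches for q in batch)
print(longest)  # prints 128: the long query was truncated to the budget
```

Without the truncation step, the 500-token query would force the whole batch to be padded (or iterated) to its length, which is exactly the CPU spike described above.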

Transcript (from 28:26)
