
The computing power behind AI
The Tech Brief
The Current Paradigm of Research and Development in Large Language Models
There are already signs that the current paradigm of research and development in large language models may be starting to reach a plateau. Model complexity grows super-linearly, so you have to use significantly more computing power than before. At some point, the marginal benefit of adding additional parallel GPUs is entirely countered by the marginal cost of the additional interconnection overhead. And if that's so, then it is possible for countries that are currently not leading in AI development, such as countries in Europe, to catch up, since the pace of change slows down.


