
Confronting AI’s Next Big Challenge: Inference Compute


The computing demands of training AI models may get most of the tech industry's attention — just ask NVIDIA's shareholders. But the demands of AI inference may leave today's cutting-edge GPUs in the dust.

“If you look at the world of pretraining, it has been kind of monolithic,” said Sid Sheth, founder and CEO of d-Matrix, in this episode of The New Stack Makers. “GPUs have dominated. Specifically, GPUs from one company have dominated the landscape. But as you enter the world of inference, it is not really a…
