The Aha moment, part 9

Over the years, the ability of graphics processors to multiply matrices continued to grow at an exponential rate. This allowed game designers to create ever more realistic scenes, with smoother animation, higher polygon counts, and more convincing textures and lighting effects.

It would already have been an interesting story if that were all that happened. But meanwhile, something else was going on. There was another application for all of those matrix operations, one that would become even more important than graphics.

Because training the Convolutional Neural Nets invented by pioneers such as Yann LeCun, and the successive generations of A.I. models that followed, required matrix math: lots and lots of matrix math. So all of that Nvidia hardware ended up finding a new purpose.
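To see why, it helps to remember that the core computation in a neural network layer boils down to a matrix multiplication. Here is a minimal sketch in Python with NumPy; the layer sizes and names are illustrative, not anything from a particular model:

```python
import numpy as np

# One fully connected layer: outputs = activation(inputs @ weights + bias).
# Training repeats this (and its gradients, which are also matrix products)
# millions of times, which is exactly the workload GPUs were built to accelerate.
batch_size, in_features, out_features = 64, 1024, 512

inputs = np.random.randn(batch_size, in_features)
weights = np.random.randn(in_features, out_features)
bias = np.zeros(out_features)

# The heavy lifting is this single matrix multiply.
outputs = np.maximum(0, inputs @ weights + bias)  # ReLU activation
print(outputs.shape)  # (64, 512)
```

Stack dozens of layers like this and run them over enormous datasets, and the demand for fast matrix multiplication becomes essentially unbounded.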

You have only to look at the history of Nvidia's stock price to see what happened next. A little less than a decade ago, A.I. began to undergo a revolution, and all of those graphics processors turned out to be the perfect devices for training ever more complex A.I. models.

And that leads us to where we are today. But the story doesn’t end there. More tomorrow.
