Discussion about this post

From friend Steve, who is more knowledgeable than me:

Hi Graham -

I just read your latest Substack post, including the comment about Nvidia. Since I'm currently working in AI, I thought I'd share a bit of insight that I'm sure you uniquely will appreciate, if you don't already have it.

The bottom line: Nvidia isn't just making boards for video games while also being heavily invested in AI. Those video game cards have themselves become the hottest item in Silicon Valley since the iPhone, because the cards themselves ARE the heart and soul of virtually all AI work today. The AI industry, and most every corporate org on the planet, cannot get enough of them.

The original video game boards were designed as number-crunchers to drive the dynamic graphics generation that was, and is, behind video games. Games have had a voracious appetite for faster, more detailed, more realistic visuals. Hence the ongoing investment in the cards, which have always been number crunchers, nothing more.

Somewhere along the way, someone realized those cards were also useful for the kind of number crunching behind most AI algorithms. AI algorithms use a lot of "tensors". Tensors are just multi-dimensional arrays of numbers. Even text-based AI systems "tokenize" text - i.e., convert it into numerical representations - before doing most anything else.
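To make "tensor" and "tokenize" concrete, here's a toy Python sketch. The tiny vocabulary and the sentence are made up purely for illustration; real tokenizers (BPE and friends) are far more sophisticated:

```python
# A "tensor" is just a multi-dimensional array of numbers.
scalar = 3.0                          # 0-D tensor
vector = [1.0, 2.0, 3.0]              # 1-D tensor
matrix = [[1.0, 2.0], [3.0, 4.0]]     # 2-D tensor

# "Tokenizing" text: converting it into numbers a model can crunch.
# Toy vocabulary, purely hypothetical.
vocab = {"the": 0, "cat": 1, "sat": 2}
tokens = [vocab[word] for word in "the cat sat".split()]
print(tokens)  # [0, 1, 2]
```

Once everything is numbers arranged in arrays like this, the work reduces to massive amounts of arithmetic on those arrays - exactly what the graphics cards were built for.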

Furthermore, AI algorithms start by performing a complex "training" process on the data they ingest. Training is the real work in building AI systems, as it involves potentially complex data analysis. The output of that effort is a knowledge base - a carefully filtered and structured set of data derived from the ingested input - and that stored knowledge base, combined with the algorithm, does the job of the trained agent. Without the knowledge base, the AI algorithm is nothing, just an untrained shell.

So that training process is the starting point, and a big deal. It often requires enormous compute power: a typical AI training run can take hours, weeks, literally months, even years. And that's on something like an Nvidia card. On a CPU, many of these jobs would take so long as to not be worth the effort.
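Here's a minimal sketch of what "training" means, boiled down to one weight. The data and learning rate are hypothetical toy values; real systems do this same adjust-and-repeat loop over billions of weights, expressed as huge matrix multiplications - which is exactly the workload GPUs parallelize and CPUs grind through slowly:

```python
# Training = repeatedly nudging weights to reduce prediction error.
# Toy task: learn y = 2x from a few example pairs (made-up data).
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0      # the single "weight" being trained
lr = 0.05    # learning rate (how big each nudge is)

for step in range(200):
    # Average gradient of squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # nudge w in the direction that reduces error

print(round(w, 3))  # converges toward 2.0
```

The trained value of `w` is the "knowledge base" in miniature: the algorithm alone is generic, but algorithm plus learned weights is a working agent.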

Thus the crazy spike in demand for Nvidia cards.

Graham Seibert:

My friend Chris has been a professional investor. He annotated this blog, correcting me on several points while generally supporting my thesis. I have uploaded his valuable comments here.

http://www.grahamseibert.com/blog/Chris%20comments%203-11-2024.pdf
