GPUs, CPUs And Serial Chips - Who's Going To Be Manufacturing, And Who Will Be Begging For Help?

A recent post on Reddit's mlscaling subreddit reads a bit like a spy novel as it looks at where the hardware for next-generation computing will come from, and how that hardware could enable artificial general intelligence.

The author starts out with what hyperscalers (the companies running the biggest cloud data centers) are likely to do - which is, apparently, keep investing in new hardware built for parallel processing.

However, there's also a good bit of speculation about how quickly things could change.

"Instead of getting much more parallel, training could get much less parallel," gwern posits. "It's worth noting that this is the reason so much scientific computing neglected GPUs for a long time and focused more on interconnect throughput and latency: actually, most important scientific problems are highly serial, and deep learning is rather exceptional here -- -which means it may regress to the mean at some point. There could be a new second-order SGD optimizer which cannot parallelize easily across many nodes but is so sample-efficient that it wins, or it eventually finds better optima that can't be found by regular first-order. There could be new architectures moving back towards RNN which don't have a 'parallel training mode' like transformers, and you inherently need to move activations/gradients around nodes a ton to implement BPTT. (My note: that's Backpropagation through time)"

A Different Hardware Model

As for the hardware that's going to power these systems, the same user suggests that "(The result) is going to look more like supercomputing than a mega-GPU datacenter."

What eventually comes up here is the option of having a single serial chip instead of GPUs and CPUs, and here's where gwern names names, dropping "Cerebras" in a context that might make an investor's ears perk up.

"An ungodly fast chip is exactly the premise (for Cerebras)," the author intones.

Going into the logistics of taking technologies to market, gwern suggests that competitors don't have a lot of options - they don't have time to launch rival startups and catch up with Cerebras, nor the means to purchase the company outright, since it's planning an IPO. Instead, there's the suggestion that movers and shakers should be negotiating with Cerebras for options on future hardware.

A New Supercomputer? Two Big Names Collaborate

If you're interested in other technology gossip, a commenter on the original thread points to reports that Microsoft and OpenAI are planning a $100 billion data center project called Stargate, which reports characterize as a "supercomputer" that will apparently be built in phases.

What do we know about this? For one thing, it's going to hog a lot of energy.

"Stargate is estimated to be one of the largest and most advanced data centers in the world, spanning several hundred acres of land and using up to 5 gigawatts of power, according to three people involved in the proposal who spoke with the outlet," writes Alyse Stanley at Tom's Guide, citing sourcing from The Information on the subject. "Of the series of installations the companies plan to build over the next six years, Stargate has the largest scope, and it's considered crucial for OpenAI to train and operate more advanced AI models than ChatGPT-4. Given its power needs, the companies have discussed using alternative power sources like nuclear energy."

All of this suggests that people are working behind the scenes on new hardware models they'll use to compete, tooth and nail. A while ago, we reported on the role of nuclear power and Bill Gates' TerraPower, a company pioneering smaller, safer nuclear facilities.

There's also a general consensus that Microsoft is likely to be a big player. Many of us also believe the corporate landscape of the tech sector is going to change as these new kinds of setups, unlike anything we've seen before, come online. In other words, being a blue-chip company doesn't mean you can't become a dinosaur fairly quickly if you don't stay close to the cutting edge of AI systems.

And as you can see from the above back-and-forth, hardware is going to be super-important. We reported on Cerebras's chips last week, and we're still watching how this shakes out for the company - and, presumably, for a lot of interested shareholders.

Then there are the models themselves: OpenAI and Perplexity both recently raised a lot of funding. Let's keep an eye on this, too. Investors are trying frantically to read all of the tea leaves, and in academia and elsewhere, others of us are hoping to shape the future of AI for the better.
