
The Newest Artificial Intelligence Stock Has Arrived -- and It Claims to Make Chips That Are 20x Faster Than Nvidia

The artificial intelligence chipmaker Nvidia (NASDAQ: NVDA) has amassed a market cap of close to $3.2 trillion, making it one of the world's largest chipmakers. It now makes up more than 6% of the broader benchmark S&P 500 index. Over the last five years, Nvidia has grown annual revenue by 458%, and the stock is up an incredible 2,009%. Given the potential for AI to disrupt life as we know it, it's understandable that investors are so excited about the stock.

But the lure of these kinds of gains is naturally going to attract competition. Now, one of Nvidia's competitors is planning an initial public offering (IPO) and claiming to manufacture chips that can vastly outperform Nvidia at a fraction of the price. Let's take a look.

20x better than Nvidia?

Last week, the AI chipmaker Cerebras filed its registration statement with the Securities and Exchange Commission (SEC) with the intent to go public. In a 2021 press release, Cerebras said it had reached a $4 billion valuation after a $250 million Series F financing round. The company is now targeting a $1 billion IPO at a $7 billion to $8 billion valuation.

In its registration statement, Cerebras cites Nvidia as a competitor, as well as other large AI companies such as Advanced Micro Devices, Intel, Microsoft, and Alphabet. Here is a description of what Cerebras does:

We design processors for AI training and inference. We build AI systems to power, cool, and feed the processors data. We develop software to link these systems together into industry-leading supercomputers that are simple to use, even for the most complicated AI work, using familiar ML frameworks like PyTorch. Customers use our supercomputers to train industry-leading models. We use these supercomputers to run inference at speeds unobtainable on alternative commercial technologies.

Cerebras' pitch is that bigger is better. The company has designed a chip the size of a full silicon wafer, the largest chip ever sold, and it believes that size advantage cuts down on the time spent moving data between chips. Cerebras also has a flexible business model: clients can buy its hardware to run at their own facilities, or pay for consumption-based access through the company's cloud infrastructure.

Cerebras clearly wants investors to compare, or at least associate, the company with Nvidia. Nvidia is mentioned 12 times in the registration statement. Cerebras also provides a side-by-side comparison of its Wafer-Scale Engine-3 chip versus Nvidia's H100 graphics processing unit (GPU), which is considered the most powerful GPU on the market.

Cerebras vs. Nvidia comparison chart. Image source: Cerebras registration statement.

Cerebras CEO Andrew Feldman has publicly said the company's inference offering is 20 times faster than Nvidia's at a fraction of the price. In 2023, Cerebras generated about $78.7 million of revenue, up 220% year over year, and it brought in $136.4 million of revenue through the first half of 2024. The company still hasn't earned a profit, reporting a nearly $67 million loss through the first half of 2024. These numbers also pale in comparison to Nvidia, which recently reported second-quarter revenue of $30 billion and a profit of roughly $16.6 billion.

Will Cerebras make a splash?

With big publicity from news publications and claims of being 20 times faster than Nvidia, I think it's safe to say that Cerebras already has and will continue to make a splash.

Depending on the excitement investment bankers can drum up during the company's road show and market conditions, I wouldn't be surprised to see Cerebras go public at a higher valuation than expected. AI has been all the buzz and the IPO market has been flat for a few years now, so there could be pent-up demand on Wall Street.

Will Cerebras overtake Nvidia? Only time will tell. Its product offerings are impressive, but it still has a ways to go to get its financial profile in line with Nvidia's. Furthermore, there may be some advantages to Nvidia's smaller chips, and it remains to be seen whether Cerebras can compete with Nvidia's CUDA software platform -- although the company does say that its software "eliminates the need for low-level programming in CUDA."

While everything sounds great, there is likely still a "show me" component to this story. After all, the bulk of Cerebras' revenue comes from one customer. Nvidia also has a leading market share in the AI chip space and relationships with many large clients. Who's to say Nvidia couldn't use its size -- and likely resource -- advantage to develop a similar large wafer chip? There's a lot left to play out, but this could be one of the more interesting developments for market watchers to pay attention to.


Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Bram Berkowitz has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Microsoft, and Nvidia. The Motley Fool recommends Intel and recommends the following options: long January 2026 $395 calls on Microsoft, short January 2026 $405 calls on Microsoft, and short November 2024 $24 calls on Intel. The Motley Fool has a disclosure policy.

The Newest Artificial Intelligence Stock Has Arrived -- and It Claims to Make Chips That Are 20x Faster Than Nvidia was originally published by The Motley Fool
