
Nvidia dropped a surprise announcement just before Christmas: a $20 billion deal to license technology from artificial intelligence chip startup Groq and bring in most of its team, including co-founder and CEO Jonathan Ross. The move signals that Nvidia is no longer assuming its GPUs will be the only useful chips in the next big phase of AI deployment: running trained AI models at scale to do everything from answering queries to generating code to analyzing images, a process called inference.
The Groq deal strengthens the position of other startups building their own AI chips, including Cerebras, D-Matrix and SambaNova, as well as newer players such as British chip startup Fractile, which Intel has reportedly signed a term sheet to acquire. It has also boosted artificial intelligence inference software platform startups such as Etched, Fireworks and Baseten, bolstering their valuations and making them more attractive acquisition targets in 2026, analysts, founders and investors said.
Karl Freund, founder and principal analyst at Cambrian-AI Research, noted that Microsoft-backed D-Matrix raised $275 million last month at a $2 billion valuation. Like Groq, D-Matrix trades away some of the flexibility of Nvidia GPUs in exchange for greater speed and efficiency when running AI models. “I’m sure D-Matrix is a very happy startup right now,” Freund said. “I suspect their next round will come at a much higher valuation.”
Cerebras, another inference-focused chip company, also appears well positioned. Known for its dinner-plate-sized “wafer-scale” chips designed to run very large models on a single piece of silicon, the company has filed for an IPO after earlier delays. It is also increasingly viewed as a potential takeover target, Freund said. “You don’t want to wait until after the IPO, when it’s more expensive,” he said. “From that perspective, Cerebras is sitting pretty right now.”
Nvidia-Groq deal clarifies market direction
Executives at these companies said Nvidia’s move would help clarify the market’s direction. “When [the Nvidia-Groq deal] happened, we said, ‘The market finally recognizes this,’” D-Matrix CEO Sid Sheth told Fortune. “I think what Nvidia is really doing is they’re saying, okay, this approach is a successful one.”
Cerebras CEO Andrew Feldman posted on X that people once believed Nvidia GPUs were all they needed for artificial intelligence, a notion that acted as a moat preventing AI chip startups from eating into Nvidia’s market share. That moat is now gone with the Groq deal, Feldman wrote. “It reflects a growing industry reality: the inference market is fragmenting, a new category has emerged, and speed is no longer a feature but the entire value proposition. This value proposition can only be achieved through a different chip architecture than GPUs.”
Still, not everyone is convinced every inference chip startup will benefit equally. Matt Murphy, a partner at Menlo Ventures, said chips remain a difficult business for venture investors given the high capital requirements and long development cycles. “A lot of venture capital firms stopped investing in chips 10 or 15 years ago,” Murphy said. “It’s capital intensive; it takes years to launch a product; and the results are difficult to predict.”
Even so, he pointed to Fireworks, an artificial intelligence inference platform that raised $250 million in October at a $4 billion valuation, as a startup with a technological edge thanks to a founding team made up of the engineers who created PyTorch. But he added that it was unclear to what extent the current enthusiasm reflects real technological differences. “It’s hard to say who really has something important when all boats are rising, and that seems to be happening,” he said, adding that consolidation across the industry now looks increasingly likely.
New entrants seek real disruption
But at least one veteran in the field of artificial intelligence hardware believes that even today’s inference-focused startups are not truly disruptive.
Naveen Rao, founder of MosaicML and former senior vice president of AI at Databricks, recently left Databricks to found Unconventional AI. The company confirmed last month that it had raised $475 million in seed funding led by Andreessen Horowitz and Lightspeed Ventures. His criticism: companies like Groq, D-Matrix and Cerebras may be well positioned in today’s market, but they are still optimizing within the same digital computing paradigm.
After Nvidia’s Groq deal confirmed the need for faster, more efficient inference, startups that fit perfectly into today’s AI stack suddenly look more valuable — not because they reinvented computing, Rao believes, but because they work within it. Unconventional AI is pursuing a more radical path: building new hardware that exploits the physical behavior of silicon itself, and redesigning neural networks to match it.
“We’ve been building the same basic digital machines for 80 years,” he said. “But there has never been one workload that accounted for more than 2 percent of all computing cycles.” That is changing, he explained: within a few years, 95 percent of computing will be for artificial intelligence.
From this perspective, he said, it is important to build a machine that is completely different from today’s. However, Rao said the effort could take five or more years to bear fruit and is not designed to capitalize on the current inference boom.

