Amazon is releasing an impressive new AI chip and teasing an Nvidia-friendly roadmap


Amazon Web Services, which has been building its own AI training chips for several years, has just introduced a new version, known as Trainium3, that comes with some impressive specs.

The cloud provider, which made the announcement Tuesday at AWS re:Invent 2025, also teased the next product in its line of AI training chips: Trainium4, which is already in the works and will be able to work alongside Nvidia chips.

AWS used its annual technology conference to officially launch the Trainium3 UltraServer, a system built from the company’s state-of-the-art third-generation Trainium3 chips and its in-house networking technology. As you might expect, the third-generation chips and systems offer a big boost in AI training and inference performance over the second generation, according to AWS.

AWS says the system is 4x faster, with 4x more memory, not just for training but for serving AI applications at peak demand. In addition, thousands of UltraServers can be linked together to give applications access to up to 1 million Trainium chips, 10x the previous generation. Each UltraServer can host 144 chips, according to the company.
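Taking the company’s figures at face value, the cluster scale implies a very large fleet of UltraServers. A back-of-the-envelope sketch (the exact cluster topology is not public, so this is purely illustrative arithmetic from the two numbers AWS quoted):

```python
import math

# Figures from the AWS announcement:
chips_per_ultraserver = 144      # Trainium3 chips per UltraServer
max_cluster_chips = 1_000_000    # up to 1 million chips in one cluster

# Minimum number of UltraServers needed to reach the quoted cluster size
ultraservers_needed = math.ceil(max_cluster_chips / chips_per_ultraserver)
print(ultraservers_needed)  # 6945
```

In other words, hitting the full 1-million-chip figure would take roughly 7,000 linked UltraServers.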

Perhaps more importantly, AWS says its chips and systems are also 40% more energy efficient than the previous generation. As the worldwide race to build ever-bigger data centers consumes astronomical gigawatts of electricity, the data center giant has tried to create a system that drinks less power, not more.

That is, of course, in AWS’s direct interest. But in classic cost-conscious Amazon fashion, the company promises the system also saves its AI customers money.

AWS customers including Anthropic (which counts Amazon among its investors), Japanese LLM maker Karakuri, Splash Music, and Decart have already used the third-generation chip and system, Amazon says.


AWS also previewed a few details of its next chip, Trainium4, which is already in development. AWS promises the chip will deliver another step up in performance and support NVLink, Nvidia’s chip interconnect technology.

This means that Trainium4-powered systems will be able to interoperate with Nvidia GPUs while using Amazon’s lower-cost server rack technology.

It’s worth noting that Nvidia’s CUDA (Compute Unified Device Architecture) has become the de facto standard that much AI software is built on. Interoperable Trainium-powered systems may make it easier for Amazon to woo big AI apps built on Nvidia GPUs into its data centers.

Amazon has not announced a timeline for Trainium4. If the company follows its previous rollout cadence, we should hear more about the chip at next year’s conference.

Follow along with all of TechCrunch’s coverage from the annual enterprise tech event here.

https://www.youtube.com/watch?v=ne-3tfhvf9c

Check out the latest news on everything from agentic AI and cloud computing to security and more.



