Tesla is eyeing HBM4 chips from Samsung and SK Hynix to power its Dojo supercomputer

In a nutshell: Tesla finds itself at the center of a heated competition between South Korean semiconductor giants Samsung and SK Hynix. The electric vehicle manufacturer has reportedly reached out to both companies, seeking samples of their upcoming HBM4 memory chips.

Tesla wants to integrate the next-gen high-bandwidth memory into Dojo, its custom-built supercomputer designed to train the company's "Full Self-Driving" neural networks. Industry insiders suggest Tesla could deploy the upgraded memory not just in Dojo, but also in its data centers and future self-driving vehicles.

Currently, the Dojo system utilizes older HBM2e chips to train the complex AI models underlying Tesla's Full Self-Driving capabilities. But, as a TrendForce report citing the Maeil Business Newspaper highlights, the company wants to take advantage of the performance boosts promised by HBM4.

For those unfamiliar, high-bandwidth memory like HBM4 is a specialized type of RAM designed to deliver tremendous data throughput while operating with greater energy efficiency than conventional memory. These qualities make it ideal for the processing demands of cutting-edge AI workloads.

SK Hynix claims its chip will deliver 1.4 times the bandwidth of the previous HBM3e generation while consuming 30% less power. If accurate, that works out to a bandwidth exceeding 1.65 terabytes per second per stack.
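As a rough sanity check of that figure, the arithmetic looks like this, assuming a baseline of roughly 1.18 TB/s per stack for SK Hynix's current HBM3e (a figure not stated in the report itself):

```python
# Rough sanity check of SK Hynix's claimed HBM4 bandwidth uplift.
# Assumption: ~1.18 TB/s per stack for current-generation HBM3e.
hbm3e_bandwidth_tbps = 1.18      # TB/s per stack, assumed HBM3e baseline
claimed_uplift = 1.4             # SK Hynix's claimed generational improvement

hbm4_bandwidth_tbps = hbm3e_bandwidth_tbps * claimed_uplift
print(f"Estimated HBM4 bandwidth: ~{hbm4_bandwidth_tbps:.2f} TB/s per stack")
# -> ~1.65 TB/s, in line with the figure cited above
```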

Another anticipated HBM4 innovation is an integrated logic die acting as a controller underneath the memory stack. This change could unlock further speed and power optimizations ideal for AI data processing.

SK Hynix and Samsung find themselves in a fierce battle to stake their claim in the HBM market, which is projected to swell to $33 billion by 2027. The two rivals are reportedly working hard on HBM4 prototypes specifically for evaluation by Tesla and other major US tech titans, including Microsoft, Meta, and Google.

Currently, SK Hynix leads this race by supplying chips to Nvidia, and it aims to kick off HBM4 production in late 2025. The company has also managed to pull ahead by being first to launch 321-layer TLC NAND flash memory. Samsung looks determined to gain ground, though; it has partnered with TSMC to produce key components leveraging its advanced 4nm process node.
