
Revolutionary 3D-layered memory technology aims to unseat HBM for AI processing, with d-Matrix asserting that 3DIMC offers speeds 10 times faster and efficiency 10 times improved.

Santa Clara-based tech firm d-Matrix aims to supersede HBM for AI inference with 3DIMC (3D digital in-memory compute), a memory technology it says will deliver a 10-fold improvement in both speed and efficiency over HBM in AI inference tasks.

In the rapidly evolving world of artificial intelligence (AI) and high-performance computing, memory systems play a crucial role. One such innovation is d-Matrix's 3DIMC technology, which is poised to challenge the dominance of High-Bandwidth Memory (HBM).

d-Matrix CEO Sid Sheth believes that the future of AI inference depends on rethinking not just compute, but memory itself. The company's 3DIMC technology, designed for AI inference, embodies this belief: its DIMC logic dies are specifically tuned for matrix-vector multiplication, the calculation that dominates inference in transformer-based AI models.
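To make the claim concrete, here is a toy sketch of the operation in question. During token-by-token transformer decoding, each layer multiplies a large weight matrix by a single activation vector; the weight traffic dominates memory bandwidth, which is why computing inside the memory dies can help. The dimensions and values below are illustrative, not d-Matrix's.

```python
def matvec(W, x):
    """Compute y = W @ x for a list-of-rows matrix W and a vector x.

    In a transformer decode step, W would be a layer's weight matrix and
    x the current token's activation vector; hardware like DIMC aims to
    perform exactly this kind of multiply where the weights are stored.
    """
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

W = [[1, 2], [3, 4], [5, 6]]   # 3x2 toy weight matrix
x = [10, 1]                    # toy activation vector
print(matvec(W, x))            # [12, 34, 56]
```

In production this multiply runs over matrices with thousands of rows and columns per layer, so the bytes moved for `W` far exceed those for `x`, making memory bandwidth the bottleneck that 3DIMC targets.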

Unlike HBM, which stacks memory dies on top of each other to connect them more efficiently and reach higher memory bandwidth, d-Matrix's 3DIMC technology performs computations within the memory itself. This approach allows for a significant speed boost, with claims of being up to 10 times faster and up to 10 times more energy-efficient than HBM.

Moreover, d-Matrix's next-generation product, Raptor, is claimed to outperform HBM by 10 times in inference tasks while using 90% less power. That efficiency could make it an attractive option for cost-conscious AI buyers.
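It is worth spelling out what those two claims imply when combined. The arithmetic below is ours, not d-Matrix's: 10 times the throughput at one-tenth the power works out to roughly a 100-fold gain in performance per watt over an HBM baseline.

```python
# Back-of-envelope check of the combined Raptor claims (our arithmetic,
# based on the figures reported in the article, not on d-Matrix data).

hbm_throughput, hbm_power = 1.0, 1.0        # normalized HBM baseline
raptor_throughput = 10 * hbm_throughput     # claimed: 10x faster
raptor_power = hbm_power * (1 - 0.90)       # claimed: 90% less power

perf_per_watt_gain = (raptor_throughput / raptor_power) / (hbm_throughput / hbm_power)
print(round(perf_per_watt_gain))            # ~100x performance per watt
```

Performance per watt, rather than raw speed, is the figure that matters most to the cost-conscious buyers the article describes, since power draw drives data-center operating costs.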

The HBM market, projected to grow by 30% annually until 2030, is expected to see rising price tags to match demand. An alternative like d-Matrix's 3DIMC technology could offer a more affordable solution for AI buyers.
As AI continues to revolutionise various industries, advancements in memory technology like d-Matrix's 3DIMC could play a significant role in driving this transformation. The race to develop faster, more efficient memory systems is undoubtedly heating up, and d-Matrix is making a strong case for its place in this competitive landscape.