ZeroPoint Technologies Unveils Groundbreaking AI-MX Memory Optimization Solution for AI and Datacenter Efficiency

ZeroPoint Technologies AB announced a breakthrough hardware-accelerated memory optimization product that enables nearly instantaneous compression and decompression of deployed foundational models, including leading large language models (LLMs).

The new product, AI-MX, will be delivered to initial customers and partners in the second half of 2025 and will enable enterprise and hyperscale datacenters to realize a 1.5 times increase in addressable memory, memory bandwidth, and tokens served per second for applications that rely on large foundational models.

"Foundational models are stretching the limits of even the most sophisticated datacenter infrastructures. Demand for memory capacity, power, and bandwidth continues to expand quarter-upon-quarter," said Klas Moreau, CEO of ZeroPoint Technologies. "With today's announcement, we introduce a first-of-its-kind memory optimization solution that has the potential to save companies billions of dollars per year related to building and operating large-scale datacenters for AI applications."

"Futurum Intelligence currently predicts the total AI software and tools market to reach a value of $440B by 2029 and Signal65 believes that ZeroPoint is positioned to address a key challenge within this fast-growing market with AI-MX," said Mitch Lewis, Performance Analyst at Signal65. "Signal65 believes that AI-MX is currently a unique offering and that with ongoing development and alignment with leading technology partners, there is strong growth opportunity for both ZeroPoint and AI-MX."