d-Matrix specializes in developing computing platforms tailored for GenAI inferencing workloads in data centers. It aims to overcome the memory limitations prevalent in AI computing by advancing high-efficiency digital in-memory compute (DIMC) chips.
Its flagship product, Corsair, scheduled for commercial release in Q2 2025, is an in-memory compute engine engineered to hold an entire AI model in memory, giving developers flexibility in how they deploy models. The platform offers 2,400 TFLOPs of eight-bit peak compute, 2 GB of integrated performance memory, up to 256 GB of off-chip capacity memory, and ultra-high memory bandwidth of 150 TB/s.
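As a rough illustration of what those capacity figures imply, the sketch below estimates which model sizes could fit entirely within the stated 256 GB of capacity memory when weights are quantized to eight bits. The model sizes and the one-byte-per-parameter assumption are illustrative only and are not drawn from d-Matrix's own sizing methodology; the calculation ignores activations, KV cache, and runtime overhead.

```python
# Back-of-the-envelope fit check using the capacity figures quoted above.
# Assumption: 8-bit weights (1 byte per parameter); overheads are ignored.

CAPACITY_MEMORY_GB = 256   # off-chip capacity memory (per the spec above)
PERFORMANCE_MEMORY_GB = 2  # integrated on-chip performance memory

def weights_gb(params_billions: float, bytes_per_param: float = 1.0) -> float:
    """Approximate weight footprint in GB at the given precision."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# Hypothetical model sizes chosen for illustration.
for params_b in (7, 13, 70, 180):
    footprint = weights_gb(params_b)
    verdict = "fits in" if footprint <= CAPACITY_MEMORY_GB else "exceeds"
    print(f"{params_b}B params ~ {footprint:.0f} GB of 8-bit weights -> "
          f"{verdict} {CAPACITY_MEMORY_GB} GB capacity memory")
```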