Launched to commercialize analog in-memory computing technology developed at Princeton University and funded by the US Department of Defense, EnCharge AI offers high-performance edge AI processors. The company leverages in-memory computing in its chips to accelerate AI applications in servers and edge devices. Because its design uses capacitors rather than transistors, EnCharge's in-memory computing is less sensitive to voltage fluctuations and temperature spikes. Additionally, the company provides a scalable software stack that lets users connect their existing AI systems to its hardware and customize edge AI applications by operation, model, and resolution.
The company’s test chips and hardware (as of December 2022) deliver over 150 TOPS/watt of energy efficiency, outperforming competitors Axelera and Blaize, which offer 48 TOPS/watt and 2.7 TOPS/watt, respectively.
These features enable EnCharge to cater to a range of use cases, such as automotive sensing, advanced manufacturing, smart retail, smart warehouses and logistics, industrial robotics, and drones.
Funding and financials
In March 2024, EnCharge AI received a USD 18.6 million grant from the Defense Advanced Research Projects Agency (DARPA) to explore advancements in AI applications using switched-capacitor analog in-memory computing chips developed by Princeton University and commercialized by EnCharge AI.