Groq, an AI solutions company, has announced its participation in the National Artificial Intelligence Research Resource (NAIRR) Pilot. The program, led by the US National Science Foundation, aims to provide AI research resources to US researchers and educators.
As part of the program, Groq will give researchers access to its LPU Inference Engine via GroqCloud. The engine is designed to deliver real-time AI inference, offering significantly higher speed while consuming considerably less energy than GPU-based systems for inference workloads.
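For context, GroqCloud exposes a chat-completions API that researchers can call from Groq's official Python client. The sketch below is illustrative only: the model name is a placeholder, and the access details and quotas available to NAIRR participants may differ.

```python
# Minimal sketch of querying a model on GroqCloud with the Groq Python client
# (pip install groq). Model IDs and NAIRR-specific access terms are assumptions;
# consult GroqCloud documentation for current options.
import os

from groq import Groq

# The client can also read GROQ_API_KEY from the environment if api_key is omitted.
client = Groq(api_key=os.environ["GROQ_API_KEY"])

completion = client.chat.completions.create(
    model="llama-3.1-8b-instant",  # placeholder model ID
    messages=[
        {
            "role": "user",
            "content": "Summarize the goals of the NAIRR Pilot in one sentence.",
        }
    ],
)

print(completion.choices[0].message.content)
```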
Researchers can also access Groq technology through the Argonne Leadership Computing Facility (ALCF), which hosts a GroqRack compute cluster comprising nine GroqNode servers connected in a rotational multi-node network topology.