All Updates

  • Funding: Groq raises USD 640 million for capacity expansion and product development (Generative AI Infrastructure; Aug 5, 2024)

This week:

  • Partnerships: T-Mobile partners with OpenAI to develop AI-powered customer service platform (Generative AI Applications; Today)
  • Partnerships: Runway partners with Lionsgate to develop AI video tools using studio's movie catalog (Generative AI Applications; Yesterday)
  • Funding: QMill raises EUR 4 million in seed funding to provide quantum computing industrial applications (Quantum Computing; Yesterday)
  • Product updates: QuiX Quantum launches 'Bia' quantum cloud computing service for quantum solutions (Quantum Computing; Yesterday)
  • Partnerships: Oxford Ionics and Infineon Technologies partner to build portable quantum computer for Cyberagentur (Quantum Computing; Yesterday)
  • Product updates, Partnerships: Tencent AI Lab launches EzAudio AI for text-to-audio generation with Johns Hopkins University (Foundation Models; Yesterday)
  • Funding: TON secures USD 30 million in investment from Bitget and Foresight Ventures (Web3 Ecosystem; Yesterday)
  • Funding: Hemi Labs raises USD 15 million in funding to launch blockchain network (Web3 Ecosystem; Yesterday)
  • Product updates: Fivetran launches Hybrid Deployment for data pipeline management (Machine Learning Infrastructure; Data Infrastructure & Analytics; Yesterday)

Generative AI Infrastructure

Aug 5, 2024

Groq raises USD 640 million for capacity expansion and product development

Funding

  • Groq raised USD 640 million in a recent round led by BlackRock, with participation from Neuberger Berman, Type One Ventures, Cisco, KDDI, and Samsung Catalyst Fund. The round brings the company's valuation to USD 2.8 billion.

  • Groq plans to use the raised funds to expand the capacity of its services and introduce new models and features to its platform.

  • Analyst QuickTake: Groq achieved a major performance breakthrough in November 2023 by processing over 300 tokens per second per user on Meta AI's Llama-2 70B model. The company has also gained recognition for its work in the semiconductor industry, specializing in high-performance computing solutions designed for AI workloads.
