MosaicML

Foundation Models
Product stage: Minimum Viable Product
Segments: Large language models (LLMs), Fine-tuned language models

MosaicML's solution centers on a comprehensive platform for foundation models in the field of generative AI (GenAI). Its focus is on providing open-source, commercially usable large language models (LLMs) that are optimized for training and inference. These foundation models, collectively known as the MPT (MosaicML Pretrained Transformer) Foundation Series (launched in June/July 2023), include:

1) MPT-7B (seven billion parameters and 2,000-token context), designed to handle large, complex datasets using deep learning and neural networks. MPT-7B was reportedly trained on a one trillion token dataset, surpassing the training data size of other open-source models like LLaMA, Pythia, OpenLLaMA, and StableLM. It can be used in multiple industries, such as finance, healthcare, and manufacturing, for tasks like financial forecasting and predictive maintenance. In particular, MPT-7B excels at processing both structured and unstructured data. 

2) MPT-7B-8K offers seven billion parameters and an 8,000-token context. Trained on NVIDIA H100s with 500 billion tokens, it excels in document summarization and question answering.

3) MPT-30B (30 billion parameters), which MosaicML claims outperforms GPT-3, is particularly strong in coding tasks. Variants such as MPT-30B-Instruct and MPT-30B-Chat are designed for instruction following and multi-turn conversations.
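
The MPT checkpoints were released on the Hugging Face Hub, so they can be loaded with the standard open-source transformers library. The sketch below is a minimal, illustrative example of loading MPT-7B for text generation; it assumes sufficient GPU memory, and the prompt string is a hypothetical placeholder.

    # Minimal sketch: load the public MPT-7B checkpoint from the Hugging Face Hub
    # and generate a short completion. The prompt is a hypothetical example.
    import torch
    import transformers

    device = "cuda" if torch.cuda.is_available() else "cpu"
    dtype = torch.bfloat16 if device == "cuda" else torch.float32

    # MPT ships a custom model class with the checkpoint, hence trust_remote_code
    model = transformers.AutoModelForCausalLM.from_pretrained(
        "mosaicml/mpt-7b", torch_dtype=dtype, trust_remote_code=True
    ).to(device)

    # MPT-7B was trained with the EleutherAI gpt-neox-20b tokenizer
    tokenizer = transformers.AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

    prompt = "Predictive maintenance programs in manufacturing typically"
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    with torch.no_grad():
        outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))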

MosaicML's platform enables users to train and deploy custom LLMs based on these foundation models, offering flexibility and control over their AI applications. Additionally, MosaicML provides Inference APIs for serving models, fine-tuning for further customization, and pre-training to create domain-specific LLMs from scratch.
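
MosaicML's own managed training and fine-tuning services are not reproduced here. Purely as an illustration of what further customization of an open checkpoint can look like, the following sketch runs a generic causal-language-modeling fine-tune of MPT-7B with the open-source Hugging Face Trainer; the file domain_corpus.txt and all hyperparameters are hypothetical placeholders, and real runs would typically require multi-GPU infrastructure.

    # Illustrative sketch only: a generic causal-LM fine-tune of the public MPT-7B
    # checkpoint using the open-source Hugging Face Trainer. This is not MosaicML's
    # managed fine-tuning service; "domain_corpus.txt" and the hyperparameters are
    # hypothetical placeholders.
    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer, TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
    tokenizer.pad_token = tokenizer.eos_token  # this tokenizer has no pad token by default
    model = AutoModelForCausalLM.from_pretrained("mosaicml/mpt-7b", trust_remote_code=True)

    # Plain-text domain data, one document per line (hypothetical file)
    dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})["train"]

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=512)

    tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

    trainer = Trainer(
        model=model,
        args=TrainingArguments(
            output_dir="mpt7b-domain-finetune",
            per_device_train_batch_size=1,
            gradient_accumulation_steps=8,
            num_train_epochs=1,
            bf16=True,                # assumes bf16-capable GPUs
            logging_steps=10,
        ),
        train_dataset=tokenized,
        # Standard collator for next-token prediction (mlm=False)
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
    trainer.save_model("mpt7b-domain-finetune")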

In July 2023, Databricks, a unified, open analytics platform, announced that it had acquired MosaicML, aiming to reduce the cost of hosting models and democratize AI by combining data infrastructure with MosaicML's model training platform.

Key customers and partnerships

As of September 2023, its customers include Personal AI, Replit, Dream3D, Twelve Labs, and Natural Synthetics. It has also partnered with cloud providers such as Amazon Web Services, Microsoft Azure, Google Cloud, and Oracle.

Funding and financials 

The company emerged from stealth mode in October 2021, raising USD 37 million from Lux Capital, DCVC, Future Ventures, and Playground Global to develop a cloud-based neural network training system that addresses challenges in AI model training. The funds are also expected to support the company's ongoing research and development efforts.

HQ location: 160 Spear St, 15th Floor, San Francisco, CA, USA
Founded year: 2021
Employees: 101-250
IPO status: Private
Total funding: USD 37.0 mn
Last funding: -
Last valuation: USD 222.0 mn (Jan 2023)
Funding data are powered by Crunchbase