All Updates

This week:
  • Management news: 6K Additive awarded project to develop C-103 powder for additive manufacturing (Additive Manufacturing, Sep 27, 2024)
  • Partnerships: WAAM3D partners with Innovative Space Carrier, Cranfield University, and Aichi Sangyo for space transport development (Additive Manufacturing, Sep 19, 2024)
  • Product updates: Microsoft launches Face Check with Microsoft Entra Verified ID (Facial Recognition, Sep 11, 2024)
  • Funding: 3DEO raises USD 3.5 million in strategic investment from Mizuho Bank (Additive Manufacturing, Sep 6, 2024)
  • Product updates: HyperWrite launches open-source AI model Reflection 70B (Generative AI Applications, Sep 5, 2024)
  • Product updates: Google begins US rollout of AI-powered Ask Photos feature (Generative AI Applications, Sep 5, 2024)
  • Funding, Management news: HDAX Therapeutics raises USD 3.2 million in seed funding to progress its pipeline and advance preclinical candidates into trials (Precision Medicine, Sep 5, 2024)
  • Partnerships: Sequentify partners with Dexter to expand genomic diagnostics in Romania (Precision Medicine, Sep 5, 2024)
  • Partnerships: Elevate partners with Cultivatd for European expansion (Vertical Farming, Sep 5, 2024)
  • Product updates: Moolec confirms harvest of its genetically engineered plant-grown products (Crop Biotech, Sep 5, 2024)
AI21 launches Jamba 1.5 Mini and Jamba 1.5 Large models for long-context language processing

Foundation Models | Product updates | Aug 22, 2024
  • AI21, a developer of foundation models, has introduced Jamba 1.5 Mini and Jamba 1.5 Large to offer high performance and efficiency for long-context language processing.

  • The models use a hybrid architecture that combines Transformer attention layers with Mamba state-space layers, allowing them to maintain response quality over large context windows. Jamba 1.5 Large is a mixture-of-experts model with 398 billion total parameters, of which 94 billion are active per token, while Jamba 1.5 Mini is an enhanced version of Jamba-Instruct. Both models have a 256K-token context window.

  • The company claims that the models outperform competitors in end-to-end latency tests. They are optimized for building retrieval-augmented generation (RAG) and agentic workflows, making them suitable for complex, data-heavy tasks in enterprise environments; a usage sketch follows below.
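
For illustration, here is a minimal sketch of what a single long-context call to one of these models could look like. It assumes the `ai21` Python SDK's chat-completions interface and the model ID "jamba-1.5-mini"; neither is specified in this update, and the API key and input file below are placeholders, so consult AI21's documentation for the current API.

```python
# Minimal sketch of one long-context request to Jamba 1.5 Mini.
# Assumptions (not from the article): the `ai21` SDK's chat-completions
# interface and the model ID "jamba-1.5-mini"; the file name and API key
# are placeholders for illustration only.
from ai21 import AI21Client
from ai21.models.chat import ChatMessage

client = AI21Client(api_key="YOUR_AI21_API_KEY")  # placeholder credential

# With a 256K-token context window, a long document can be passed in one
# request instead of being chunked across multiple calls.
with open("quarterly_report.txt") as f:  # hypothetical input file
    long_report = f.read()

response = client.chat.completions.create(
    model="jamba-1.5-mini",  # "jamba-1.5-large" would work the same way
    messages=[
        ChatMessage(
            role="user",
            content=f"List the key risks discussed in this report:\n\n{long_report}",
        )
    ],
)
print(response.choices[0].message.content)
```

In a RAG setup, the retrieved passages would simply replace the raw document in the prompt; the large context window mainly reduces how aggressively those passages must be trimmed to fit.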
