Mistral AI has introduced Mistral Large 2, a model supporting a wide range of natural languages and coding languages, available for non-commercial use.
The model has a 128,000-token context window and 123 billion parameters. It supports dozens of natural languages, including French, German, Spanish, Italian, Portuguese, Arabic, Hindi, Russian, Chinese, Japanese, and Korean, as well as over 80 coding languages, including Python, Java, C, C++, JavaScript, and Bash.
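For illustration, the sketch below shows how a developer might query the model for a code-generation task through Mistral's hosted API. It assumes the mistralai Python SDK and the "mistral-large-latest" model identifier; both are assumptions here and should be verified against Mistral's current documentation.

```python
# Minimal sketch: querying Mistral Large 2 via the hosted API.
# Assumes the mistralai Python SDK (v1-style client) and the
# "mistral-large-latest" model alias; verify both against current docs.
import os

from mistralai import Mistral

# API key is read from the environment rather than hard-coded.
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Ask the model for a short Bash snippet, one of the 80+ supported
# coding languages mentioned in the announcement.
response = client.chat.complete(
    model="mistral-large-latest",
    messages=[
        {
            "role": "user",
            "content": "Write a Bash one-liner that counts the lines in every .py file in the current directory.",
        }
    ],
)

print(response.choices[0].message.content)
```

The large context window means prompts like this can also include substantial amounts of surrounding code or documentation in a single request.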
The company claims that the model offers strong performance relative to its cost. Its training emphasized reasoning, which reportedly reduces the likelihood of generating inaccurate or irrelevant output; the model can also reportedly acknowledge when it lacks sufficient information to answer.