AI startup Reka has introduced its latest multimodal and multilingual language model, Reka Flash.
The new model has 21 billion parameters and was trained on text in more than 32 languages. Reka claims it outperforms Llama 2, Grok-1, and GPT-3.5 on tasks such as reasoning, code generation, and question answering.
The company also released Reka Edge, a compact seven-billion-parameter version of the model designed for resource-constrained scenarios such as on-device or local deployments. Both models are available for beta testing.
In the coming weeks, Reka also plans to make Reka Core, a more advanced model, available to the public.