Meta AI has introduced MobileLLM, a compact language model for smartphones and other resource-constrained devices.
The model has fewer than 1 billion parameters, giving AI features a smaller on-device footprint. It combines a depth-prioritized (deep-and-thin) architecture with embedding sharing, grouped-query attention, and immediate block-wise weight sharing.
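Of these techniques, grouped-query attention reduces memory by letting several query heads share one key/value head, shrinking the KV cache. The sketch below illustrates the idea with NumPy; the sizes and weight matrices are illustrative placeholders, not MobileLLM's actual configuration.

```python
import numpy as np

def gqa(x, wq, wk, wv, n_q, n_kv):
    """Minimal grouped-query attention sketch: n_kv key/value heads
    are shared across n_q query heads (n_q must be a multiple of n_kv).
    Hypothetical example, not MobileLLM's implementation."""
    seq, _ = x.shape
    d_head = wq.shape[1] // n_q
    q = (x @ wq).reshape(seq, n_q, d_head)
    k = (x @ wk).reshape(seq, n_kv, d_head)   # fewer K/V heads -> smaller cache
    v = (x @ wv).reshape(seq, n_kv, d_head)
    # Broadcast each K/V head to the group of query heads it serves.
    group = n_q // n_kv
    k = np.repeat(k, group, axis=1)
    v = np.repeat(v, group, axis=1)
    scores = np.einsum('qhd,khd->hqk', q, k) / np.sqrt(d_head)
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)  # softmax over key positions
    out = np.einsum('hqk,khd->qhd', weights, v)
    return out.reshape(seq, -1)

# Toy usage: 8 query heads share 2 K/V heads (a 4x KV-cache reduction).
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 16))
wq = rng.standard_normal((16, 16))  # 8 heads x d_head 2
wk = rng.standard_normal((16, 4))   # only 2 K/V heads x d_head 2
wv = rng.standard_normal((16, 4))
out = gqa(x, wq, wk, wv, n_q=8, n_kv=2)
```

With 8 query heads but only 2 key/value heads, the K/V projections and cache are a quarter of their multi-head-attention size, which is the kind of saving that matters on memory-limited phones.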
The company claims the model delivers better accuracy than larger models while requiring fewer computational resources.