February 17, 2026
Cohere launches a family of open multilingual models
Cohere's Tiny Aya models support over 70 languages

TL;DR
- Cohere launched Tiny Aya, a new family of open-weight multilingual AI models.
- The models support over 70 languages and can run locally on devices such as laptops.
- They are designed for offline use, making them well suited to applications like translation in areas with limited connectivity.
- Regional variants are available for specific language groups, including South Asian, African, and Asia Pacific languages.
- The models were trained with relatively modest compute: a cluster of 64 H100 GPUs.
- Cohere aims to provide systems that offer stronger linguistic grounding and cultural nuance for diverse communities.
- The models are available for download and local deployment on HuggingFace, Kaggle, Ollama, and the Cohere Platform.
- Cohere is also releasing training and evaluation datasets and plans to share its training methodology.