Enterprise AI company Cohere has introduced Tiny Aya, a new family of open-weight multilingual AI models developed by Cohere Labs, designed to run efficiently on everyday devices without requiring internet connectivity. The models support more than 70 languages, including South Asian languages such as Hindi, Bengali, Tamil, Telugu, Punjabi, Urdu, Gujarati, and Marathi.
The base model contains approximately 3.35 billion parameters and is accompanied by variants including TinyAya-Global, TinyAya-Earth, TinyAya-Fire, and TinyAya-Water, each tailored to a regional language ecosystem. Trained on a cluster of Nvidia H100 GPUs, the models are optimized for low-compute environments and for offline translation and application development.
The Tiny Aya models are available through Hugging Face, Kaggle, Ollama, and the Cohere Platform, making them accessible to developers building AI applications for multilingual and connectivity-limited environments.
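For readers who want a sense of what local use might look like, below is a minimal sketch of loading one of the open-weight checkpoints with the Hugging Face transformers library and asking it for a translation. The repository name "CohereLabs/tiny-aya-global" and the presence of a chat template are assumptions for illustration only; the exact model identifiers should be checked on the Cohere Labs pages on Hugging Face.

```python
# Minimal sketch of running a Tiny Aya variant locally with Hugging Face transformers.
# NOTE: "CohereLabs/tiny-aya-global" is a hypothetical placeholder repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CohereLabs/tiny-aya-global"  # assumption: replace with the actual repo name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit modest hardware
    device_map="auto",          # uses a GPU if available, otherwise CPU (requires accelerate)
)

# Chat-style prompt asking for a translation into Hindi, one of the supported languages.
messages = [{"role": "user", "content": "Translate to Hindi: Good morning, how are you?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=100)
# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Because the weights are downloaded once and cached locally, a script like this can run fully offline afterwards, which is the scenario the Tiny Aya family targets.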




