Cohere Labs Launches Tiny Aya, Making Multilingual AI Accessible

Cohere Labs

Cohere Labs has announced the launch of Tiny Aya, a new family of open-weight multilingual models designed for efficiency and on-device use. Tiny Aya lets researchers and developers build AI applications that understand and produce natural language in more than 70 languages, including many that have historically been underrepresented in AI models.

Tiny Aya is built around a 3.35-billion-parameter base model with a strong multilingual foundation and cultural understanding. The family also includes TinyAya-Global, an instruction-tuned variant designed for well-rounded performance across a broad set of languages, along with complementary regional variants that offer deeper coverage of particular linguistic regions: TinyAya-Earth, targeting African and West Asian languages; TinyAya-Fire, targeting South Asian languages; and TinyAya-Water, targeting Asia Pacific, West Asian, and European languages.

Tiny Aya’s architecture is designed to run on resource-constrained hardware, including laptops and mobile devices, without the need for constant internet connectivity. This makes the models well suited to offline deployment for translation, conversational AI, and language access in areas with poor connectivity.


The Tiny Aya models are released as open weights, allowing developers, researchers, and contributors to analyze, extend, and adapt them for a variety of tasks and applications. The models can be downloaded for local use from major distribution platforms, including Hugging Face, Kaggle, Ollama, and the Cohere Platform.
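As a minimal sketch of what local use via Hugging Face might look like, the snippet below loads a Tiny Aya variant with the `transformers` library and generates text. The repo id `CohereLabs/tiny-aya-global` is an assumption for illustration only; check the Cohere Labs organization page on Hugging Face for the exact model names.

```python
# Sketch: running a Tiny Aya model locally with Hugging Face transformers.
# NOTE: the repo id below is hypothetical — substitute the actual model name
# published by Cohere Labs on Hugging Face.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "CohereLabs/tiny-aya-global"  # assumption, not a confirmed repo id

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    # Download (once) and cache the tokenizer and weights, then run offline.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Translate to Swahili: Good morning, friend."))
```

After the first download, the weights are cached locally, so subsequent runs work without a network connection, which fits the offline deployment scenarios described above.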

By providing a multilingual model family that is both broad and deep, Cohere Labs is advancing its goal of making AI accessible and inclusive, enabling language communities around the world to take part in AI innovation.