Cohere is making a bold play for the multilingual AI market. The enterprise AI startup just launched its Tiny Aya family of open models supporting over 70 languages, positioning itself as a serious challenger to OpenAI and Google in the race to democratize AI beyond English. The move signals a strategic shift toward lightweight, accessible models that can run offline, a capability that could reshape how businesses deploy AI globally.
Cohere, the Toronto-based AI startup valued at $5.5 billion, is betting big on language diversity. The company's new Tiny Aya models represent a direct challenge to the English-centric dominance of large language models, bringing support for more than 70 languages to developers and enterprises worldwide.
What makes this launch particularly significant is the "tiny" designation. While competitors race to build ever-larger models demanding massive compute resources, Cohere is zigging where others zag. These lightweight models are explicitly designed for offline-first deployment, a capability that Meta and others have been chasing but have struggled to perfect at scale.
The timing isn't coincidental. Enterprise AI adoption has hit a wall in non-English markets, where existing models from OpenAI and Google often stumble over nuanced translations or simply lack training data for less common languages. Cohere has built its reputation on enterprise-focused AI tools, and Tiny Aya extends that strategy into emerging markets where connectivity remains spotty but smartphone penetration is exploding.
Cohere has been quietly building toward this moment. The company previously released Command R and Command R+, models optimized for retrieval-augmented generation in business contexts. But those were primarily English-focused tools. Tiny Aya represents a fundamental expansion of Cohere's addressable market, potentially unlocking customers across Southeast Asia, Africa, and Latin America who've been underserved by existing AI infrastructure.