Cerence AI and SiMa.ai Partner to Power Next-Gen Conversational AI at the Edge

Cerence Inc., a global leader pioneering conversational AI-powered user experiences, and SiMa.ai, which builds ultra-efficient machine learning system-on-chip (MLSoC) platforms with best-in-class multi-modal inference, announced a new strategic partnership. The collaboration brings CaLLM™ Edge, Cerence AI’s automotive-grade embedded small language model (SLM), to SiMa.ai’s MLSoC Modalix, delivering intelligent, low-power in-car conversational AI experiences.

The joint solution will debut at IAA Mobility 2025 in Munich, where Cerence will showcase Cerence xUI™, the company’s agentic AI assistant platform, with CaLLM Edge running on SiMa.ai hardware.

“Our partnership with SiMa.ai represents a major leap forward in deploying efficient on-device AI for current and next-generation vehicles,” said Cerence AI Executive Vice President, Product & Technology, Nils Schanz. “Within Cerence xUI, CaLLM Edge delivers advanced reasoning, multi-turn conversation, and proactive engagement, all running seamlessly on the MLSoC Modalix. With cross-platform compatibility and hardware flexibility, we empower OEMs to adapt quickly, reduce complexity, and deploy bespoke in-car assistants across different vehicle platforms.”

The integration of CaLLM Edge with SiMa.ai’s Modalix platform addresses the needs of automakers and end users alike by providing a robust, high-performing technical foundation for in-vehicle AI at the edge. Cerence’s hybrid AI enables efficient, fluid transitions between edge and cloud execution, a critical requirement for delivering an uninterrupted, high-quality conversational user experience. By leveraging the edge capabilities of Cerence xUI on MLSoC Modalix, OEMs can deliver real‑time, multi-modal interaction, processing sensor inputs and running conversational inference directly on-device with industry-leading power efficiency.

Today’s drivers expect features to work everywhere, putting pressure on OEMs to deliver smart, reliable experiences. Cloud-only solutions, however, depend on network connectivity and can degrade the user experience when that connection is lost. Cerence’s hybrid AI architecture tackles this challenge by combining embedded AI, for ultra-low-latency responsiveness and enhanced privacy, with enriched, context-aware information from the cloud. With CaLLM Edge running on Modalix, the in-car AI experience is faster, more responsive, more conversational, and available regardless of connectivity.
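The release describes this hybrid behavior only at a high level. As a purely illustrative sketch of one common edge-first routing pattern, and not Cerence’s or SiMa.ai’s actual APIs, the Python snippet below shows how a cloud request might be attempted for enrichment while the embedded model guarantees a bounded-latency answer; all names (edge_slm_answer, cloud_llm_answer, hybrid_answer) and the timeout value are hypothetical placeholders.

```python
import concurrent.futures

# Hypothetical stand-ins for an on-device SLM and a cloud LLM endpoint.
# Neither reflects Cerence's or SiMa.ai's actual interfaces.
def edge_slm_answer(utterance: str) -> str:
    """Low-latency response from the embedded model (always available)."""
    return f"[edge] Handled locally: {utterance}"

def cloud_llm_answer(utterance: str) -> str:
    """Richer, context-aware response that requires connectivity."""
    # In a real system this would be a network call that may fail or time out.
    raise TimeoutError("no connectivity")

def hybrid_answer(utterance: str, cloud_timeout_s: float = 1.5) -> str:
    """Edge-first routing: try the cloud for enrichment, but never block the
    user experience on it; fall back to the embedded model on any failure."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(cloud_llm_answer, utterance)
        try:
            return future.result(timeout=cloud_timeout_s)
        except (concurrent.futures.TimeoutError, TimeoutError, OSError):
            return edge_slm_answer(utterance)

if __name__ == "__main__":
    print(hybrid_answer("Find a charging station near the hotel"))
```

The design choice mirrored here is that the embedded path bounds worst-case latency and works offline, while the cloud path is treated as optional enrichment rather than a dependency.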

Flexibility and resilience are also key considerations for many automakers. As OEMs face growing platform complexity and navigate supply chain volatility, the need for hardware flexibility and cost-effective performance is increasing. CaLLM Edge is engineered for cross-platform compatibility and built to run efficiently on a wide range of hardware architectures. With the addition of MLSoC Modalix to Cerence AI’s platform ecosystem, OEMs gain access to a compact, power-efficient, and scalable edge solution that enhances deployment flexibility.

“The ultimate vision is voice control that understands the user as naturally as a human would. While LLMs can deliver this capability, they typically depend on the cloud or power-hungry graphics cards,” said Harald Kröger, President of Automotive at SiMa.ai. “Together, SiMa.ai and Cerence offer a breakthrough: a production-ready device that delivers extraordinary performance with exceptional energy efficiency. Paired with the LLiMa™ toolchain, it significantly reduces integration effort and accelerates development time,” he added.

Source: GlobeNewswire