Liquid AI to Unveil First Products Built on Liquid Foundation Models (LFMs) at Exclusive MIT Event


Liquid AI, an MIT spin-off and foundation model company, will unveil its first products at an exclusive event held at MIT’s Kresge Auditorium on Wednesday, October 23, 2024, from 10 am to 1 pm ET.

The event will showcase AI products for financial services, biotech, and consumer electronics, built on Liquid AI’s pioneering Liquid Foundation Models (LFMs), a new generation of generative AI models that achieve state-of-the-art performance at every scale while maintaining a significantly smaller memory footprint during both training and inference. This efficiency makes LFMs particularly well suited to on-device and private enterprise use cases.

More than 1,000 AI leaders, scientists, and executives are expected to attend the highly anticipated event. Liquid AI’s LFMs promise to expand the landscape of AI applications, providing powerful solutions for industries that demand greater quality, control, efficiency, and explainability in their AI systems.


Liquid AI’s LFMs are designed to handle complex tasks, including multi-step reasoning and long-context recall, while remaining computationally efficient. The first series of language LFMs, available in 1B, 3B, and 40B parameter configurations, delivers robust performance and broad knowledge capacity across a range of domains, supporting tasks such as question answering, translation, composition, and summarization, among other skills.

Key features of Liquid Foundation Models include:

  • Increased Quality for Reliable Decision-Making: LFMs offer advanced knowledge capacity, enabling them to excel in knowledge-based tasks.
  • Sustainability Through Efficiency: With reduced memory usage and near-constant inference speeds, LFMs are highly efficient for both training and deployment. Their on-device computing capabilities minimize reliance on cloud services, reducing costs and energy consumption.
  • Enhanced Explainability: Built from first principles, LFMs provide more white-box explainability than transformer-based architectures, allowing users to better understand and manage the decision-making processes of the models.

Liquid AI’s models are versatile and can be applied across industries, offering high-performance solutions for natural language processing, audio analysis, video recognition, and any sequential multimodal data.

SOURCE: BusinessWire