Databricks, the Data and AI company, announced the upcoming Preview of Lakeflow Designer. This new no-code ETL capability lets non-technical users author production data pipelines using a visual drag-and-drop interface and a natural language GenAI assistant. Lakeflow Designer is backed by Lakeflow, the unified solution for data engineers to build reliable data pipelines faster with all business-critical data, which is now Generally Available.
Traditionally, enterprises have faced a tradeoff: either let analysts create pipelines with no-code/low-code tools, sacrificing governance, scalability, and reliability, or rely on technical data engineering teams to code production-ready pipelines, even though those teams are overloaded and their backlogs are long. In practice, most enterprises adopt a combination of both approaches, resulting in complex environments to manage and maintain. What data-driven enterprises really want is the best of both worlds: no-code pipelines with governance, scalability, and reliability.
“There’s a lot of pressure for organizations to scale their AI efforts. Getting high-quality data to the right places accelerates the path to building intelligent applications,” said Ali Ghodsi, Co-founder and CEO at Databricks. “Lakeflow Designer makes it possible for more people in an organization to create production pipelines so teams can move from idea to impact faster.”
Lakeflow Designer: AI-Native Drag-and-Drop Data Prep for the Business Analyst
The new Lakeflow Designer empowers business analysts to build no-code ETL pipelines with natural language and a drag-and-drop UI that provides the same scalability, governance, and maintainability as those built by data engineers. Backed by Lakeflow, Unity Catalog, and Databricks Assistant, Lakeflow Designer eliminates the divide between code and no-code tools. With this new approach, non-technical users gain the speed and flexibility they require to solve business problems without burdening data engineers with maintenance issues and governance headaches.
Additional Lakeflow Capabilities Launching
- Lakeflow Enters GA: Today, Lakeflow became generally available, providing a unified data engineering solution from ingestion to transformation and orchestration. Notably, the new Lakeflow Declarative Pipelines capabilities allow data engineers to build end-to-end production pipelines in SQL or Python without having to manage infrastructure.
- New IDE for Data Engineering: Lakeflow is debuting a brand new development experience that speeds up data pipeline development with AI-assisted coding, debugging and validation in an integrated UI.
- New Ingestion Connectors: New point-and-click ingestion connectors for Lakeflow Connect are launching for Google Analytics, ServiceNow, SQL Server, SharePoint, PostgreSQL, and SFTP, joining connectors for Salesforce Platform and Workday Reports that are already available.
- Direct Write to Unity Catalog with Zerobus: Zerobus enables developers to write high volumes of event data with near real-time latency to their lakehouse without the need to manage extra infrastructure like a message bus. This streamlined, serverless infrastructure provides performance at scale for IoT events, clickstream data, telemetry and other event-driven use cases.
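To make the declarative approach concrete, the pipeline style described in the Lakeflow Declarative Pipelines bullet above can be sketched in SQL. This is a hedged illustration, not Databricks documentation: the table names and storage path are placeholders, and exact syntax may vary by runtime version.

```sql
-- Hypothetical sketch of a declarative pipeline defined purely in SQL;
-- Lakeflow manages the underlying infrastructure and orchestration.

-- Incrementally ingest raw JSON events from cloud storage
-- (the path below is a placeholder, not a real location).
CREATE OR REFRESH STREAMING TABLE raw_orders
AS SELECT *
FROM STREAM read_files('/Volumes/example/orders/raw', format => 'json');

-- Declare a cleaned table downstream; the dependency graph and
-- execution order are inferred from the definitions themselves.
CREATE OR REFRESH MATERIALIZED VIEW clean_orders
AS SELECT
  order_id,
  customer_id,
  CAST(amount AS DECIMAL(10, 2)) AS amount
FROM raw_orders
WHERE order_id IS NOT NULL;
```

Because the pipeline is declared rather than scripted, the engine determines scheduling and incremental processing from the table definitions, which is what lets data engineers skip manual infrastructure management.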
Source: PRNewswire