Databricks, the Data and AI company, announced the upcoming Preview of Lakeflow Designer, a powerful new no-code ETL solution designed to help non-technical users create production-grade data pipelines. The intuitive tool features a drag-and-drop interface and is enhanced by a generative AI assistant, allowing business analysts to build data workflows with ease. Lakeflow Designer is built on Lakeflow, Databricks’ unified data engineering framework, which is now Generally Available (GA).
For years, organizations have had to choose between agility and control: either enable analysts with low-code tools that compromise on governance and scalability, or rely solely on overburdened engineering teams to build and maintain pipelines. This approach often leads to fragmented systems and operational inefficiencies. With Lakeflow Designer, Databricks bridges this divide, offering a solution that combines ease of use with enterprise-grade performance and oversight.
“There’s a lot of pressure for organizations to scale their AI efforts. Getting high-quality data to the right places accelerates the path to building intelligent applications,” said Ali Ghodsi, Co-founder and CEO at Databricks. “Lakeflow Designer makes it possible for more people in an organization to create production pipelines so teams can move from idea to impact faster.”
Empowering Business Analysts with AI-Powered Data Prep
Lakeflow Designer introduces an AI-native, drag-and-drop interface backed by Unity Catalog, Databricks Assistant, and Lakeflow. This integration empowers business analysts to independently build and manage data pipelines without writing code—while still adhering to the governance, reliability, and scalability standards required by IT and data teams. By closing the gap between technical and non-technical users, Lakeflow Designer reduces engineering bottlenecks and accelerates business impact.
New Features Now Available with Lakeflow GA
Alongside the launch of Lakeflow Designer Preview, Databricks has made Lakeflow generally available, expanding the platform with several enhancements:
- Declarative Pipelines: Data engineers can now build end-to-end production pipelines using SQL or Python without managing complex infrastructure (see the sketch after this list).
- New Development Environment: A reimagined IDE provides AI-assisted coding, debugging, and validation, streamlining the data pipeline creation process.
- Expanded Ingestion Connectors: Lakeflow Connect adds point-and-click support for Google Analytics, ServiceNow, SQL Server, SharePoint, PostgreSQL, and SFTP, in addition to existing connectors for Salesforce Platform and Workday Reports.
- Zerobus for Real-Time Event Ingestion: Developers can now write massive volumes of event data, such as telemetry and IoT streams, directly to Unity Catalog with minimal latency and no extra infrastructure, thanks to Zerobus' serverless architecture.
Customers See Immediate Benefits
Early adopters are already reporting significant improvements in productivity and insights:
“The new editor brings everything into one place – code, pipeline graph, results, configuration, and troubleshooting. No more juggling browser tabs or losing context. Development feels more focused and efficient. I can directly see the impact of each code change. One click takes me to the exact error line, which makes debugging faster. Everything connects – code to data; code to tables; tables to the code. Switching between pipelines is easy, and features like auto-configured utility folders remove complexity. This feels like the way pipeline development should work.” – Chris Sharratt, Data Engineer, Rolls-Royce.
“Using the Salesforce connector from Lakeflow Connect helps us close a critical gap for Porsche from the business side on ease of use and price. On the customer side, we’re able to create a completely new customer experience that strengthens the bond between Porsche and the customer with a unified and not fragmented customer journey,” said Lucas Salzburger, Project Manager, Porsche Holding Salzburg.
“Joby is able to use our manufacturing agents with Lakeflow Connect Zerobus to push gigabytes a minute of telemetry data directly to our lakehouse, accelerating the time to insights – all with Databricks Lakeflow and the Data Intelligence Platform.” – Dominik Müller, Factory Systems Lead, Joby Aviation.
With the introduction of Lakeflow Designer and the expansion of Lakeflow's capabilities, Databricks is setting a new standard for unified, scalable, and accessible data engineering, empowering both analysts and engineers to innovate faster and smarter.