AI on Pi Day Highlights

We are excited to announce a new column in our Forecaster newsletter where we'll showcase our sessions! You'll find them listed in chronological order, and as a bonus, we're including some complimentary gifts. To find them, just check the presentation links. Easy peasy!

Optimizing Lakehouse Strategies with the Modern Data Stack by Vincent Heuschling

Traditional data warehouses, while structured and optimized for querying, can be costly and inflexible. Data lakes, on the other hand, offer a more affordable and scalable solution for storing all data types, but lack the structure for easy analysis. This article looks at what data lakehouses are, what they do, and how modern tools can make data transformation easier.

What is a data lakehouse?

A data lakehouse is a way of managing data that combines the flexibility of a data lake with the structure and governance of a data warehouse. It allows organizations to store all their data, structured and unstructured, in one place. This data can then be transformed and prepared for analysis based on specific needs.

Here are some of the advantages of data lakehouses:

  • Cost-effectiveness: Data lakehouses use cloud storage like Amazon S3, which lets you pay for what you use. This eliminates the high upfront costs associated with traditional data warehouses.
  • Scalability: Data lakehouses can easily scale to accommodate growing data volumes. Storage and compute resources can be scaled independently, allowing for efficient utilization.
  • Flexibility: Data lakehouses can store all kinds of data, not just structured data. This makes them ideal for modern data analysis needs like machine learning and AI.
  • Faster Time to Insights: Data lakehouses enable faster access and analysis of data. By storing all data in a central location, organizations can readily discover valuable insights from various data sources.


Data lakehouses are a great way for companies to manage and analyze all their data. They combine the scalability and cost-effectiveness of data lakes with the structure and governance of data warehouses, letting companies draw valuable insights from all their data. In-place transformation tools like DuckDB and dbt take this approach to the next level by making data transformation easier within the lakehouse, which means faster insights and more efficient data use. As data remains the lifeblood of modern businesses, data lakehouses are set to be a key technology for data-driven decision making.

Watch the video.

We work on

Bitol is a Linux Foundation AI & Data Sandbox project. Open Data Contract Standard (ODCS) v2.2.2 has been published. Here is the update and a quick roadmap by Jean-Georges Perrin, chair of the TSC.

Share and Stay Tuned

Become a Member if you want to be a part of the story.

Share Events and News you find interesting with us here! We will give it a shout on our new newsletter AIDA Forecaster!

For exciting updates and valuable insights, visit us on our website and on LinkedIn. Stay tuned for more!
