
How Lakehouse technology can help solve your siloed data problem

8 December 2025

Dirk van Bastelaere, Communication Manager CFO Services and Management Information & Systems

Many organizations are waking up to a hard truth: technology is only one chapter of the data story. The pressing question is not just about technical specifications, but about how data architecture functions within the living organism of a business. If teams continue to operate in silos, even the most sophisticated tools fail to create value.

To explore how technology and organization reinforce one another, we sat down with Maarten Lauwaert, Business Unit Leader for Management Information & Systems, and Data & Analytics Project Manager Evert Augustyns. For them, a technically perfect architecture is useless if it doesn't encourage teams to look beyond their departmental borders. As Evert Augustyns notes, without a shared approach, the promise of a uniform data layer vanishes. What remains is a landscape of disconnected islands.

The Importance of Data Architecture

Evert Augustyns emphasizes this point clearly. “Lakehouse is first and foremost a technology, but structuring your organization in a way that removes siloed thinking is equally important. You can have the technology, even 101 Lakehouses, but if every team builds its own island, you’ll never create a unified data layer. You end up with a Lakehouse for Finance, a Lakehouse for HR, a Lakehouse for Supply Chain, and so on.”

Creating a robust lakehouse architecture helps break down these silos and enables a central, consistent source of truth. The real challenge, therefore, lies in creating a data architecture and an organizational culture that actively dismantles those silos.

This awareness is not new. Ten to twenty years ago, big data was already a major topic: companies collected massive amounts of information but failed to fully leverage it. Yet today’s context is different. New data types, new tools, and a much broader group of people working with data mean that the traditional, highly centralized approach is no longer sufficient.

Maarten Lauwaert notes that the essence of the problem has remained fairly constant over the years. “Back then, companies used data warehouses; now we talk about lakehouse architecture. It’s not only about technology, but mainly about how the organization handles data.”

The need today is greater because data is no longer used only by IT teams, but also by businesspeople, analysts, and domain specialists. That only works when the data architecture encourages collaboration and accessibility, and lets non-IT users work with data as well. Older technology simply didn't offer that flexibility; lakehouse architecture does. For Lauwaert, that makes it a true game-changer: “With the right data engineering skills, an organization can expand its data use and break down existing silos.”

Lakehouse is first and foremost a technology, but structuring your organization in a way that removes siloed thinking is equally important

Evert Augustyns, Project Manager Data & Analytics, TriFinance

Why Lakehouse technology is the key to breaking through data silos

Evert Augustyns: “Microsoft uses several maturity levels. Many organizations are still in the stage where each department focuses only on its own reporting. In many cases, teams do not look at what neighboring departments are doing. Finance, for example, exports data from SAP, FMO or NetSuite and creates its own reports.

"There are also companies that have already structured and automated the process, where the data typically ends up in a SQL data warehouse. But many organizations still rely on Excel dumps that are then loaded into Power BI.

"The advantage of a Lakehouse and of Microsoft Fabric in particular is that the technical expertise required to automate data entries and store data in a structured way is much lower. This makes it far easier to make data broadly accessible and move away from siloed reporting."

Maarten Lauwaert: “I find the scale Microsoft uses to determine an organization’s maturity level very useful. The first level is about centralizing data. You ensure that there is a team within the organization that brings all data together in one place, in a central technology. In the past, this was usually a data warehouse. Today, it can also be a Lakehouse. 

"From there, you create reports that are primarily backward-looking. At the next maturity level, you attempt to develop predictive insights using that data, starting from a predictive analytics strategy, or even work with AI in your financial reporting. But that is only possible once you have taken that first step. Without that central structure, it simply won’t work. Only once you have that foundation can you truly look forward based on the data stored in your data warehouse or lakehouse.”

The advantage of a Lakehouse and of Microsoft Fabric is that the technical expertise required to automate data entries and store data in a structured way is much lower

Evert Augustyns, Project Manager Data & Analytics, TriFinance

The Impact of Lakehouse technology on different data types

Evert Augustyns: “Lakehouse and big data technologies emerged partly to enable the analysis of semi-structured and unstructured data. To be honest, we haven’t encountered truly unstructured data in our projects yet. At one client, it did come up because they wanted to analyze phone conversations, but that project never took off.

"We do see semi-structured data more frequently. More and more systems are being exposed via APIs, which provide data in formats such as JSON or XML. These do not fit in traditional tables.

"The Lakehouse concept helps bring these various data types together in one environment, with a transaction layer on top of the data lake. This allows you to manage and analyze data with the performance of a data warehouse but with the flexibility of a lake.”

Maarten Lauwaert: “Semi-structured means that structure exists, but that structure is not the same for each record. Normally a table has a fixed number of columns, say ten. In this type of data, the first record in the file may have ten fields, the second five, and the third fifteen. There is structure, but it is not generic across all data.”
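A minimal sketch of what this looks like in practice, with invented records: each line below is valid JSON, but no two records share the same set of fields, so a tabular view has to take the union of all keys and pad the gaps.

```python
import json

# Three semi-structured records: each has structure,
# but not the same structure.
raw = """
{"id": 1, "name": "Arno", "city": "Ghent"}
{"id": 2, "name": "Jos"}
{"id": 3, "name": "Mira", "city": "Leuven", "department": "Finance"}
"""

records = [json.loads(line) for line in raw.strip().splitlines()]

# A fixed-schema table needs one column set; semi-structured data
# forces you to take the union of all keys and pad the gaps.
columns = sorted({key for record in records for key in record})
table = [[record.get(col) for col in columns] for record in records]

print(columns)  # ['city', 'department', 'id', 'name']
for row in table:
    print(row)  # record 2 becomes [None, None, 2, 'Jos']
```

The padding with `None` is exactly the "structure exists, but is not generic" problem: the schema only emerges after you have seen every record.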

Why traditional SQL structures fall short

Evert Augustyns: “These JSON files don’t work with fixed columns anymore. They contain fields that can appear at different levels and in a different order per record. In the first record you might see the first name ‘Arno’ and then the last name ‘Muylaert’. In the second record, it might be reversed: first the last name ‘Peeters’, then the first name ‘Jos’.

"If you want to add information about children to HR records, that field will be empty for many young employees, but for others the file will open a second level with data about their children. Because that level doesn’t appear for everyone, it becomes difficult to fit everything into a single table.

"That's why solid data management is crucial: you need an architecture that can manage different data types without them being lost or incorrectly mapped. Fields like ‘first name’ or ‘last name’ may appear not only at the employee level but also beneath it, at the level of the children. If you try to force that data into a traditional SQL table, it breaks, because you can’t have ‘last name’ twice in the same table.

"Data Lake technology was created specifically to handle these kinds of transformations and to work flexibly with semi-structured data.”
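To make the nesting problem concrete, here is a small sketch with invented HR records: `last_name` appears both at the employee level and one level deeper under `children`, so the usual relational workaround is to split the nesting into two linked tables rather than force everything into one.

```python
import json

# Hypothetical HR records mirroring the example above: "first_name"
# and "last_name" appear at the employee level and, for some
# employees, again one level deeper under "children".
raw = '''
[
  {"first_name": "Jos", "last_name": "Peeters",
   "children": [{"first_name": "Lotte", "last_name": "Peeters"}]},
  {"first_name": "Arno", "last_name": "Muylaert"}
]
'''

employees = json.loads(raw)

# One flat SQL table cannot hold "last_name" twice, so the nesting
# is split into two relational tables linked by an employee key.
employee_rows, child_rows = [], []
for emp_id, emp in enumerate(employees, start=1):
    employee_rows.append((emp_id, emp["first_name"], emp["last_name"]))
    for child in emp.get("children", []):  # level absent for many records
        child_rows.append((emp_id, child["first_name"], child["last_name"]))

print(employee_rows)  # [(1, 'Jos', 'Peeters'), (2, 'Arno', 'Muylaert')]
print(child_rows)     # [(1, 'Lotte', 'Peeters')]
```

A data lake sidesteps this modeling step at ingestion time by storing the raw JSON as-is; the flattening happens later, per use case.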

Siloed data prevents you from seeing the full picture, leads to multiple versions of the truth and forces you to spend extra time combining data

Maarten Lauwaert, Expert Practice Leader Data & Analytics, TriFinance

Lakehouse Explained for Non-Experts

Many companies today still use a traditional Data Warehouse. “That is not a problem in itself,” says Maarten Lauwaert. “But conceptually, a Lakehouse is built differently, even though both start from the same idea: bringing data from different systems together in one place. This creates a uniform data source for anyone who wants to work with data that was previously siloed.”

By 'siloed data', he means information stored in separate systems or departments that do not communicate effectively. “This prevents you from seeing the full picture, leads to multiple versions of the truth and forces you to spend extra time combining data,” he explains.

The traditional Data Warehouse attempted to break down these silos but reached its limits. “With the rise of Big Data, semi-structured and unstructured data became part of the picture. You could no longer store that properly in a traditional Data Warehouse,” says Lauwaert. The Data Lake offered a solution by storing raw data regardless of structure. But it had one major drawback: “Initially, it wasn’t very performant. You couldn’t run efficient queries or analyses.”

This is why the Lakehouse emerged. It combines the flexibility of a Data Lake with the performance of a Data Warehouse. That performance comes not just from the combination itself but from placing a metadata and transaction layer on top of the Data Lake—such as Delta Lake or Apache Iceberg. 
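As a rough mental model (a toy sketch, not the actual Delta Lake or Iceberg format), that transaction layer can be pictured as an append-only log over immutable data files; replaying the log reconstructs the current table snapshot, which is what makes warehouse-style queries and ACID guarantees possible on top of a lake.

```python
import json
import os
import tempfile

# Toy model of a table format's transaction layer: data files are
# immutable, and a small append-only JSON log records which files
# make up the current table version. All names are illustrative.
lake = tempfile.mkdtemp()
log_path = os.path.join(lake, "_txn_log.jsonl")

def commit(action, filename):
    """Append one transaction (add/remove a data file) to the log."""
    with open(log_path, "a") as log:
        log.write(json.dumps({"action": action, "file": filename}) + "\n")

def current_files():
    """Replay the log to reconstruct the current table snapshot."""
    files = set()
    with open(log_path) as log:
        for line in log:
            entry = json.loads(line)
            if entry["action"] == "add":
                files.add(entry["file"])
            else:
                files.discard(entry["file"])
    return sorted(files)

commit("add", "part-001.parquet")
commit("add", "part-002.parquet")
commit("remove", "part-001.parquet")  # e.g. after compaction or a delete

print(current_files())  # ['part-002.parquet']
```

Because the log is the single source of truth for what the table contains, readers and writers can agree on a consistent version without locking the underlying files, which is the core trick behind lakehouse performance.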

A well-thought-out data strategy can help determine which data layers are relevant to whom and how predictive analytics can be optimally deployed in finance.

Medallion is not a methodology but a design principle to structure and build quality data within a data Lakehouse architecture

Maarten Lauwaert, Expert Practice Leader Data & Analytics, TriFinance

How the Medallion architecture strengthens Lakehouse models

Architecturally, the approach is often supported by the Medallion model, with the well-known Bronze, Silver and Gold layers. “It’s not a separate methodology,” says Maarten Lauwaert. “It’s a design principle for how you structure and build quality data within a data Lakehouse architecture.”

Examples of Lakehouse technology include Microsoft Fabric and Databricks. “A Data Lake is essentially a catch-all,” says Lauwaert. “In a Lakehouse, an engine (notebooks or Spark engines) sits on top to ensure the data can be used efficiently.”

Evert Augustyns clarifies how the Medallion layers work. “In principle, you have three rooms side by side. The first room is Bronze, filled with raw data, for example from SAP or RNO. In the Silver layer, you add more structure within the same source. And in the Gold layer, you bring together aggregated and cleaned data.”

This last level is crucial for usability. “There you decide, for example, which ten of the thirty cost-center fields you still need for reporting. For self-service reporting that’s convenient, because the layer is clean. But for people doing deeper analysis, the Silver layer is the place to be, because the data there is already cleaned but not yet filtered down.”
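The three ‘rooms’ can be sketched as a tiny in-memory pipeline; the field names and values below are invented for illustration.

```python
# Minimal sketch of the Medallion idea with in-memory data.

# Bronze: raw exports as they arrive, quirks included.
bronze = [
    {"cost_center": "CC-10", "amount": "120.50", "currency": "EUR"},
    {"cost_center": "CC-10", "amount": "79.50", "currency": "EUR"},
    {"cost_center": "CC-20", "amount": "bad", "currency": "EUR"},
]

# Silver: cleaned and typed within the same source; invalid rows dropped.
silver = []
for row in bronze:
    try:
        silver.append({**row, "amount": float(row["amount"])})
    except ValueError:
        continue  # a real pipeline would quarantine or log this row

# Gold: aggregated, report-ready view with only the fields reporting needs.
gold = {}
for row in silver:
    gold[row["cost_center"]] = gold.get(row["cost_center"], 0.0) + row["amount"]

print(gold)  # {'CC-10': 200.0}
```

The point of the sketch is the division of labor: Bronze preserves everything, Silver keeps detail but enforces quality, and Gold answers one reporting question well.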

This approach supports a business strategy for data analysis and can even serve as the basis for developing a broader data strategy.

A different user segmentation

Evert Augustyns emphasizes that the different Medallion layers are not just technical constructs; they also determine which type of user engages with them. “Depending on the level of analysis you want to perform and your own skillset, you’ll work with either the Silver or the Gold layer,” he explains. “In fact, that wasn’t so different in the past. The concept was very similar, except the Silver layer wasn’t explicitly named.”

"In classic data warehouse architectures, there was often a staging layer where raw data was imported. Today we call this the Bronze layer. It is essentially the same step with a different name. And where we now have the Gold layer, people used to talk about dimension and fact tables that were prepared for the cube or for visualizations. But the part in between, the Silver layer, is conceptually the most interesting."

According to Augustyns, the Silver layer is not exclusively linked to a Lakehouse. “You could also build that perfectly well in SQL, in a classic Data Warehouse,” he says. "For me, Lakehouse architecture is equivalent to Medallion architecture. Whether you apply it to a data lake or a data warehouse doesn't really matter, although you see that everyone who starts with Lakehouse today automatically follows that approach."

The Medallion architecture not only changes the way data is structured, but also how end users work with it. “By placing that layer in between, you give end users an extra dataset that they can experiment with themselves,” he says. This also creates new user segments. Those who want to work with the Silver layer need more technical knowledge. In the past, only the cleaned-up data from the end layer existed.

This change has an impact on how teams work together. In the past, IT had to step in if someone needed a detail, an extra variable, or an additional field. “That then had to be added to the staging layer,” says Augustyns. “If that data was even there in the first place. Today, organizations are evolving towards a model where users can work much more autonomously.”

Greater autonomy with Lakehouse

With the arrival of the Lakehouse, that dynamic changes. “Now we’re moving toward a situation where people who are skilled with data can work independently,” he explains. “They start with validated data in the Gold layer but may notice that an attribute they need isn’t there, even though it exists in Silver. They can connect directly to the Silver layer and experiment themselves. That simply wasn’t possible before.”

The Silver layer plays a crucial role for those experimenting with data. “It’s a layer of cleaned but not yet fully constrained data,” he says. “For machine learning or other predictive models, you use this layer far more than Gold. Gold already contains a lot of structure and filtering. For predictive analytics, you want to cast a wide net to see what might be relevant.”

It’s important that this doesn’t lead to new data silos. “We want to avoid a situation where predictive teams work on one dataset while Gold reporting runs on another,” Augustyns stresses. “It must remain a single chain within the same technology. Everything starts in the big ‘junk drawer’ of the Data Lake and then moves into the layers that fit the use case: Silver for experimentation, Gold for reporting.”

He sees this approach as a major opportunity for organizations. “You can even create a ‘sandbox layer’ for strong analysts on top of Silver,” he suggests. “There they can link data, add extra Excel files, and experiment—but still within the central environment. It no longer lives on individual laptops. It remains part of the whole, visible, monitorable, and manageable.”

That requires thoughtful choices. “If a company carefully considers how to design its Lakehouse and architecture, and brings its people along in the process, it can extract far more value from its data than it does today,” says Augustyns.

Assessing maturity across domains

But technology alone is not enough. “What is the maturity level of an organization with a Lakehouse? That depends entirely on how it is used,” he says. “Installing technology alone does not automatically raise your maturity level. You must use it, and your organization must be able to support it.”

He refers to well-known maturity models. “From level two onward, silos gradually disappear and a central analytics team typically emerges,” Augustyns explains. “Such a coordinating entity ensures consistency and quality across that large pool of data.”

A maturity assessment, he says, relies on multiple domains: analytical techniques, people, culture and governance. “You may already have a central analytical department but still score low organizationally because your people haven’t adopted the approach yet,” he concludes. “Maturity is therefore never purely technological. It is a combination of structure, skills and how your organization works with data.”