INDUSTRY: Financial Services

LOCATION: Amsterdam, Netherlands



  • Save approximately three years in data/application migration time
  • Enable nearly $12 million per year in IT capex/opex savings
  • Provide transparency into data lineage upstream and downstream
  • Deliver one-to-one transpilation from Teradata SQL to Azure SQL 
  • Help to simplify data architecture by revealing unused data and other resources
  • Streamline regulatory compliance by making it easier to show how modelling results were achieved
  • Save time, reduce cost and hedge risk in large-scale data pipeline migrations
  • Migrate Informatica PowerCenter workflows to Azure Data Factory pipelines



“Without CompilerWorks software, we would not have been able to migrate our critical risk models to our new platform in the targeted timeframe. For us, CompilerWorks provides insurance against unforeseen delays.”

— Marcel Kramer, Director of Data Engineering

Company Overview

ABN AMRO Bank N.V. is the third largest bank in the Netherlands with more than $465.8 billion in assets. The bank can trace its roots back through a series of mergers to the 1700s. In 1991, the current bank configuration was achieved when the two largest banks in the Netherlands—Amsterdam-Rotterdam Bank (AMRO) and Algemene Bank Nederland (ABN)—joined forces.

Headquartered in Amsterdam, the bank has 19,000 employees and maintains branches in 10 other countries. ABN AMRO is focused on the Dutch and northwest European markets. More than 80% of the bank’s operating income is generated in the Netherlands, where more than a quarter of the population has an account with the bank.

The bank has four main businesses: Retail, Commercial, Private, and Corporate and Institutional Banking. These businesses are supported by group functions such as Innovation and Technology, Finance, Risk Management, HR and Transformation, Group Audit, Strategy and Sustainability, Legal, Corporate Office and Brand, Marketing and Communications.


According to ABN AMRO’s most recent annual report, the bank will continue to push ahead with IT transformation, with the goal of increasing the proportion of teams working on the public cloud to at least 55%, up from 30% in 2020. From a data point of view, an important part of this IT transformation began in 2019 with the transition of its 90 TB on-premises, appliance-based Teradata enterprise data warehouse (EDW) to a platform as a service (PaaS) architecture based on Microsoft’s Azure cloud services.

ABN AMRO’s IT leadership imposed a firm target of two years for the initial migration of applications and services that depended on the legacy EDW. This target was set in part to push adoption of a more modern data architecture, but also to optimize cost and hedge against the 2021 expiration of support for the version of Teradata running the EDW. To meet this target, 62 user groups were asked to re-engineer their EDW workloads, many of which had been developed over 10 to 15 years, from scratch on Azure. After an initial assessment, about 80% of the user groups reported they believed this was feasible; a critical 20%, however, advised leadership that they would be unable to meet the two-year timeline due to regulatory requirements such as Basel III/IV, ECB, DNB, AFM, and GDPR.

According to Marcel Kramer, ABN AMRO’s director of data engineering, “A limited but critical number of users told us that they were running business critical applications using data sets that had been developed for the past 10 to 15 years on our Teradata EDW and they were not going to be able to re-create those data sets in the time allotted.”

One user group in particular was responsible for maintaining 24 financial and non-financial SAS-based risk models. “These models are highly regulated and would have to be re-validated by regulators if changes were made to the data or logic,” Kramer notes. “We estimated each model would take a couple of people eight months to re-engineer. With 24 models, that amounts to 32 man-years’ worth of work. If we parallelized the work, we estimated that it was going to take at least four years to rebuild those models manually and have them revalidated by regulatory bodies. It was going to delay our effort to decommission the on-premise EDW.”

As the team deliberated how to address this, they started to hear additional concerns from a subset of the user groups that had embarked on re-engineering their workloads from scratch in hopes of meeting the target timeline. “A lot of things were not documented. A lot of the business logic we had built into our EDW was never documented by previous employees so it was difficult to explain why the logic was written the way it was. It was kind of a black box we were looking at.” With this discovery, it was starting to become clear to the team that a purely manual re-engineering effort was not going to be adequate in all cases.

— ABN AMRO data architect

The Results

According to Kramer, the combination of the CompilerWorks Lineage and Transpiler solutions provides a form of insurance. “Now that we are able to offer CompilerWorks to the critical 20% of highly regulated user groups that cannot take the re-engineering approach, we know we are able to decommission our EDW within the timelines set. We are able to show them that their models are going to be computed using the exact same logic on Azure as on Teradata and Informatica, thus giving them the exact same output.” Ultimately, the team was able to condense what was originally estimated to be four years’ worth of work into one year and successfully migrate these critical user groups to Azure within the target timeframe. “Because we can do a full comparison on screen of the Teradata datasets and the Azure datasets, we will be able to decommission the legacy EDW and reduce our TCO related to the EDW by more than 10 million euros ($12 million) per year.”
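The “exact same logic” guarantee rests on the idea that each source-dialect construct maps one-to-one to a target-dialect construct. As a rough intuition only (CompilerWorks parses the full grammar of each dialect; this sketch uses two illustrative rewrite rules and hypothetical table names), a Teradata-to-T-SQL mapping might look like:

```python
import re

# Toy token-level rewrites from Teradata SQL to T-SQL (Azure SQL).
# A real transpiler works from a full parse tree and emits semantically
# equivalent SQL; these two rules only convey the one-to-one idea.
REWRITES = [
    # Teradata allows "SEL" as an abbreviation for SELECT.
    (re.compile(r"\bSEL\b", re.IGNORECASE), "SELECT"),
    # Teradata's ADD_MONTHS(d, n) corresponds to T-SQL DATEADD(month, n, d).
    (re.compile(r"\bADD_MONTHS\s*\(\s*([^,]+?)\s*,\s*([^)]+?)\s*\)", re.IGNORECASE),
     r"DATEADD(month, \2, \1)"),
]

def transpile(sql: str) -> str:
    """Apply each rewrite rule in turn to a Teradata SQL fragment."""
    for pattern, replacement in REWRITES:
        sql = pattern.sub(replacement, sql)
    return sql

print(transpile("SEL cust_id, ADD_MONTHS(open_date, 6) FROM accounts"))
# → SELECT cust_id, DATEADD(month, 6, open_date) FROM accounts
```

Because every rewrite is deterministic, the output on Azure can be compared field by field against the Teradata original, which is what made the on-screen dataset comparison described above possible.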

In addition to being a crucial lifeline for user groups that cannot take the re-engineering approach, CompilerWorks brought clarity to, and accelerated, the re-engineering efforts of the groups that could. In some cases, organizational memory for key data processing pipelines had been lost, whether through poor documentation or the departure of the employees who had originally built them. CompilerWorks Lineage automatically extracted the information needed to build a deep understanding of the Teradata and Informatica logic and presented it in such a way that these pipelines could be successfully re-engineered on Azure. “Even if we chose not to automatically transpile a pipeline but instead to manually recreate it, we could use Lineage to tell us we need to look at tables X, Y and Z, that from table X we need to pick a certain field, from table Y another field, and that based on certain logic we need to join to a field in table Z. Lineage was offering a depth of information and understanding that we wouldn’t have had otherwise,” Kramer notes. In short, Lineage opened the “black box,” cast light on the dark corners of the code base that needed to be addressed, and enabled accurate, accelerated re-engineering of the Teradata and Informatica code on Azure.
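The table-level lineage Kramer describes can be pictured with a toy extractor. This is only a sketch with hypothetical table and column names; a production tool such as CompilerWorks Lineage resolves views, subqueries, stored procedures and full dialect grammars rather than pattern-matching text:

```python
import re

def upstream_tables(sql: str) -> set:
    """Collect table names appearing after FROM or JOIN in one SQL statement.

    Toy illustration only: real lineage is derived from a full parse,
    not from regular expressions.
    """
    return set(re.findall(r"\b(?:FROM|JOIN)\s+(\w+)", sql, re.IGNORECASE))

# Hypothetical pipeline step; names are illustrative, not ABN AMRO's.
step = """
INSERT INTO risk_model_input
SELECT x.customer_id, y.exposure, z.rating
FROM table_x AS x
JOIN table_y AS y ON x.customer_id = y.customer_id
JOIN table_z AS z ON y.rating_key = z.rating_key
"""
print(sorted(upstream_tables(step)))
# → ['table_x', 'table_y', 'table_z']
```

Applying this kind of extraction across every script and workflow, and chaining the results, is what lets a lineage tool answer both the upstream question (“which tables feed this model?”) and the downstream one (“what breaks if we drop this table?”).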

Finally, beyond the financial benefits of shortening the migration timeline and retiring the EDW, CompilerWorks enabled the organization to adopt a modern data architecture years earlier than would otherwise have been possible. “Of course there’s a cost driver,” Kramer notes. “But even more importantly, CompilerWorks is enabling some of our user groups to move to the more scalable and robust environment of Azure faster. As soon as our users adopt Azure’s capabilities, many more possibilities emerge, such as the ability to use unstructured data and streaming data instead of just batches and a purely structured format. Being able to put a firm end date on the old platform enabled us to push adoption of the new capabilities offered by Azure and our modern architecture.”

ABN AMRO data architect—showing a multifaceted data mesh architecture

Future View

ABN AMRO believes that modern data architectures will be much more distributed than their centralized EDW ancestors, especially when run in the cloud. To support this decentralized model, the bank is building a self-service Data Marketplace on Azure that offers its user groups access to data from across the enterprise. Information such as data quality, ownership, and lineage over time will be built into the marketplace, allowing data consumers to demonstrate internal and external compliance at any given point in time.

“We’re considering CompilerWorks Lineage for the ongoing lineage capability we would like to offer on our Data Marketplace. It could help us reduce the complexity of our data architecture and save cost by decommissioning datasets that are no longer used. But also, it allows us to explain in detail to regulators how each figure across our enterprise is calculated,” Kramer says.

In addition to Lineage, Kramer also sees a place for Transpiler in the Data Marketplace to aid the assessment and execution of future migrations. “CompilerWorks Transpiler can be executed via an API or web app frontend that allows users to submit their code for transpilation into any given database dialect. It would allow our users to compare how code is translated to different dialects and perform a risk assessment before migrating to different platforms.”


Automatically analyze, convert, and optimize enterprise data processing code

Learn more about how our two core applications can change how you migrate and maintain enterprise-wide data processing.