Transforming Financial Data Management
A leading insurance provider partnered with Agile Lab to implement a comprehensive FinOps strategy, achieving substantial cost savings while maintaining innovation capabilities during their digital transformation journey.
Customer Context
Financial services companies regularly need to update their records, especially in a multinational context with dozens of subsidiaries across multiple countries and cities. One of our customers sought a scalable and reliable data platform capable of handling complex data ingestion and processing workflows while ensuring high data quality.
Additionally, the company required a solution that could support diverse data consumption needs across various teams, enabling seamless collaboration and consistent access to trusted financial information.
The Challenge
Several key obstacles needed to be addressed:
- Data Ingestion from On-Premises Sources: The need for quick data availability led to an initial workaround using Microsoft PowerBI flows to export data in Common Data Model (CDM) format. This data was then ingested into Snowflake through Azure Data Factory (ADF) pipelines. While this approach provided a temporary solution, a more sustainable and efficient method was required for long-term success.
- Orchestration Complexity: Coordinating multiple stages — data export, ingestion, and processing — posed a challenge, requiring a well-structured workflow to ensure smooth data operations.
- Monitoring Data Freshness and Quality: Reliable financial reporting and decision-making depended on accurate and timely data. Establishing robust monitoring mechanisms was crucial to maintaining data integrity and freshness.
- Diverse Data Sharing Requirements: Different teams within the organization used varied architectures and methods for consuming financial data. This required a standardized yet flexible approach to data sharing that could cater to different business needs.
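At its core, the freshness monitoring described above reduces to comparing a table's latest load time against an agreed age threshold. A minimal sketch of that logic, independent of Snowflake or Airflow (the function name and SLA values are illustrative, not from the project):

```python
from datetime import datetime, timedelta, timezone

def is_fresh(last_loaded_at: datetime, max_age: timedelta) -> bool:
    """Return True if the most recent load falls within the allowed age window."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_age

# Example: a table last loaded 26 hours ago breaches a 24-hour SLA.
breached = not is_fresh(
    datetime.now(timezone.utc) - timedelta(hours=26),
    max_age=timedelta(hours=24),
)
```

In practice, `last_loaded_at` would come from a load-audit column or metadata table in Snowflake, with a scheduled task raising an alert whenever the check fails.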
The Solution
To tackle these challenges, the project introduced a comprehensive cloud-native data platform built on Snowflake, DBT, Airflow, Terraform, ADF Pipelines, and PowerBI. The implementation strategy focused on four key initiatives:
1. Central Data Hub
2. Data Transformation
3. Enhancing Engineering Practices
4. Data Sharing Methods
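The orchestration behind these initiatives comes down to enforcing a strict stage order: export, then ingestion, then transformation. A plain-Python sketch of that dependency chain (the stage bodies are placeholders, not the project's actual tasks; in the platform itself, Airflow expresses these stages as DAG task dependencies):

```python
def export_from_source() -> str:
    # Stage 1: export data from the on-premises source (placeholder body).
    return "exported"

def ingest_to_snowflake(payload: str) -> str:
    # Stage 2: land the exported data in Snowflake via an ADF-style pipeline (placeholder body).
    return f"ingested({payload})"

def transform_with_dbt(state: str) -> str:
    # Stage 3: run DBT models over the landed data (placeholder body).
    return f"transformed({state})"

def run_pipeline() -> str:
    # Each stage starts only after its upstream stage succeeds,
    # mirroring upstream/downstream task dependencies in an Airflow DAG.
    return transform_with_dbt(ingest_to_snowflake(export_from_source()))
```

Keeping each stage as a separate, single-purpose task is what lets the scheduler retry or backfill one stage without rerunning the whole pipeline.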
By defining clear data-sharing frameworks, the platform facilitated seamless collaboration and efficient data exchange among different teams and stakeholders.
Powering Digital Transformation through Data Platform Enablement
- Data-driven organizations are three times more likely to report significant improvements in decision-making speed, helping them respond faster to market changes. (Source: Harvard Business School)
- Data platforms can allow companies to realize cost savings of up to 15% through minimized redundancies, optimized resource utilization, and streamlined processes. (Source: McKinsey & Company)
- Companies focusing on structured data management can improve data accuracy and consistency by 10-20% through centralized data platforms. (Source: McKinsey & Company)
Real-World Impact and Benefits
The implementation of this cloud-native platform yielded several key benefits:

| Operational Area | Before Implementation | After Implementation |
|---|---|---|
| Data Engineering Practices | Manual data exports (PowerBI flows) and complex orchestration, with no CI/CD or version control. | Git-based version control, CI/CD pipelines, and improved orchestration for streamlined, reliable workflows. |
| Data Sharing & Collaboration | Diverse teams used varied architectures and methods for consuming data, leading to fragmented access. | Standardized, structured framework for data exchange, enabling multiple teams to access and use financial data effectively and fostering better collaboration. |
| Infrastructure & Scalability | Legacy on-premises systems with temporary cloud workarounds (ADF pipelines for Snowflake ingestion). | Progressive transition to a modern, scalable cloud-native data platform (Snowflake, DBT, Airflow, Terraform), ensuring a future-ready, agile, and cost-effective ecosystem. |
Conclusions
Adopting a cloud-native data platform has revolutionized financial data management across the organization. Centralizing data in Snowflake provides a scalable, high-performance foundation that seamlessly connects on-premises and cloud sources, eliminating bottlenecks and enabling greater automation.
Integrating DBT ensures consistent, governed, and transparent data workflows, improving data quality and reliability for accurate reporting and informed decisions. Airflow and CI/CD pipelines automate and streamline operations, reducing manual workload and risk of errors.
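Much of this governance can be expressed declaratively in DBT itself. As one illustration, a `sources.yml` fragment (source, schema, and column names below are assumptions for the sketch, not the customer's) lets `dbt source freshness` flag stale Snowflake loads against warn and error thresholds:

```yaml
version: 2

sources:
  - name: finance_raw            # illustrative source name
    database: RAW
    schema: FINANCE
    loaded_at_field: _loaded_at  # audit column stamped at load time
    freshness:
      warn_after: {count: 12, period: hour}
      error_after: {count: 24, period: hour}
    tables:
      - name: policies           # illustrative table name
```

Running this check on a schedule in Airflow turns freshness from an ad hoc concern into a routinely enforced contract.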
Standardized access allows teams to collaborate easily and ensures that financial data is available to all relevant stakeholders. Moving from legacy on-premises systems to a cloud-first approach delivers long-term scalability and cost efficiency.
By combining Snowflake, automated pipelines, and modern engineering practices, the organization’s data infrastructure is now ready to support innovation, drive smarter decisions, and adapt quickly to the future of financial services.