Head of Data Preparation & Automated Testing, Retail Digital Transformation
Real Time Banking
About four years ago, Intesa Sanpaolo, Italy’s largest bank, began a process of digital transformation, starting with the definition of the bank’s new digital architecture. One of the most ambitious goals has been the transition toward Real Time Banking and Real Time Analytics. This means speeding up data transfer between the bank’s back-end and channel applications to better serve customers; exploring the new business opportunities that arise from real-time or near-real-time gathering and processing of data to understand and fulfil customer expectations, desires and needs; strengthening fraud detection capabilities; and, last but not least, improving back-end efficiency by eliminating resource-intensive batch processing through a switch to event-based architectures and data flows.
Intesa Sanpaolo S.p.A. is an Italian international banking group. It is Italy’s largest bank by total assets and the world’s 27th largest. It was formed through the merger of Banca Intesa and Sanpaolo IMI in 2007, but has a corporate identity stretching back to its first foundation as Istituto Bancario San Paolo di Torino in 1583.
In 2020 the bank served approximately 14.6 million customers in Italy and 7.2 million customers in Eastern and Central Europe, the Middle East and North Africa through several brands. The company is a component of the Euro Stoxx 50 stock market index.
One of the first steps in the bank’s digital transformation and migration towards Real Time Banking was the creation of an event bus, built on Apache Kafka, to act as the central hub and fulcrum of the entire new architecture. The new architecture also includes Change Data Capture solutions that capture in real time all changes in the source databases and publish them on Kafka as events, as well as applications to consume, publish and process all of this data. A new framework was created to convert existing applications to the new microservices pattern, introducing the necessary Kafka connectors together with an asynchronous, near-real-time communication system and new stream processing tools.

To achieve these goals, numerous open source technologies were adopted. Alongside the aforementioned Apache Kafka, the bank chose Apache NiFi, a codeless tool to quickly implement stream processing pipelines, and Apache Storm to code the business logic too complex to be expressed in Apache NiFi. While the open source technologies brought innovation and positive change, they also came with a series of challenges, among which were very rapid evolution and, in some instances, sudden deprecation and loss of support. This was the case with Apache Storm, which was deprecated and de-supported when the bank had already adopted it for a number of applications running in its production environment. Moreover, as the projects and the scope of the process kept growing, the bank realized that some of these technologies did not offer all the capabilities required of modern, cloud-ready systems, namely scalability, isolation and resilience.
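To make the shift from batch to event-based data flows concrete, the publish/consume contract described above can be sketched as follows. This is an illustrative sketch only: in production the bank publishes Change Data Capture events to Apache Kafka topics, while here a simple in-memory queue stands in for the event bus, and all names (`EventBus`, `capture_change`, the topic naming) are hypothetical.

```python
from dataclasses import dataclass, field
from queue import Queue

@dataclass
class EventBus:
    """In-memory stand-in for the Kafka event bus (one queue per topic)."""
    topics: dict = field(default_factory=dict)

    def publish(self, topic: str, event: dict) -> None:
        self.topics.setdefault(topic, Queue()).put(event)

    def consume(self, topic: str) -> dict:
        return self.topics[topic].get()

def capture_change(bus: EventBus, table: str, op: str, row: dict) -> None:
    # A CDC source emits one event per row change as it happens,
    # instead of accumulating changes for a nightly batch extract.
    bus.publish(f"cdc.{table}", {"op": op, "row": row})

bus = EventBus()
capture_change(bus, "accounts", "UPDATE", {"id": 42, "balance": 1050.0})
event = bus.consume("cdc.accounts")  # downstream consumer reacts immediately
```

The key design point is that producers and consumers are decoupled by the bus: back-end systems emit changes as they occur, and any number of downstream applications (analytics, fraud detection, channel services) can react in near real time without polling the source databases.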
Last, but not least, the bank was facing a cultural challenge related to the adoption of the new data architecture. For decades, banking IT systems have been organised in separate silos and the democratization of data was a new concept and value that had to be spread throughout all the functional units of the bank to really start the engine of transformation.
Intesa Sanpaolo chose Agile Lab as a partner to create a “mixed” development team, comprising technical experts from Agile Lab and internal personnel from the bank, to develop a new framework based on Apache Flink. The benefits of this approach proved to be twofold: on the one hand, the inclusion of internal personnel in the development team facilitated the involvement of stakeholders from the bank’s central IT department, the application factories and the various departments that will use the framework as its roadmap unfolds; on the other hand, the bank’s personnel received technical training on the job, familiarised themselves with Flink, and acquired specific competencies and skills while directly contributing to the framework’s development with their knowledge of the bank’s business processes and requirements. Agile methodologies were used with the aim of making knowledge of this framework “open source” within the bank and with external stakeholders such as application developers.
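The kind of stateful stream processing the Flink-based framework enables can be illustrated with a minimal sketch. The real framework uses the Flink DataStream API (Java/Scala, with Kafka connectors); the plain-Python `Stream` class below merely mimics the familiar filter-then-keyed-aggregation pattern, and all names are hypothetical.

```python
from collections import defaultdict
from typing import Callable, Iterable

class Stream:
    """Toy stand-in for a streaming pipeline over a sequence of events."""

    def __init__(self, events: Iterable[dict]):
        self.events = iter(events)

    def filter(self, pred: Callable[[dict], bool]) -> "Stream":
        # Drop events that fail the predicate, lazily.
        return Stream(e for e in self.events if pred(e))

    def key_by_sum(self, key: str, value: str) -> dict:
        # Stateful per-key aggregation, akin to keyBy(...).sum(...) in Flink.
        totals: dict = defaultdict(float)
        for e in self.events:
            totals[e[key]] += e[value]
        return dict(totals)

transactions = [
    {"account": "A", "amount": 100.0},
    {"account": "B", "amount": 40.0},
    {"account": "A", "amount": -30.0},
]
totals = (
    Stream(transactions)
    .filter(lambda e: e["amount"] != 0)
    .key_by_sum("account", "amount")
)
# totals now holds a running balance per account key.
```

In the real framework such pipelines run continuously over unbounded Kafka topics, with Flink providing the scalability, isolation and fault tolerance that the earlier tools lacked.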
Development of the framework is nearing completion, and it has already been deployed in a series of pilot projects. In the near future, the goal is to complete the migration to this new Flink-based framework developed in conjunction with Agile Lab and to implement a large number of new streaming analytics projects.