
Exploring the powerful business impact of Data Mesh


We were recently joined by our Data Mesh specialists as part of the esynergy Tech Series, to gain a glimpse into the truly transformational results the team has delivered. Representing this outstanding team at the event was Sunny Jaisinghani, Data Mesh Platform Owner, and Simon Massey, Data Mesh Lead Technologist.

The Data Mesh approach is a new way of designing and developing data architectures. In simple terms, it links together data stored across different devices, locations, and even organisations. This offers a range of game-changing advantages for companies by making data easier to find, more available, and interoperable with different applications. Once implemented, the Data Mesh facilitates easy, secure access to information stored in single repositories, like data lakes, and in optimised databases such as data warehouses.
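At the heart of this approach is the idea of a domain-owned "data product" that other teams can discover and consume. The sketch below is purely illustrative (none of these class or product names come from the engagement described here); it shows how a product with a declared owner, schema, and location might be registered in a shared catalogue so it is findable across domains:

```python
from dataclasses import dataclass


@dataclass
class DataProduct:
    """A discoverable unit of data owned by a single domain team."""
    name: str
    owner: str    # the domain team accountable for this product
    schema: dict  # column name -> type: the product's public contract
    location: str # where consumers read it from (lake, warehouse, ...)


class Catalog:
    """A central index that makes products findable across domains."""
    def __init__(self):
        self._products = {}

    def register(self, product: DataProduct):
        self._products[product.name] = product

    def find(self, name: str) -> DataProduct:
        return self._products[name]


# A domain team publishes its product; any other team can now look it up.
catalog = Catalog()
catalog.register(DataProduct(
    name="client-sentiment",
    owner="client-insights-team",
    schema={"client_id": "STRING", "sentiment_score": "FLOAT"},
    location="bq://analytics.client_sentiment",
))
print(catalog.find("client-sentiment").owner)  # client-insights-team
```

The key design point is that the schema travels with the product as an explicit contract, rather than living only in the consuming application.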

To bring this innovative methodology and its benefits to life, Sunny and Simon walked us through the work they did for a large financial organisation. This involved supporting the company to transition from a monolithic Data Lake architecture to a federated, self-service Data Mesh based on the Google Cloud Platform (GCP). The team worked together to overcome key challenges, conducted numerous experiments to identify the right approach, and deployed the solution to great effect.

Assessing the challenge

Sunny set the scene by explaining that the financial organisation we supported had ‘over 1,000 IT staff, more than 100 downstream systems, and over 30 core platforms in place.’ The primary problem the organisation faced was the extremely long lead times generated by reporting and analytical use cases.

Watch the full Data Mesh webinar here.

This situation became so arduous ‘that it almost became unfeasible to continue using the existing platforms,’ according to Sunny, which he said were made up of ‘a variety of vendor platforms, old AS/400 systems, and mainframes that just could not handle the reporting loads.’ Limited opportunities to scale and a complete lack of autonomy were also among the central challenges the organisation faced.

Dynamic experimentation

Experimentation was key to the success of the project, and Sunny said that this is because the team strives to ‘continuously innovate with data.’ To tackle the organisation’s existing challenges, we defined three key things we needed to promote: data innovation at enterprise scale, data infrastructure scalability, and autonomy through self-service. Presented with a short period of time in which to achieve these goals, the team established a dynamic, agile approach.

‘We ultimately decided on a time-boxed hackathon methodology,’ said Sunny, which involved four engineers with an acute focus on the task at hand. In two weeks, the team delivered seven high-value use cases, which ‘ranged from reporting, to ML, to data archival cases.’ Ultimately, three of these experiments were selected by the organisation’s senior management and implemented as solutions, transforming areas where the traditional infrastructure could no longer cope. Sunny went on to highlight the fact that ‘experimentation is what the Mesh is about; it provides the ability to experiment with data without having to commit upfront.’

Time to ‘Go Live’

The team focused on putting Client Sentiment Analysis and Transfer Agency Active Archive use cases into action, ensuring that they were high-impact, end-to-end solutions. Simon explained the approach taken by the specialist team, stating that ‘even though these solutions weren’t interlinked they ran on shared infrastructure, meaning that we had to implement multi-tenancy from the ground up.’ Another central focus for the team was implementing self-service tooling, which ‘would ultimately enable the organisation to handle data in a more nimble way,’ said Simon.
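Building multi-tenancy in ‘from the ground up’ typically means isolation is enforced by construction rather than by convention. One common pattern, sketched below with entirely hypothetical names, is to scope every shared resource to a tenant namespace so that one tenant's workloads simply cannot address another tenant's resources:

```python
class TenantContext:
    """Scopes resource names and access checks to a single tenant."""
    def __init__(self, tenant_id: str):
        self.tenant_id = tenant_id

    def resource_name(self, base: str) -> str:
        # Every dataset, bucket, and topic carries the tenant prefix.
        return f"{self.tenant_id}-{base}"

    def can_access(self, resource: str) -> bool:
        # A tenant may only touch resources in its own namespace.
        return resource.startswith(f"{self.tenant_id}-")


# Two unrelated solutions sharing the same platform, fully isolated.
sentiment = TenantContext("sentiment")
archive = TenantContext("archive")

dataset = sentiment.resource_name("raw-events")  # "sentiment-raw-events"
assert sentiment.can_access(dataset)
assert not archive.can_access(dataset)
```

In a real platform the same idea would be carried down into IAM policies and project boundaries rather than string prefixes, but the principle is identical: tenancy is part of every resource's identity.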

The team reached a trajectory of approximately 2,000 releases per month, which, as Simon told us, came from taking ‘a model-based approach to the data architecture.’ Using this methodology, the team had convenient access to ‘all of the bindings and templates they needed to generate the cloud-based infrastructure’ upon which the new solutions would operate.
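The essence of a model-based approach is that a small declarative model is expanded through templates into the concrete infrastructure definitions, so hundreds of releases become mechanical rather than handcrafted. A minimal sketch of the idea, with invented template and field names:

```python
from string import Template

# Illustrative template for a GCP-style dataset resource (syntax invented).
DATASET_TEMPLATE = Template(
    'resource "dataset" "$name" { project = "$project" location = "$region" }'
)


def generate_infra(model: dict) -> list[str]:
    """Expand one declarative product model into resource definitions."""
    return [
        DATASET_TEMPLATE.substitute(
            name=ds, project=model["project"], region=model["region"]
        )
        for ds in model["datasets"]
    ]


model = {
    "project": "analytics-prod",
    "region": "europe-west2",
    "datasets": ["raw_events", "curated_clients"],
}
for definition in generate_infra(model):
    print(definition)
```

Because every environment is generated from the same model, a change to the model can be rolled out across all tenants in one release, which is what makes release rates of this order achievable.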

To bake flexibility into the new capabilities, the team also worked to strengthen auditability. Simon said that ‘it is critical to implement these use cases in an auditable and compliant way, and we did this while enabling the organisation to incorporate customisations.’ This involved an internal open-source model, which gave some of the first tenant users the opportunity to contribute to the early toolkits. The approach proved to be highly successful, with the organisation even providing their own code for certain customisations while we handled the core.

The Data Mesh in action

Sunny offered an impactful summary by explaining that ‘by the end of the process, we had created more than 80 GCP projects that were owned by a multitude of teams. It was all model-driven, so the team was using product specification to drive all of the data structures and pipelines.’ Initially, the team intended to use only standard templates, but during industrialisation they realised a number of extensions needed to be made in a more agile, efficient way. With this enhanced, model-driven approach deployed, our team was able to help the organisation pivot to the cloud, engage in multi-tenancy, define clear contracts, and achieve full observability.
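Using a product specification to drive data structures and pipelines means the spec is the single source of truth: both the table definition and the validation step are derived from it rather than written by hand. A hypothetical sketch (the product and field names are invented for illustration):

```python
# The specification is declarative data; everything else derives from it.
SPEC = {
    "product": "client_sentiment",
    "fields": {"client_id": str, "sentiment_score": float},
}


def derive_ddl(spec: dict) -> str:
    """Generate a table definition directly from the specification."""
    cols = ", ".join(f"{n} {t.__name__}" for n, t in spec["fields"].items())
    return f"CREATE TABLE {spec['product']} ({cols})"


def validate_row(spec: dict, row: dict) -> bool:
    """Enforce the same specification at pipeline ingestion time."""
    fields = spec["fields"]
    return set(row) == set(fields) and all(
        isinstance(value, fields[key]) for key, value in row.items()
    )


print(derive_ddl(SPEC))
assert validate_row(SPEC, {"client_id": "c1", "sentiment_score": 0.7})
assert not validate_row(SPEC, {"client_id": "c1"})  # missing field rejected
```

Because the schema and the pipeline checks come from one specification, they cannot drift apart, which is what makes the contracts between producing and consuming teams enforceable.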
