Gaining transparent and actionable insights through intelligent data fabrics

Ageing data lakes, popular in previous years for their promise of providing an overview of useful business data, haven’t lived up to expectations. Instead, they have struggled to deliver the visibility needed for transparent business decision making. By Joe Lichtenberg, Product and Industry Marketing, InterSystems.


Now, as technology continues to advance and the need for real-time access to data becomes increasingly important, particularly in light of ongoing macro-economic trends, new data management solutions are required that build on existing architectures and clear the murky data these lakes have created. Among other advantages, intelligent data fabrics now allow businesses to retrieve key information from their data and turn it into actionable insights. With the ability to access data in real time, businesses can move forward confidently during a crisis.

Bringing together real-time and historical data

It has become increasingly difficult to integrate, transform, normalise, and harmonise the many different data points in data lakes so that organisations can gain a consistent, comprehensive overview and use the data effectively. The growing availability of real-time data, and the subsequent requirement to harmonise it alongside batch data, has added further complexity. Additional complications arise when businesses need to use real-time and historical data together to make decisions in the moment. Ultimately, data lakes have proved ill-suited to these requirements, and many organisations are now looking for a way to combine real-time and batch data so they can gain actionable insights.
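As a simplified illustration of this harmonisation problem, the sketch below merges a hypothetical batch extract with a hypothetical real-time feed into a single, current view, keeping the latest record per account. It is a minimal Python example under assumed column names and sources, not a representation of any particular product.

```python
# Minimal sketch: harmonising hypothetical batch and real-time data into one
# consistent view. Column names and values are illustrative assumptions.
import pandas as pd

# Historical batch data, e.g. a nightly extract landed in a data lake.
batch = pd.DataFrame({
    "account_id": [101, 102],
    "balance": [2500.0, 910.0],
    "as_of": pd.to_datetime(["2023-01-01", "2023-01-01"]),
})

# Real-time events, e.g. consumed from a message queue moments ago.
events = pd.DataFrame({
    "account_id": [101, 103],
    "balance": [2650.0, 40.0],
    "as_of": pd.to_datetime(["2023-01-02 09:15", "2023-01-02 09:16"]),
})

# Normalise to one schema and keep the most recent record per account, so
# decisions are made on current values rather than stale batch data.
unified = (
    pd.concat([batch, events], ignore_index=True)
    .sort_values("as_of")
    .drop_duplicates("account_id", keep="last")
    .reset_index(drop=True)
)
print(unified)
```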

Intelligent data fabrics give businesses the opportunity to complement their current technology with new innovations, enabling them to continue extracting value from their existing data architecture and investments without having to rip and replace old systems. Many businesses operate in highly siloed, distributed environments with numerous legacy applications and data stores, so this need is coupled with a requirement for technology that can interface with existing infrastructure and aggregate, integrate, transform, and normalise data on demand. With data lakes proving, in effect, to be just another silo, a new approach is needed for businesses to get the most out of the data at their disposal. This is where intelligent data fabrics can provide a stepping stone to the next generation of data architecture.

Why intelligent data fabrics are different

A key differentiator of intelligent data fabrics is their ability to transform and harmonise data so that it is actionable. They can incorporate a wide range of analytics capabilities, from analytic SQL to machine learning, to support the needs of the business. By allowing existing applications and data to remain in place, intelligent data fabrics enable organisations to get the most from previous investments, while helping them gain business value from the data stored in lakes quickly and flexibly across a variety of business initiatives, from scenario planning and risk modelling to running wealth management simulations to identify new sources of alpha.
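To make the "analytic SQL to machine learning" range concrete, the sketch below runs an aggregate SQL query and a toy model fit over the same harmonised data. It uses Python's built-in sqlite3 module and NumPy purely for illustration; the table, columns, and values are assumptions, not any vendor's API.

```python
# Illustrative only: once data is harmonised in one place, the same store can
# serve both analytic SQL and machine learning. Names and values are hypothetical.
import sqlite3
import numpy as np

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("ABC", 100, 10.0), ("ABC", 150, 10.4), ("XYZ", 80, 55.2), ("XYZ", 120, 54.9)],
)

# Analytic SQL: aggregate notional exposure per symbol for scenario planning.
for symbol, notional in conn.execute(
    "SELECT symbol, SUM(qty * price) FROM trades GROUP BY symbol"
):
    print(symbol, round(notional, 2))

# Machine learning on the same data: a toy linear fit of price against quantity.
qty, price = zip(*conn.execute("SELECT qty, price FROM trades"))
slope, intercept = np.polyfit(qty, price, 1)
print("predicted price at qty=110:", round(slope * 110 + intercept, 2))
```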

Achieving these capabilities with traditional technologies and data lakes is challenging. Via traditional means, multiple architectural layers would be needed: scalable data stores; an integration layer; transformation, normalisation, and harmonisation capabilities; a metadata layer; and a real-time, distributed caching layer. An intelligence layer is also required, with application logic and analytics capabilities, alongside a real-time layer. Building such an architecture traditionally required a wide range of products, along with their integration and maintenance, making it extremely complex and costly to build and operate.

New advances in technology simplify and streamline this stack from implementation, maintenance, and application development standpoints. There is no longer a need for different development paradigms to manage the various application layers. Performance is also higher: latency is reduced because the interfaces connecting the different layers of the architecture are removed, allowing organisations to incorporate transaction and event data into analyses and processes in near real time.

The role of data fabrics in digital transformation strategies

Crucially, the scalability of modern intelligent data fabrics accommodates increases in data volumes and workloads. This is particularly critical in the finance sector, for example, where market activity and volatility have spiked during the COVID-19 pandemic. It also supports a business’s long-term digital transformation goals by breaking down data silos, helping to remove operational inefficiencies and streamline processes, which are central aims of any digital transformation strategy. Once silos have been broken down, organisations gain an overarching view of enterprise data from internal and external sources, and with that comes the ability to use that data for a wider range of purposes.

As well as making information from all corners of the organisation accessible, alongside the all-important metadata, an intelligent data fabric enables data provenance and lineage. This is critical for businesses to understand where data came from and what actions have been applied to it, so that they can validate and trust the data being used to make significant business decisions.
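As a minimal sketch of what record-level provenance and lineage can look like, the example below attaches a source label and an ordered record of transformations to each value. The structure and field names are hypothetical, not any specific fabric’s data model.

```python
# Minimal sketch of record-level provenance: each value carries its source and
# the ordered transformations applied to it. Field names are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class TrackedValue:
    value: float
    source: str          # originating system
    lineage: tuple = ()  # ordered record of transformations applied

    def apply(self, step_name, fn):
        """Apply a transformation and append it to the lineage trail."""
        return TrackedValue(fn(self.value), self.source, self.lineage + (step_name,))

raw = TrackedValue(1032.5, source="crm_export_2023_01")
converted = raw.apply("convert_usd_to_gbp", lambda v: v * 0.82)
print(converted.value, converted.source, converted.lineage)
```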

Real-time insights from the cockpit perspective

Incorporating an intelligent data fabric can give the business a comprehensive, real-time operational “cockpit”. To see the value of this in practice, consider flying a plane, a scenario in which pilots need to synthesise a variety of data to fly safely. Thanks to advances in technology, the signals pilots need are now combined, analysed in real time, and presented in a display with alerts that can predict the risk of incidents and suggest corrective actions, without requiring the pilot to manually interpret signals from the various parts of the plane. In times of crisis, such as an imminent stall, these capabilities become critically important. Similarly, businesses today want the same capability: to filter out the data that isn’t important and bring to the surface the information that is. These capabilities steer the business in normal times and become critically important in times of crisis, as we are seeing now.

Transparency is key for future architectures

With external crises affecting organisations’ decision making, businesses are looking for transparency and insights in their data to help devise better strategies now and for the future. Whether the data is batch or real-time, intelligent data fabrics allow businesses to deliver value back to their customers and operate more efficiently. Through these techniques, data transparency can be achieved without businesses having to overhaul their technology infrastructure, helping to save on costs and ultimately reduce risk moving forward.

 
