Big data good - but fast data better

For Tibco, the integration of data generation, data analysis, business intelligence and event management is what makes data both fast and far more useful in a world working in real time


It is not an original thought to say that any vendor talking about ‘big data’ is missing the real point. Nor is it an original observation that many vendors are still more than happy to talk in just such terms. In practice, however, getting value out of big data is not about the ability to handle large data volumes; it is about making real sense of the stories the data can tell, and selecting the actions that then need to be taken to best exploit those stories.

In practice, this means any business needs to have to hand the tools to integrate multiple data sources, analyse the data, and then act upon it. These, according to Matt Quinn, CTO of Tibco, are the three planks of the company’s product suite.

“Big Data is now really about fast data,” he said during his keynote presentation in Paris to the first of the Tibco Transform series of one-day conferences in Europe. “Just about everything these days is both creating and consuming vast amounts of data, and the need now is to capture and exploit that data ‘in the moment’.”

This is a reference to Tibco’s core mantra – ‘the two-second advantage’. In practical terms this highlights one of the main differences between Tibco and some other vendors offering tools and services in the big data arena. It comes down to a simple question: is it better to extract enough information from the data while an event is actually happening, so that proactive actions can be taken to influence that event in a way that adds value, or to fully analyse every aspect of that event, and all its ramifications, sometime afterwards?

As Quinn pointed out, both options are important, but the latter can only affect future decisions, and by then, of course, it may all be too late. Fast data, rather than big data, is what is needed to have any chance of affecting the here and now. And this will become ever more important as the Internet of Things becomes an integral part of managing every business.

The key to creating this real time, proactive environment is Tibco’s traditional stamping ground of integration. Quinn suggests that we are now seeing the emergence of Integration 3.0.

“Integration 1.0 was hardwired and hand coded,” he said, “while Integration 2.0 brought us things like Enterprise Service Buses and the integration of big corporate services like ERP and CRM. Integration 3.0 is being driven by IT ‘control’ moving away from the CIO and now resting with the CMO. It is now being driven by the need for action to be taken in the moment that data becomes available.

“This also means there is a data deluge, and much of the data now has a shelf-life. Data generated two years ago may no longer be relevant, no matter how much you analyse it. Such data, for example, rarely has any bearing on a retail business making real time special offers to individual customers – at the checkout – depending on their recent purchasing patterns. This means doing things in real time, working with real time data.”
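As a rough illustration of the kind of ‘in the moment’ logic Quinn describes, the sketch below shows a checkout event handler that bases an offer only on a customer’s purchases that are still within their shelf-life. The names, thresholds and event shape are hypothetical assumptions for illustration, not Tibco code.

```python
# A minimal sketch (not Tibco code) of an 'in the moment' checkout decision:
# when a customer reaches the till, look only at their recent purchases - the
# data still within its shelf-life - and decide there and then whether to make
# an offer. Names, thresholds and the event shape are illustrative assumptions.

from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional


@dataclass
class Purchase:
    customer_id: str
    category: str
    amount: float
    timestamp: datetime


class RealTimeOfferEngine:
    """Keeps a short rolling history per customer and reacts at checkout time."""

    def __init__(self, shelf_life: timedelta = timedelta(days=30)):
        self.shelf_life = shelf_life
        self.history = defaultdict(list)  # customer_id -> recent purchases

    def record(self, purchase: Purchase) -> None:
        """Ingest a purchase event, discarding anything past its shelf-life."""
        cutoff = purchase.timestamp - self.shelf_life
        recent = [p for p in self.history[purchase.customer_id] if p.timestamp >= cutoff]
        recent.append(purchase)
        self.history[purchase.customer_id] = recent

    def offer_at_checkout(self, customer_id: str, now: datetime) -> Optional[str]:
        """Decide, while the customer is still at the till, whether to make an offer."""
        cutoff = now - self.shelf_life
        spend_by_category = defaultdict(float)
        for p in self.history.get(customer_id, []):
            if p.timestamp >= cutoff:
                spend_by_category[p.category] += p.amount
        if not spend_by_category:
            return None
        top_category, spend = max(spend_by_category.items(), key=lambda kv: kv[1])
        # Hypothetical business rule: reward a concentration of recent spend.
        return f"10% off {top_category} today" if spend >= 100.0 else None
```

In a full event-processing platform the same rule would typically run inside a streaming or complex event processing engine rather than in application code, but the shape of the decision is the same: recent events in, immediate action out.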

In his view, what businesses now need is a spectrum of information, from real time analytics, through to post hoc analysis of the whole picture of an event.

Fast data, to Quinn, means being able to understand the past in order to predict the future based on what is happening in the now. It means having the ability to process big data in real time so that appropriate actions can be taken. In effect, this is about finding needles in current haystacks time and again, rather than finding one needle, once, in a very large haystack. The requirement in real life is to be able to influence what is just about to happen, in the context of why it is about to happen.
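The contrast Quinn draws between one-off batch analysis and continuous detection can be pictured as a query that is re-evaluated over a sliding window of recent events as each new event arrives. The sketch below uses an assumed event shape and a hypothetical detection rule to show the pattern; it is not Tibco’s engine.

```python
# A hedged sketch of 'finding needles in current haystacks time and again': a
# continuous check over a sliding window of recent events, re-evaluated as each
# new event arrives, rather than a one-off batch analysis after the fact. The
# event shape and the detection rule are illustrative assumptions.

from collections import deque
from datetime import datetime, timedelta
from typing import Callable, Deque, List, Tuple


class SlidingWindowDetector:
    """Re-runs a 'needle' test over only the events still inside the time window."""

    def __init__(self, window: timedelta, is_needle: Callable[[List[float]], bool]):
        self.window = window
        self.is_needle = is_needle
        self.events: Deque[Tuple[datetime, float]] = deque()

    def on_event(self, ts: datetime, value: float) -> bool:
        """Add an event, expire stale ones, and immediately re-check the rule."""
        self.events.append((ts, value))
        while self.events and self.events[0][0] < ts - self.window:
            self.events.popleft()  # data outside the window has passed its shelf-life
        return self.is_needle([v for _, v in self.events])


# Hypothetical rule: flag when the last five minutes of readings average above 0.9.
detector = SlidingWindowDetector(
    window=timedelta(minutes=5),
    is_needle=lambda values: bool(values) and sum(values) / len(values) > 0.9,
)
```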

The need, therefore, is for the ability to integrate the data generated by the business infrastructure, its service providers, its customers and its employees. From the Tibco perspective, that means the use of its core integration tools, its Spotfire data analysis tools, its Tibbr social media data generation and collation tools, and its Event Management suite, where the resulting interventions and actions can be initiated, managed and indeed automated.

Tibco is expanding its capabilities in all these areas, both through its own development and through acquisitions. For example, Quinn pointed out that just three weeks ago the company acquired Jaspersoft, with the aim of building a comprehensive Business Intelligence platform so that it can not only collect and analyse data but fully interpret the results.
