The Internet of Things – the straw that will break legacy storage’s back

By Tarkan Maner, chairman and CEO, Nexenta  


From entry-level document processing to automated production lines, technology is empowering businesses to lower costs and drive profitability. As such, any new innovation is quickly hailed as the future, with experts rushing to identify how businesses can use it to drive better performance. In recent years we have seen the rise of tablets and smartphones, cloud computing and big data, all of which have changed how business is conducted. But the wheel of innovation keeps turning, and now we have the Internet of Things (IoT).

The potential for IoT deployments to improve operational efficiency in the enterprise is well documented: devices transmit data in real time from anywhere within a company network, giving owners the power to monitor even the slightest fluctuation. That data can then be analysed for deeper insight, or devices can be programmed to act on it automatically – from a smart fridge telling you when you’re out of eggs to farmers attaching trackers to their cows so they know the best time to milk them.
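As a rough illustration of that monitor-analyse-act loop, the sketch below simulates a hypothetical sensor feed and applies a simple threshold rule. The device name, readings and threshold are invented for the example rather than taken from any particular IoT platform.

```python
import random
import statistics
import time

# Hypothetical sensor feed: in a real deployment the readings would arrive
# over a network protocol rather than being generated locally.
def read_temperature(sensor_id: str) -> float:
    return round(random.uniform(2.0, 9.0), 1)  # degrees C, invented range

ALERT_THRESHOLD_C = 7.0  # assumed limit for a refrigerated unit
history = []

for _ in range(10):
    reading = read_temperature("fridge-01")
    history.append(reading)

    # "Monitor": keep a rolling view of the incoming data stream.
    rolling_avg = statistics.mean(history[-5:])

    # "Act": trigger a simple rule when a fluctuation crosses the threshold.
    if reading > ALERT_THRESHOLD_C:
        print(f"ALERT fridge-01: {reading} C exceeds {ALERT_THRESHOLD_C} C")
    else:
        print(f"fridge-01: {reading} C (rolling avg {rolling_avg:.1f} C)")

    time.sleep(0.1)  # stand-in for the device's real sampling interval
```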

Many of today’s firms are already reaping the benefits, and the IoT market shows no sign of slowing – indeed, Gartner predicts that by 2020 there will be 25 billion connected devices globally. So, what’s the problem? From a business perspective, we already have too much data to deal with, and the IoT brings with it a twofold challenge: how to manage the surge of information coming from devices dotted around the company infrastructure and, secondly, how to extract critical, actionable business intelligence from it.

Unfortunately, for many legacy data centre setups the pressure of big data is already too great: they are creaking under the surge of information the enterprise now generates simply as par for the course. At the same time, competition has grown, while margins remain razor-thin.

Storage has rightly been identified as the clot in the system. Unlike its data centre counterparts, such as servers and network infrastructure, investment in storage still lags and remains bound up with established vendors. Unsurprisingly, often for lack of better information, many firms continue to expect their legacy storage solutions to cope with the avalanche of data they are generating and consuming – a naïve approach. Such solutions, based on proprietary hardware, were never designed to scale with changing demands; in fact, they were developed to be deployed from a single location and manage a consistent flow of basic data. Without a storage solution that can adapt to accommodate bursts in demand for capacity, businesses will soon find themselves missing out on the many opportunities an IoT-supported enterprise can create, and falling behind their competitors.

So, what’s the answer? Just as the other parts of the data centre have evolved to meet modern requirements, a new technology, Software Defined Storage (SDS), has been developed to bring storage the same operational efficiency. It provides businesses with an agile infrastructure, one that is flexible and scalable, ensuring it can adapt to the fluctuating capacity demands of IoT without choking. Furthermore, SDS provides consistently high performance and a streamlined data flow – achieved through automatic deduplication – and can be deployed on demand. All of these capabilities empower businesses to turn the mass of data produced by their connected devices into valuable, actionable business intelligence, which, in turn, can be used to drive better performance.
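To show what deduplication means in practice, the sketch below stores only one copy of each unique data block by keying blocks on a content hash. It is a minimal, generic illustration of the idea, not Nexenta’s implementation; the block size and sample data are assumptions made for the example.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size for the example

class DedupStore:
    """Toy block store that keeps a single copy of each unique block."""

    def __init__(self):
        self.blocks = {}   # content hash -> block bytes
        self.files = {}    # file name -> list of block hashes

    def write(self, name: str, data: bytes) -> None:
        hashes = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            # Store the block only if this content has not been seen before.
            self.blocks.setdefault(digest, block)
            hashes.append(digest)
        self.files[name] = hashes

    def read(self, name: str) -> bytes:
        return b"".join(self.blocks[h] for h in self.files[name])

store = DedupStore()
payload = b"sensor-batch" * 10_000      # highly repetitive sample data
store.write("monday.log", payload)
store.write("tuesday.log", payload)      # duplicate content, no extra blocks
assert store.read("tuesday.log") == payload

logical = 2 * len(payload)
physical = sum(len(b) for b in store.blocks.values())
print(f"logical {logical} bytes, physical {physical} bytes")
```

Because both files share identical content, the second write adds references rather than data, which is the streamlining effect the paragraph above describes.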

Also, thanks to its open-source design, SDS represents a sustainable and affordable IT investment. For instance, when it comes to expansion, businesses that can see they are nearing their SAN capacity can simply purchase more and cluster it – there is no need to buy an entirely new solution. This means SDS can be utilised by companies of any size, with only the number of clustered SANs changing to meet increasing demand. Moreover, being hardware-agnostic, SDS solutions can fully integrate with existing data centre hardware, enabling firms to build bespoke setups which perfectly match their needs.
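The scale-out economics described above can be made concrete with some back-of-the-envelope arithmetic. The figures below (per-node usable capacity, growth rate, utilisation ceiling) are purely hypothetical assumptions chosen to illustrate the “buy more and cluster” approach.

```python
import math

# All figures are hypothetical, chosen only to illustrate scale-out planning.
current_usable_tb = 200        # capacity of the existing cluster
used_tb = 150                  # data currently stored
monthly_growth_tb = 20         # expected IoT-driven growth
node_usable_tb = 50            # usable capacity each additional node adds
utilisation_ceiling = 0.80     # keep the cluster below 80% full

def nodes_needed(months_ahead: int) -> int:
    """How many extra nodes keep the cluster under the utilisation ceiling."""
    projected = used_tb + monthly_growth_tb * months_ahead
    required_raw = projected / utilisation_ceiling
    shortfall = required_raw - current_usable_tb
    return max(0, math.ceil(shortfall / node_usable_tb))

for months in (3, 6, 12):
    print(f"{months:>2} months out: add {nodes_needed(months)} node(s)")
```

The point is that growth is absorbed incrementally, node by node, rather than through a forklift replacement of the whole array.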

Ultimately, as more companies turn to the power of IoT to drive performance and improve efficiencies, the pressure it will exert on legacy storage solutions cannot simply be ignored. Developed for a time without fluctuating avalanches of complex data, their rigidity and high expansion costs mean they are now obsolete in this fast-paced, internet-connected world. To date, news headlines have been awash with stories about technologies that can efficiently crunch and analyse large quantities of data within tolerable elapsed times. Now the time has come to see how a software-defined approach can enable businesses not only to achieve greater flexibility, scalability and cost-effectiveness, but also to future-proof their operations for the brave new data-everywhere world ahead. Failing to do so will leave many businesses on the back foot, missing out on the many opportunities an IoT-supported enterprise can bring.

 
