Software-defined Storage will change the Data Storage Industry in 2015

By George Teixeira, DataCore Software.


Our industry is changing. Major data storage vendors have seen their business erode as new trends emerge. Software-defined everything, and software-defined storage in particular, will continue to evolve in 2015, driven by the productivity benefits it delivers to customers. The disruption will hit traditional storage hardware suppliers hardest, since software-defined storage promises to commoditize the underlying storage devices and raise the scope and span of storage features and services to a higher, more productive level instead of locking them down to specific devices. True software-defined storage platforms allow these devices to ‘do more with less’ by increasing utilization and working cross-platform and infrastructure-wide. Bottom line, the compelling economic benefits, better productivity and the need for greater agility to meet future requirements will drive software-defined storage to become a mainstream solution in 2015. Therefore, prediction #1: software-defined storage will finally go mainstream in 2015.

Other storage-related predictions include:

#2: Servers in 2015 will become a more significant factor in displacing traditional storage arrays. Servers will combine with software-defined storage to build momentum for a new class of ‘storage servers’ and hyper-converged virtual SANs. The latest generation of servers is powerful and can host ever-larger amounts of storage. Software-defined storage solutions such as DataCore’s Virtual SAN software are designed to eliminate the hassles and complexity of traditional storage networks while still providing a growth path. Virtual SAN software is maturing rapidly, and it will further drive the transformation of servers into powerful storage systems that grow into full-blown enterprise-class virtual SANs. (See the Dell 4Enterprise blog, “Dell PowerEdge Servers Make Great Software-Defined Storage Solutions.”)

#3: Disk and flash must play nicely together in 2015; software stacks must span both worlds. This past year saw the continuation of the “flash everywhere” trend, with flash devices rapidly moving along the growth path from being utilized in servers to being used across the board. This initially brought a lot of new storage companies into the market, but the market has since seen a high degree of consolidation, as evidenced by SanDisk’s acquisition of Fusion-io and the number of start-ups that have disappeared. As flash finds its place as a key technology and is further commoditized, the market cannot sustain the number of companies that entered in its early stages, so further consolidation will happen.

The coming year will show us that flash can be used as a more practical technology, and that it must be reconciled with existing disk technologies. Flash is excellent for specialized workloads that require high-speed reads, such as databases, but it is not a cost-effective solution for all workloads and still makes up a very small fraction of the installed storage base overall. On the other side of the spectrum are low-cost SATA disk drives that continue to advance and use new technologies like helium to support huge capacities, up to 10 TB per drive, but they are slow compared to flash. Write-heavy transaction workloads also need to be addressed differently. (See “New Breakthrough Random Write Acceleration and Impact on Disk Drives and Flash Technologies.”) The industry would have us believe that customers will shift 100% to all-flash, but that is impractical given the costs involved and the large installed base of disk storage that must still be served.

In 2015, we will need smart software with a feature stack that can optimize the cost and performance trade-offs and migrate workloads to the right resource, whether flash or disk. Software-defined storage done right can help unify the new world of flash with the existing and still-evolving world of disks. Both have a future.
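As a rough illustration of the kind of tiering decision such software makes, the sketch below promotes frequently accessed blocks to flash and leaves cold blocks on low-cost disk. The class and threshold names are hypothetical, chosen for this example; this is not DataCore’s implementation.

```python
# Minimal auto-tiering sketch (illustrative only; names are hypothetical).
# Blocks whose recent access count exceeds a threshold are placed on flash;
# everything else stays on lower-cost, high-capacity disk.

class TieringEngine:
    def __init__(self, hot_threshold=100):
        self.hot_threshold = hot_threshold
        self.access_counts = {}  # block_id -> recent read/write count

    def record_access(self, block_id, count=1):
        """Track how often a block is touched during the sampling window."""
        self.access_counts[block_id] = self.access_counts.get(block_id, 0) + count

    def tier_for(self, block_id):
        """Return the tier a block should live on, based on recent activity."""
        if self.access_counts.get(block_id, 0) >= self.hot_threshold:
            return "flash"
        return "disk"

engine = TieringEngine(hot_threshold=100)
engine.record_access("db-index-block", count=500)  # hot database block
engine.record_access("archive-block", count=3)     # cold archival block
print(engine.tier_for("db-index-block"))  # flash
print(engine.tier_for("archive-block"))   # disk
```

Real tiering stacks add aging (so yesterday’s hot blocks cool down) and migrate data asynchronously, but the cost/performance trade-off reduces to a placement decision like this one.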

#4: Hybrid clouds and cloud-based disaster recovery solutions will become increasingly practical to implement in 2015. Businesses are continuing to struggle to figure out how best to use the cloud. Increasingly, enterprises are managing both on-premise storage and off-site cloud storage (hybrid cloud). This will become a much bigger issue in 2015 as customers become smarter about which workloads are practical to place in the cloud and which are not. On-premise storage is usually allocated to active data such as databases and transaction-oriented business applications. The cloud continues to be used mainly for backup, archive and disaster recovery rather than production workloads, because internet bandwidth and latency limit what is practical. New solutions are emerging, such as DataCore combined with Microsoft StorSimple, which allow data (from any storage) to be seamlessly migrated from on-premise to a cloud such as Microsoft Azure. This will fuel the larger trend of enterprises running a mix of on-premise and cloud. In addition, while doing disaster recovery from the cloud remains complex, new integration tools and more automated processes are on the way to make it a more practical solution.


#5: Managing internal investments with a cloud model will become a bigger trend in 2015. Enterprises want to emulate the productivity that cloud providers achieve. However, to do so, they need to move to a Quality of Service (QoS) model providing comprehensive virtual data services. For example, as storage needs continue to grow, enterprises must be able to manage, regulate and segregate storage resources to match the utilization patterns of different departments. A case in point might be finance, which may need a higher level of performance than a department simply doing word processing. Quality-of-Service settings are needed to ensure that high-priority workloads competing for access to storage can meet their service level agreements (SLAs) with predictable I/O performance. (See DataCore Quality of Service Capabilities.) These QoS controls enable IT organizations to efficiently manage their shared storage infrastructure. Storage resources can be logically segregated, monitored and regulated on a departmental basis.
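One common way to enforce per-department I/O limits of this kind is a token bucket that caps IOPS. The sketch below is a generic illustration of that technique under assumed limits, not DataCore’s actual QoS mechanism or API.

```python
# Generic token-bucket IOPS limiter (illustrative only; not a vendor API).
# Each department gets a per-second I/O budget; requests beyond the
# budget are throttled until the next refill.

class IopsLimiter:
    def __init__(self, max_iops):
        self.max_iops = max_iops  # tokens restored each second
        self.tokens = max_iops

    def refill(self):
        """Called once per second to restore the department's I/O budget."""
        self.tokens = self.max_iops

    def try_io(self):
        """Admit one I/O if budget remains; otherwise throttle it."""
        if self.tokens > 0:
            self.tokens -= 1
            return True
        return False

# Finance gets a larger I/O budget than a general office-productivity share
# (the department names and limits here are hypothetical examples).
limits = {"finance": IopsLimiter(10000), "office": IopsLimiter(500)}
admitted = sum(1 for _ in range(600) if limits["office"].try_io())
print(admitted)  # 500: the office share is capped at its 500-IOPS budget
```

Because each department draws from its own bucket, a burst of low-priority office I/O cannot starve the finance workload, which is exactly the SLA-protection behavior described above.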


Software-defined storage must continue to evolve to address these needs. Virtual data services must be created that deliver storage services independent of the underlying devices, regulate how resources are utilized across the infrastructure, and make the whole process more automated.
 
