Analysts have queued up to champion the software-defined data centre (SDDC) as the next big thing, while vendors have raced to position themselves as leaders in this growing field.
And who can blame them? The origins of the SDDC vision can essentially be traced back to VMware’s virtualisation of the x86 architecture, an advance that has seen compute virtualisation become near-ubiquitous throughout the enterprise world.
The SDDC essentially extends the premise of virtualising servers, and wrapping them in highly automated software, across an organisation’s entire infrastructure, moving from the traditional model of managing infrastructure to one of automating it and delivering IT as a service.
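In practice, that shift means describing the desired state of a service in software and leaving a control plane to provision it. As a rough illustration only, the sketch below assumes a hypothetical SDDC controller with an apply() call; it implies no particular vendor’s API.

```python
# A minimal sketch of 'automate rather than manage': desired state is
# declared as data, and a (hypothetical) SDDC control plane reconciles
# compute, storage and network to match it. The controller object and
# its apply() method are illustrative assumptions, not a real product.
from dataclasses import dataclass

@dataclass
class ServiceSpec:
    name: str
    vcpus: int        # compute: virtual CPUs per instance
    memory_gb: int    # compute: RAM per instance
    storage_gb: int   # storage: virtual volume size
    network: str      # network: logical segment, decoupled from cabling
    replicas: int     # scale-out count, enforced by the control plane

def deploy(spec: ServiceSpec, controller) -> None:
    """Hand the desired state to the control plane; software, not a
    manual ticket queue, drives provisioning end to end."""
    controller.apply(spec)  # hypothetical call: reconcile actual vs desired

# Example: deploy(ServiceSpec('billing-api', 4, 16, 200, 'app-tier', 3), ctrl)
```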
‘We’ve found customers taking a real interest in the idea of the SDDC, its different components, and the real benefits to be reaped,’ says Bob McEwan, UK&I presales manager at HP. ‘We’re well on our way to a software-defined ecosystem, and now many vendors offer systems to support virtualisation across the data centre.’
Indeed, given the mass user uptake of compute virtualisation, it’s no wonder that the promises of SDDC have caused such a buzz. It’s that buzz, however, that appears to have stunted its growth.
Frankly, the vast vendor noise has left CIOs with conflicting stances on exactly what an SDDC is, what it can do for their business, and what it means for their IT setup.
‘At the moment we are in the early stages of adoption of the SDDC and, for many businesses, a true SDDC will be some distance off,’ says Steve Riley, technical director at Riverbed Technology.
‘Organisations are still trying to figure out which vendor’s “vision” they should follow or, assuming that SDDC is still in its testing phase for many, they might look at a variety of vendors they can use at the same time without being tied to one.’
Wishy washy
With servers and applications already virtualised by most organisations, it is the components of networking and storage that stand in the way of achieving an SDDC.
>See also: Dissecting the software-defined data centre
However, exactly what qualifies a network or storage array as ‘software-defined’ has caused dispute in the industry, with the term ‘software-defined washing’ even being coined to call out vendors deceitfully rebranding old products as such.
‘There’s a perception that programmable means software-defined, but this is not true,’ Riley says. ‘Just because a product has an API (application programming interface) does not mean it is software-defined.
‘If we insisted that software-defined be limited to the classical definition of something that used to be done in specialised hardware but is now done in software running on general purpose hardware, then it may be easier to identify products that are truly software-defined – as opposed to those just using the term as a marketing spin.’
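Networking offers the clearest example of that classical definition. With the open-source Ryu OpenFlow controller framework, forwarding decisions that once lived in switch silicon run as an ordinary Python process on a general-purpose server. The minimal sketch below follows Ryu’s standard table-miss pattern; the application name is illustrative.

```python
# Forwarding policy in software on general-purpose hardware: a minimal
# Ryu OpenFlow 1.3 app that installs a table-miss rule so unmatched
# packets are sent to the controller, where software decides their fate.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3

class SoftwareDefinedApp(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def on_switch_connect(self, ev):
        dp = ev.msg.datapath
        ofp, parser = dp.ofproto, dp.ofproto_parser
        # Priority-0 rule matching everything: forward to the controller.
        match = parser.OFPMatch()
        actions = [parser.OFPActionOutput(ofp.OFPP_CONTROLLER,
                                          ofp.OFPCML_NO_BUFFER)]
        inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS,
                                             actions)]
        dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=0,
                                      match=match, instructions=inst))
```

By contrast, a conventional switch exposing a configuration API would still make its forwarding decisions in fixed-function hardware, which is exactly the distinction Riley draws.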
For end-users, this is causing nothing but confusion as they try to comprehend how software-defined technologies will work for them, and how they will affect their business processes.
Despite this, the market is expected to grow as these new models of deployment and service delivery can offer significant benefits.
What businesses must remember, however, is that these benefits will be negated if they are not fully prepared, warns Tony Thompson, VP of marketing at Silver Peak.
‘It is therefore important that businesses are able to recognise how they can better transform their business and overcome the challenges this technology poses.’
Of the two components, storage appears to pose the greater challenge, not least because it is so heavily impacted by server virtualisation.
Network virtualisation has seen greater traction to date, albeit not quite to the extent that a so-called ‘true’ software-defined model promises.
‘For software-defined networks, we expect the use cases to improve considerably and we expect enterprise adoption to increase in the next three to five years,’ says Nishikant Nigam, VP and global head of infrastructure services.
ZK Research, meanwhile, predicts that software-defined will cut data-centre TCO from 40% with legacy environments to 20%. But with all IT delivered as a service, the software-defined model radically disrupts the make-up of the traditional IT team.
>See also: Storage ‘gravity’: the key to unlocking the software-defined data centre
The ability, or indeed willingness, of the CIO to take on this realignment is likely to be the key factor in when, or whether, this deployment takes off.
Furthermore, with the virtualisation of the data centre likely to transform the formation of the IT team, the CIO will also have to handle any employee resistance it may cause.
According to Doug Hazelman, VP of product strategy at Veeam, the employees should in fact be the ones embracing this change.
‘The more the data centre can operate autonomously, the more time the CIO and IT employees have to gain further efficiencies and deliver the always-on business that their employers are increasingly demanding,’ he says.
‘Every new, disruptive technology will result in a shake-up of the job market. However, these are not lost jobs but rather shifting skillsets. If IT employees can follow these shifts then they can move along with the industry, rather than competing for a shrinking number of positions in a limited market.’
Indeed, a software-defined model will not succeed if the IT team still clings to a hardware-defined mindset.
In the world of technology, change is inevitable, and whether it’s embraced or not, it will likely happen nonetheless.
And for seasoned CIOs, this is not the first time they’ve navigated such technological, and consequently cultural, change in IT.
‘There was resistance to change from the mainframe team when open systems took off, there was no such thing as a virtualisation team when VMware was first introduced, and only now are we seeing converged infrastructure teams being established despite the market existing for over three years,’ says Archie Hendryx, principal vArchitect EMEA at VCE.
‘For the traditional IT teams to accept this change, they need to recognise how it will inevitably benefit them.’
The role of today’s CIO is to make IT a strategic business enabler, rather than a cost centre, and software-defined can help in this strategic imperative.
IT must dramatically change how it delivers services to the business, embracing agility, speed and coordination, so that it can immediately respond to business requirements.
To do that, it must address the deeply entrenched organisational silos within IT, where typical infrastructure deployments can take months. The SDDC is positioned as the answer.
‘Software-defined will deliver faster time-to-service, optimise infrastructure utilisation and move provisioning time from months to hours or even minutes,’ says Randall Cross, director of fabric and infrastructure at Avaya.
Hendryx adds: ‘The software-defined model provides speed and agility to the extent that organisations can encapsulate their business requirements into business delivery processes.’
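In practice, that speed comes from provisioning being a function call rather than a ticket queue. The sketch below is purely illustrative and talks to a hypothetical REST endpoint; the URL, payload and status fields are assumptions, not any vendor’s actual interface.

```python
# A hedged sketch of 'months to minutes': with infrastructure behind an
# API, a whole environment (VMs, volumes, logical network) is requested
# programmatically. Endpoint and fields below are hypothetical.
import time
import requests

SDDC_API = 'https://sddc.example.internal/api/v1'  # assumed endpoint

def provision_environment(name: str, template: str) -> str:
    """Ask the control plane for an environment and wait until ready."""
    resp = requests.post(f'{SDDC_API}/environments',
                         json={'name': name, 'template': template},
                         timeout=30)
    resp.raise_for_status()
    env_id = resp.json()['id']
    while True:  # poll until the control plane reports reconciliation
        status = requests.get(f'{SDDC_API}/environments/{env_id}',
                              timeout=30).json()['status']
        if status == 'ready':
            return env_id
        time.sleep(5)

# start = time.time()
# provision_environment('uat-billing', template='three-tier-web')
# print(f'Provisioned in {time.time() - start:.0f}s')  # minutes, not months
```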
Waving the flag
The next couple of years will be marked by experimentation and learning everywhere, from product developers to service providers to network operators.
Incumbent vendors will wave the software-defined flag over everything they produce, so end-users must examine claims very carefully before committing.
The adoption curve will be most limited by the lack of user awareness, so education and training will be essential. The UK’s first software-defined anything (SDx) event, SDx Symposium, on May 15, is a good start.
In the meantime, the next couple of years will see an explosion of orchestration and application software choices, from incumbent vendors, ISVs, system integrators, and a raft of startups focused on specific needs of enterprises.
‘As the ability to program a pool of networking, compute, and storage components to operate dynamically as a customisable portfolio of functions gets into the hands of network operators of every scale, there will be no turning back to the days when we could not program everything,’ says Dan Pitt, executive director at the Open Networking Foundation.
‘Indeed, software-defined will become the new norm for IT.’
The ideal strategy should offer a simple yet evolutionary path from existing legacy technologies to a software-defined model.
This will likely see popularity gather around solutions offering the flexibility to phase software-defined technologies into a hybrid environment while avoiding the expensive rip-and-replace upgrades and complex integration issues of traditional infrastructure.
For those organisations transitioning, it is also essential that the software-defined infrastructure can be activated one element at a time in a controlled manner to allow for monitoring.
‘The emphasis in the market for the next couple of years will be helping CIOs identify where to start and helping them through the process,’ says Andi Falkner, sales engineering director EMEA for networking at Dell.
‘As the technologies mature, adoption will likely start in smaller areas and, based on the benefits it offers, it will spread, offering new use cases for other CIOs to examine.’
What the experts say
‘The market will drive a degree of standardisation. Organisational bodies such as the IEEE will (if they haven’t already) form working parties to drive standards, but, ultimately, the market demands will force vendors to interoperate across the data centre environment. Over time, the ecosystem that grows around the market will deliver pseudo standards. The end customer benefits not only with choice, but also gains increasing control.’
– Kevin Linsell, head of service development, Adapt
‘The main value of software-defined networks is not necessarily to increase utilisation, but instead to simplify configuration, foster workload mobility, and make networks more application-aware. Storage is different again, and the appetite to layer on abstractions may not be as high. Many organisations are already suffering from multiple levels of thin provisioning, making storage complex to manage. Any move to more software-defined approaches will need to make this better, not worse, and will need to prove that it is worthy of hosting an organisation’s crown jewels.’
– Andrew Hillier, CTO and co-founder, CiRBA
‘SDDC is a pretty new thing at the moment but over the coming years it will transition into one of those functions that is just the norm. Technologies developed to support the model will evolve into fundamental parts of infrastructure systems. New languages will mature that are used as a common way of describing infrastructures within an SDDC. The model itself will mature to an extent that the products and services that support and provide it will be very mainstream and open to all organisations no matter their size and complexity.’
– Russel Ridgley, head of cloud services, Pulsant
‘There are still gaps in the software stack required for the SDDC to be fully effective. Many data centres are constrained by the realities of existing infrastructure and architectures, so full automation is currently an aspiration rather than a reality. A combination of tools and techniques is required across both the network and the data centre to achieve the kind of benefits being associated with SDDCs. Capacity planning and management of all asset classes is central to this, as it provides the best optimisation of assets and a clear path for future investment.’
– Tony Fallows, CEO, Aria Networks
‘At a business level, reducing the time needed to manage the network and deploy new resources or applications can have a major impact. If an employee does not have to manually provision the compute, storage and network resources needed to deliver an application, businesses are able to get new services up and running far more quickly. As well as greatly increasing an organisation’s agility, this also boosts competitive advantage by reducing the time it takes to get new offerings to market.’
– Nick Williams, senior product manager, EMEA data centre IP, Brocade