The Software-Defined Data Centre Explained: Part II of II

Steve Phillips, CIO, Avnet Inc.

In part one of this article, we examined how data centres have evolved, and why current solutions are leaving some businesses searching for what they believe is the next evolution in data centre architecture: the software-defined data centre (SDDC).

In this post we’ll examine how soon SDDCs may become a reality, what obstacles are holding them back, and identify a few of the vendors to watch as the model takes shape.


According to Gartner’s Hype Cycle for 2014, the SDDC—part of what they refer to as “software-defined anything,” or SDx—is still firmly in the Cycle’s first stage, where the promise of the technology has yet to be grounded in the context of real-world application.

That hasn’t stopped Gartner from calling software-defined anything “one of the top IT trends with the greatest possible impact on an enterprise’s infrastructure and operations,” according to Computer Weekly.


While the potential of the SDDC may be great, embracing it is more of a revolution than an evolution. Virtualization, by contrast, could be adopted by traditional data centres gradually, as time, budget and business need allowed, with virtualized racks sitting next to traditional hardware racks.

On the other hand, software-defined data centres require common APIs to operate: the hardware can be pooled and controlled by software or it can’t. As a result, companies with significant legacy infrastructures may find it difficult to adopt SDDC in their own environments.

One way for existing data centres to avoid the “all or nothing” approach of SDDC is by embracing what Gartner began referring to as “bimodal IT” in 2014. Bimodal IT identifies two types of IT needs:

  • Mode 1 is traditional IT, which places a premium on stability and efficiency for mission-critical infrastructure needs.
  • Mode 2 refers to a more agile environment focused on speed, scalability, time-to-market, close alignment with business needs, and rapid evolution.

A bimodal IT arrangement would allow large legacy IT operations to establish a separate SDDC-driven environment to meet business needs that call for fast, scalable and agile IT resources, while continuing to rely on traditional virtualized environments for applications and business needs that value uptime and consistency above all else.

Over time, more resources could be devoted to the new SDDC architecture as the needs of the business evolve, without requiring the entire data centre to convert to SDDC all at once.


Given how different software-defined data centre architectures are from traditional and virtualized environments, new and emerging vendors have a golden opportunity to gain a first-mover advantage over some of the entrenched data centre giants.

APIs: The critical components of the SDDC are the APIs that control the pooled resources. OpenStack’s APIs are the open-source front-runner at this point, while many vendors still rely on their own proprietary APIs to control their hardware.
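To make the “pooled and controlled by software” idea concrete, here is a toy sketch—invented classes, not a real OpenStack call—of what a common API buys you: every vendor’s hardware joins one pool the same way, and placement decisions belong to software rather than to the person requesting capacity. A real deployment would use OpenStack’s Compute and Block Storage APIs instead of these illustrative names.

```python
# Toy illustration of the SDDC "common API" idea: heterogeneous hardware
# registers into one pool and is provisioned through a single interface.
# Purely a sketch -- ResourcePool and its methods are invented for this post.

class ResourcePool:
    """Pools capacity from many vendors' hardware behind one API."""

    def __init__(self):
        self.nodes = []  # each node: {"vendor": ..., "free_gb": ...}

    def register(self, vendor, capacity_gb):
        # Any vendor's hardware joins the pool through the same call.
        self.nodes.append({"vendor": vendor, "free_gb": capacity_gb})

    def allocate(self, size_gb):
        # The caller never picks hardware; the software decides placement.
        for node in self.nodes:
            if node["free_gb"] >= size_gb:
                node["free_gb"] -= size_gb
                return {"vendor": node["vendor"], "size_gb": size_gb}
        raise RuntimeError("pool exhausted")

pool = ResourcePool()
pool.register("vendor-a", 100)
pool.register("vendor-b", 500)
volume = pool.allocate(200)   # lands on vendor-b, the only node with room
print(volume["vendor"])       # -> vendor-b
```

The point of the sketch is the inverse of proprietary APIs: with one interface, hardware from any vendor is interchangeable capacity, which is exactly what legacy environments with per-vendor tooling find hard to retrofit.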

Computing & Storage: Emerging players like Nimble Storage and Nutanix are at the forefront of the SDDC movement, but data centre incumbents like IBM, HP, Dell, NetApp, Cisco and EMC are right there with them.

Networking: While Cisco, Juniper and HP are certainly the focus of the software defined networking space, startups like Big Switch and Cumulus Networks are gaining significant market interest, funding and traction as the SDDC model gains momentum.

Converged Infrastructure: Two additional initiatives worth keeping an eye on are VCE with its Vblock solutions, and NetApp’s FlexPod integrated infrastructure solutions. These products are designed to meet the needs of both “clean sheet” and legacy IT environments interested in pursuing the bimodal IT approach.

So while the reality of the SDDC may be a few years away for many IT environments with considerable legacy investments, it’s certainly a new and compelling vision for the data centre.

More importantly, it appears to be the solution IT is looking for in the always-on, mission-critical, cloud-ready and data-rich environment we operate in today. Expect to hear more on this topic in future Behind the Firewall blog posts.


The importance of Big Data for the channel


Wayne Gratton, EMEA SolutionsPath Business Development Director

According to IBM, 90% of the world’s data has been produced in the last two years. Despite this, big data is not yet as important to the channel as it should be. Much like cloud computing, resellers are often confused about what’s hype and what’s reality, and are unsure how to build a big data strategy and capitalise on this market.

As for customers, there are pockets of understanding, but some remain blissfully unaware of big data’s potential. More education is needed on how big data can support business strategy, what an appropriate spend looks like, and how far it might affect future business decisions.


Overall, when big data is analysed for patterns, it can provide invaluable business insight and improve decision-making across entire organisations. The benefits vary by vertical sector. For example, retailers can optimise supply chains by analysing sales data, stocking up on the most popular items to remain profitable and keep customers happy. In healthcare, the recent outbreak of Legionnaires’ disease in Europe during the summer could not have been documented as thoroughly without such data, and that analysis will help combat and prevent future outbreaks.

Channel opportunity: big data versus business needs

Resellers need to look at the opportunities big data brings to their business, but to do this they need to be better educated about what it ultimately means for them. They should consider how they are dealing with big data in the first instance: for example, are they approaching it from a storage point of view, or do they deal in software analytics?

It’s true that big data means different things to different companies, and analysing a company’s needs is the first step in deciding how important this growth sector is. It’s about turning big data into an asset. A simple way to start is to use asset discovery tools to gather intelligence on a client’s storage resources and usage, helping to implement solutions with a lower cost of ownership and to gain more control over this rapid growth in data.
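At its very simplest, the discovery step above amounts to measuring capacity and utilisation across an estate. The sketch below uses only Python’s standard library to report usage for a set of mount points; it is a minimal stand-in for what commercial asset discovery tools do, and `storage_report` is a name invented here, not a real product API.

```python
import shutil

def storage_report(paths):
    """Report capacity and utilisation for each path -- a tiny stand-in
    for the kind of data a real asset discovery tool would gather."""
    report = {}
    for path in paths:
        usage = shutil.disk_usage(path)  # named tuple: total, used, free (bytes)
        report[path] = {
            "total_gb": round(usage.total / 1e9, 1),
            "used_pct": round(100 * usage.used / usage.total, 1),
        }
    return report

# Example: survey the root filesystem (extend the list to cover an estate)
print(storage_report(["/"]))
```

Figures like these, collected per client and tracked over time, are the raw material for the lower-cost-of-ownership conversation described above.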

An approach to get sales rolling

Defining your company’s and your customers’ position(s) in the big data ecosystem can be achieved by working with a partner that has global expertise yet local knowledge to help tap into market opportunities.

We believe the best way to go about this is to take a three-pronged approach, working with said partner to:

  1. Envision – look at what big data means to your company and your customer’s company. We think it’s important to understand this growth sector on a global and local level through in-depth market analysis
  2. Enable – use training schemes to gain the skills and local knowledge in this area to become ‘big data advisors’ to your customers
  3. Execute – implement what has been learnt to drive sales and scout out local market opportunities
