Software-Defined Anything (SD-X): How do partners and resellers adapt to the new sales game?

Marcus Adae, Vice President Strategic Suppliers, Avnet Technology Solutions EMEA

Marcus Adae, Vice President Core Suppliers & Technology Groups, Avnet Technology Solutions EMEA, discusses the key challenges around SD-X and what partners can do to overcome them

There is one question circling the industry at the moment and causing quite a stir: “what do we mean by ‘Software-Defined Anything’?” As it stands today, not all organisations within the channel ecosystem agree on a common definition of the term. To me, ‘Software-Defined Anything’ (SD-X) is an all-encompassing term covering a broad spectrum of technologies. And partners, resellers and distributors alike have slightly differing agendas in terms of where they position these technologies.

However, this means resellers are now faced with a totally new sales game. They’re tasked with finding the most appropriate solutions to take to market in order to start their transition into the brave new world of a ‘business-led consultancy approach to IT’. So, what are the key challenges, and how can partners and resellers overcome them?

Posted under Agile Development, IT Software

The Software-Defined Data Centre Explained: Part II of II

Steve Phillips, CIO, Avnet Inc.

In part one of this article, we examined how data centres have evolved, and why current solutions are leaving some businesses searching for what they believe is the next evolution in data centre architecture: the software defined data centre (SDDC).

In this post we’ll examine how soon SDDCs may become a reality, what obstacles are holding them back, and identify a few of the vendors to watch as the market develops.

HOW PROMISING AND HOW SOON?

According to Gartner’s Hype Cycle for 2014, the SDDC—part of what they refer to as “software-defined anything,” or SDA/SDX—is still firmly in the Cycle’s first stage, where the promise of technology has yet to be grounded in the context of real-world application.

That hasn’t stopped Gartner from calling software-defined anything “one of the top IT trends with the greatest possible impact on an enterprise’s infrastructure and operations,” according to Computer Weekly.

EARLY OBSTACLES TO ADOPTION

While the potential for SDDC may be great, embracing it is more of a revolution than an evolution. The migration to a virtualized environment could be embraced by traditional data centres as time, budget and business need allowed, with virtualized racks next to traditional hardware racks.

On the other hand, software-defined data centres require common APIs to operate: the hardware can be pooled and controlled by software or it can’t. As a result, companies with significant legacy infrastructures may find it difficult to adopt SDDC in their own environments.

One way for existing data centres to avoid the “all or nothing” approach of SDDC is by embracing what Gartner began referring to as “bimodal IT” in 2014. Bimodal IT identifies two types of IT needs:

  • Type 1 is traditional IT, which places a premium on stability and efficiency for mission-critical infrastructure needs.
  • Type 2 refers to a more agile environment focused on speed, scalability, time-to-market, close alignment with business needs, and rapid evolution.

A bimodal IT arrangement would allow large legacy IT operations to establish a separate SDDC-driven environment to meet business needs that call for fast, scalable and agile IT resources, while continuing to rely on traditional virtualized environments for applications and business needs that value uptime and consistency above all else.

Over time, more resources could be devoted to the new SDDC architecture as the needs of the business evolve, without requiring the entire data centre to convert to SDDC all at once.

WHAT VENDORS ARE LEADING THE SDDC CHARGE?

Given how different software-defined data centre architectures are from traditional and virtualized environments, it’s a golden opportunity for new and emerging vendors to gain a first-mover advantage on some of the entrenched data centre giants.

APIs: The critical components of an SDDC are the APIs that control the pooled resources. OpenStack’s APIs are the open source market leader at this point, although many vendors still rely on their own proprietary APIs to control their hardware.

Computing & Storage: Emerging players like Nimble Storage and Nutanix are at the forefront of the SDDC movement, but data centre incumbents like IBM, HP, Dell, NetApp, Cisco and EMC are right there with them.

Networking: While Cisco, Juniper and HP are certainly the focus of the software defined networking space, startups like Big Switch and Cumulus Networks are gaining significant market interest, funding and traction as the SDDC model gains momentum.

Converged Infrastructure: Two additional initiatives worth keeping an eye on are VCE and their Vblock solutions, as well as NetApp’s FlexPod integrated infrastructure solutions. These products are designed to meet the needs of both “clean sheet” and legacy IT environments interested in pursuing the bimodal IT approach.

So while the reality of the SDDC may be a few years away for many IT environments with considerable legacy investments, it’s certainly a new and compelling vision for the data centre.

More importantly, it appears to be the solution IT is looking for in the always-on, mission-critical, cloud-ready and data-rich environment we operate in today. Expect to hear more on this topic in future Behind the Firewall blog posts.

Posted under Storage, Virtualisation

The Software-Defined Data Centre Explained: Part I of II

Steve Phillips, CIO, Avnet Inc.

The traditional data centre is being challenged harder than ever to keep up with the pace of change in business and technology.

Three recent megatrends—the growth of data and data analytics, the rise of cloud computing and the increasing criticality of technology to the operations of many businesses—have shown that legacy infrastructures are too inflexible, too inefficient and too expensive for a growing number of businesses.

In this first of two posts on this topic, we’ll briefly recap the evolution of the data centre to this point, examining where and why current architectures are falling short of many businesses’ needs.

THE TRADITIONAL HARDWARE-CENTRIC MODEL

From the beginning, data centres have been built around a hardware-centric model of expansion. Years ago, if your company wanted to launch a new application, the IT team would purchase and provision dedicated servers to handle the computing duties, pairing them with a dedicated storage unit to manage the application database and backup needs.

In order to ensure that the platform could handle surges in demand, IT would provision enough computing power and storage to meet the ‘high water mark’ of each application’s forecast demand. As a result, many servers and storage units spent most of their time running at a fraction of capacity, as little as 8% according to a 2008 McKinsey study.
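To put the utilisation problem in concrete terms, here is a minimal sketch of how provisioning to the forecast peak drives down average utilisation. The demand profile and headroom figure are hypothetical, chosen purely for illustration:

```python
# Hypothetical example: provisioning to the forecast 'high water mark'
# leaves hardware idle most of the time.

def average_utilisation(hourly_demand, provisioned_capacity):
    """Average demand expressed as a fraction of purchased capacity."""
    return sum(hourly_demand) / (len(hourly_demand) * provisioned_capacity)

# Illustrative 24-hour demand profile (arbitrary units): one short
# daily surge, low load the rest of the day.
demand = [5] * 20 + [60, 80, 60, 5]

# IT provisions for the peak hour, plus 25% headroom for safety.
capacity = max(demand) * 1.25  # 100 units

print(f"Provisioned capacity: {capacity:.0f} units")
print(f"Average utilisation: {average_utilisation(demand, capacity):.0%}")
```

Even with a modest surge, the hardware sized for that surge sits mostly idle, which is exactly the single-digit utilisation the McKinsey figure describes.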

To make matters worse, these server and storage pairs used a dedicated high-capacity network backbone that kept each platform in its own distinct silo. As a result, data centres were overbuilt by design, overly expensive, and slow to adapt to the changing needs of the business.

THE VIRTUALIZED DATA CENTRE

The adoption of server and storage virtualization technologies into the enterprise about 10 years ago addressed some of these issues by allowing one server or storage array to do the job of many.

Provisioning a portion of a server or storage unit for a new application was faster and less expensive than buying new hardware, and it went some way toward reducing the issue of underutilisation…but not far enough. According to a Gartner report, utilisation rates had only grown to 12% by 2012.

However, increased computing densities allowed the data centre to provide significantly more computing horsepower to the business, without the need to expand the centre’s square footage in the process.

THE DATA CENTRE’S PERFECT STORM

But over the last five years, data centres have been buffeted by three megatrends that have pushed current virtualization technologies to their limits:

  1. The exponential growth of data and the rise of data analytics has exceeded the most aggressive scenarios for many storage and networking infrastructures.
  2. The adoption of cloud computing and the “hybrid cloud” model, where the company data centre shares computing and storage responsibilities with third-party cloud vendors in remote locations.
  3. The increasing reliance of many businesses on always-on technology, which requires IT teams to provision and scale resources rapidly to accommodate new initiatives and business opportunities.

Cloud computing as a whole has also increased the rate of innovation and the expectations of the business, leaving IT teams and their data centres working hard to keep up.

One solution to these challenges is the software-defined data centre, or SDDC: a new architecture that allows IT to deliver applications and network capacity with speed, agility and an eye on the bottom line.

THE SOFTWARE-DEFINED DATA CENTRE

In a software-defined data centre (SDDC) the focus on hardware-related silos is removed. All the essential elements of a computing platform—computing, storage, networking and even security—are pooled, virtualized and implemented through a common set of application programming interfaces, or APIs.

With all of the data centre’s hardware resources pooled together, the computing, storage, networking and security needs of the business can be monitored and provisioned much more rapidly.

Applications experiencing a surge in demand can have more computing power allocated in an instant. New applications can be provisioned just as quickly, speeding deployment and time-to-value. Data-rich analytics reports and backup routines receive the bandwidth they need, only when they need it.
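As a loose illustration of the idea, a toy model of SDDC-style pooling might look like the sketch below. This is not any vendor’s actual API; the class, resource names and figures are all hypothetical, invented only to show how one shared pool and a single programmatic interface replace per-application hardware silos:

```python
# Toy sketch of SDDC-style resource pooling: all capacity sits in one
# shared pool and is allocated to applications through software calls,
# rather than being tied to dedicated hardware silos.

class ResourcePool:
    def __init__(self, cpu_cores, storage_tb):
        self.free = {"cpu": cpu_cores, "storage": storage_tb}
        self.allocations = {}  # app name -> resources carved out for it

    def provision(self, app, cpu, storage):
        """Carve resources for a new application out of the shared pool."""
        if cpu > self.free["cpu"] or storage > self.free["storage"]:
            raise RuntimeError(f"pool exhausted; cannot provision {app}")
        self.free["cpu"] -= cpu
        self.free["storage"] -= storage
        self.allocations[app] = {"cpu": cpu, "storage": storage}

    def scale(self, app, extra_cpu):
        """Respond to a demand surge by granting more compute in software."""
        if extra_cpu > self.free["cpu"]:
            raise RuntimeError(f"pool exhausted; cannot scale {app}")
        self.free["cpu"] -= extra_cpu
        self.allocations[app]["cpu"] += extra_cpu

    def release(self, app):
        """Return an application's resources to the pool for reuse."""
        freed = self.allocations.pop(app)
        self.free["cpu"] += freed["cpu"]
        self.free["storage"] += freed["storage"]

# One pool serves every application; no per-app hardware silos.
pool = ResourcePool(cpu_cores=128, storage_tb=100)
pool.provision("analytics", cpu=16, storage=20)
pool.scale("analytics", extra_cpu=16)  # surge handled in an instant
pool.release("analytics")              # capacity returns for reuse
```

The point of the sketch is the shape of the interface: provisioning, scaling and reclaiming capacity are all software operations against one pool, which is what lets an SDDC react to surges and new deployments far faster than a silo-per-application design.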

It’s a compelling vision, but is it real?

We’ll answer that question in part two, where we’ll predict how soon SDDCs may become a reality, examine the obstacles holding them back today, and identify a few of the vendors to watch as SDDC gains traction in the marketplace.

Posted under Storage, Virtualisation

Ditch the Digital Duct Tape – Consider All Options with Hyper-Converged

Tom Corrigan, Sales Director, Avnet Technology Solutions, UK, explains why the channel is perfectly positioned to take advantage of new modular hyper-converged systems and why there is no time to lose.

The channel is buzzing with terms such as the software-defined data centre, converged systems, hybrid cloud and, more recently, hyper-converged infrastructure. But what does it all mean, and what is the difference between converged and hyper-converged systems? Most importantly, how can the UK channel develop strategies that drive profitable growth from this technology sector?

Analysts are predicting more than 150% market growth over 2015 and beyond, and this dramatic growth has seen a large number of technology start-ups delivering very capable solutions focused on this area. Equally exciting is the breadth of emerging technologies being launched by the established vendors who have strategic relationships with Avnet, both in the UK and across the region.

Alongside this vendor push, we are seeing significant volumes of requests from business partners and their end-user communities for integrated platforms to manage new application deployments, rather than the provision of disparate compute, network and storage systems in legacy fashion. In line with this change in demand, reference architectures and converged systems have emerged as pre-defined combinations of server, storage and network solutions that help simplify platform design for what are often fairly complex workload requirements.

Hyper-converged infrastructure takes this to the next level, offering a fully integrated platform across compute, network, storage and hypervisor that is designed, configured and delivered as a single appliance. This modular design means systems are quick to design, simpler to deploy and can be scaled out by adding more appliances as required.

So what does all this mean for business partners looking to broaden their capabilities beyond selling compute systems, storage arrays and perhaps introducing a hypervisor for virtualisation? With technology that is potentially simpler for end-users to design and deploy, now is the time for partners to expand their skills to include the application stack and the delivery of margin-rich services to support a hyper-converged infrastructure. By taking this approach, opportunities will open up around private cloud and application consulting, in addition to application deployment, which is where the best margin opportunities reside for the channel community.

However, the first step to hyper-converged is to carefully choose which vendors to partner with. Which ecosystems offer the most benefits? While hyper-converged is an emerging technology, it has already been validated by many established vendors. The building blocks of most hyper-converged platforms may not yet be one-size-fits-all, but certainly one size fits many.

Within our global markets, Avnet is seeing this change, and we feel that now is the time to look beyond the digital duct tape that holds disparate hardware and software stacks together and consider a more consolidated approach to delivering applications and related services.

How can Avnet help? Well, for starters, the strategic partnerships we hold globally, regionally and locally are with the leading technology providers, which gives us a huge head start: we can access the technology stack that best fits the customer’s requirements. To supplement this, we have a dedicated in-house technical and sales team focused solely on converged platforms. We also have immense capability to build these systems to order, and at scale across EMEA, from our Tongeren facility, which is certified to the highest levels required by our supplier partners.

Bringing new technologies to market and enabling the channel to capitalise on this clear market opportunity is hugely important as Avnet continues on its journey to transform technology into business solutions for customers around the world.

Posted under IT infrastructure