The Software-Defined Data Centre Explained: Part II of II

Steve Phillips, CIO, Avnet Inc.

In part one of this article, we examined how data centres have evolved, and why current solutions are leaving some businesses searching for what they believe is the next evolution in data centre architecture: the software-defined data centre (SDDC).

In this post we’ll examine how soon SDDCs may become a reality, what obstacles are holding them back, and identify a few of the vendors to watch as the SDDC takes shape.

HOW PROMISING AND HOW SOON?

According to Gartner’s Hype Cycle for 2014, the SDDC—part of what it refers to as “software-defined anything,” or SDA/SDX—is still firmly in the Cycle’s first stage, where the promise of a technology has yet to be grounded in real-world application.

That hasn’t stopped Gartner from calling software-defined anything “one of the top IT trends with the greatest possible impact on an enterprise’s infrastructure and operations,” according to Computer Weekly.

EARLY OBSTACLES TO ADOPTION

While the potential of SDDC may be great, embracing it is more of a revolution than an evolution. The migration to a virtualized environment could be adopted by traditional data centres as time, budget and business need allowed, with virtualized racks sitting next to traditional hardware racks.

On the other hand, software-defined data centres require common APIs to operate: the hardware can be pooled and controlled by software or it can’t. As a result, companies with significant legacy infrastructures may find it difficult to adopt SDDC in their own environments.

One way for existing data centres to avoid the “all or nothing” approach of SDDC is by embracing what Gartner began referring to as “bimodal IT” in 2014. Bimodal IT distinguishes two modes of IT need:

  • Mode 1 is traditional IT, which places a premium on stability and efficiency for mission-critical infrastructure needs.
  • Mode 2 refers to a more agile environment focused on speed, scalability, time-to-market, close alignment with business needs, and rapid evolution.

A bimodal IT arrangement would allow large legacy IT operations to establish a separate SDDC-driven environment to meet business needs that call for fast, scalable and agile IT resources, while continuing to rely on traditional virtualized environments for applications and business needs that value uptime and consistency above all else.

Over time, more resources could be devoted to the new SDDC architecture as the needs of the business evolve, without requiring the entire data centre to convert to SDDC all at once.

WHAT VENDORS ARE LEADING THE SDDC CHARGE?

Given how different software-defined data centre architectures are from traditional and virtualized environments, it’s a golden opportunity for new and emerging vendors to gain a first-mover advantage on some of the entrenched data centre giants.

APIs: The critical components of SDDC are the APIs that control the pooled resources. OpenStack’s open-source APIs are the front-runner at this point, though many vendors still rely on their own proprietary APIs to control their hardware.
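To make that concrete, here is a minimal sketch of what provisioning through an open API looks like, using the openstacksdk library for Python. The cloud name, image, flavour and network below are placeholders for whatever a real environment defines (for example in a clouds.yaml file), so treat this as an illustration rather than a recipe:

```python
# Illustrative only: asking an OpenStack control plane for a new compute
# instance via its open APIs, instead of racking dedicated hardware.
# "mycloud", the image, flavour and network names are all placeholders.
import openstack

# Connect to a cloud defined in clouds.yaml (assumed here to be "mycloud")
conn = openstack.connect(cloud="mycloud")

# Look up pooled resources by name rather than touching hardware directly
image = conn.compute.find_image("ubuntu-22.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

# The scheduler decides which physical host in the pool actually runs it
server = conn.compute.create_server(
    name="demo-app-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)
```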

Computing & Storage: Emerging players like Nimble Storage and Nutanix are at the forefront of the SDDC movement, but data centre incumbents like IBM, HP, Dell, NetApp, Cisco and EMC are right there with them.

Networking: While Cisco, Juniper and HP command much of the attention in the software-defined networking space, startups like Big Switch and Cumulus Networks are gaining significant market interest, funding and traction as the SDDC model gains momentum.

Converged Infrastructure: Two additional initiatives worth keeping an eye on are VCE and its Vblock solutions, as well as NetApp’s FlexPod integrated infrastructure solutions. These products are designed to meet the needs of both “clean sheet” and legacy IT environments interested in pursuing the bimodal IT approach.

So while the reality of the SDDC may be a few years away for many IT environments with considerable legacy investments, it’s certainly a new and compelling vision for the data centre.

More importantly, it appears to be the solution IT is looking for in the always-on, mission-critical, cloud-ready and data-rich environment we operate in today. Expect to hear more on this topic in future Behind the Firewall blog posts.

Posted under Storage, Virtualisation

The Software-Defined Data Centre Explained: Part I of II

Steve Phillips, CIO, Avnet Inc.

The traditional data centre is being challenged harder than ever to keep up with the pace of change in business and technology.

Three recent megatrends—the growth of data and data analytics, the rise of cloud computing and the increasing criticality of technology to the operations of many businesses—have shown that legacy infrastructures are too inflexible, too inefficient and too expensive for a growing number of businesses.

In this first of two posts on this topic, we’ll briefly recap the evolution of the data centre to this point, examining where and why current architectures are falling short of many businesses’ needs.

THE TRADITIONAL HARDWARE-CENTRIC MODEL

From the beginning, data centres have been built around a hardware-centric model of expansion. Years ago, if your company wanted to launch a new application, the IT team would purchase and provision dedicated servers to handle the computing duties, pairing them with a dedicated storage unit to manage the application database and backup needs.

In order to ensure that the platform could handle surges in demand, IT would provision enough computing power and storage to meet the ‘high water mark’ of each application’s forecast demand. As a result, many servers and storage units spent most of their time running at a fraction of capacity—as little as 8% according to a 2008 McKinsey study.

To make matters worse, these server and storage pairs used a dedicated high-capacity network backbone that kept each platform in its own distinct silo. As a result, data centres were overbuilt by design, overly expensive, and slow to adapt to the changing needs of the business.


THE VIRTUALIZED DATA CENTRE

The adoption of server and storage virtualization technologies in the enterprise about 10 years ago addressed some of these issues by allowing one server or storage array to do the job of many.

Provisioning a portion of a server or storage unit for a new application was faster and less expensive than buying new hardware, and it went some way toward reducing the problem of underutilisation…but not far enough. According to a Gartner report, utilisation rates had only grown to 12% by 2012.


However, increased computing densities allowed the data centre to provide significantly more computing horsepower to the business, without the need to expand the centre’s square footage in the process.

THE DATA CENTRE’S PERFECT STORM

But over the last five years, data centres have been buffeted by three megatrends that have pushed current virtualization technologies to their limits:

  1. The exponential growth of data and the rise of data analytics, which have outstripped the most aggressive capacity forecasts for many storage and networking infrastructures.
  2. The adoption of cloud computing and the “hybrid cloud” model, in which the company data centre shares computing and storage responsibilities with third-party cloud vendors in remote locations.
  3. The increasing reliance of many businesses on always-on technology, which requires IT teams to provision and scale resources rapidly to accommodate new initiatives and business opportunities.

Cloud computing as a whole has also increased the rate of innovation and the expectations of the business, leaving IT teams and their data centres working hard to keep up.

One answer to these pressures is the software-defined data centre, or SDDC: a new architecture that allows IT to deliver applications and network capacity with speed, agility and an eye on the bottom line.

THE SOFTWARE-DEFINED DATA CENTRE

In a software-defined data centre (SDDC) the focus on hardware-related silos is removed. All the essential elements of a computing platform—computing, storage, networking and even security—are pooled, virtualized and implemented through a common set of application programming interfaces, or APIs.

With all of the data centre’s hardware resources pooled together, the computing, storage, networking and security needs of the business can be monitored and provisioned much more rapidly.

Applications experiencing a surge in demand can have more computing power allocated in an instant. New applications can be provisioned just as quickly, speeding deployment and time-to-value. Data-rich analytics reports and backup routines receive the bandwidth they need, only when they need it.
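To illustrate the shift in mindset, here is a deliberately simplified Python sketch of resources being carved out of a shared pool by software. The ResourcePool class and its methods are invented for this post purely to show the idea; real SDDC platforms expose the same concept through their own, far richer APIs:

```python
# Hypothetical sketch only: compute, storage and bandwidth allocated from a
# shared pool by an API call rather than by installing dedicated hardware.
class ResourcePool:
    def __init__(self, vcpus, storage_gb, bandwidth_mbps):
        self.free = {"vcpus": vcpus, "storage_gb": storage_gb,
                     "bandwidth_mbps": bandwidth_mbps}
        self.allocations = []

    def provision(self, app, **request):
        """Carve an application's share out of the shared pool."""
        if any(self.free[k] < v for k, v in request.items()):
            raise RuntimeError(f"pool exhausted for {app}")
        for k, v in request.items():
            self.free[k] -= v
        self.allocations.append((app, request))

# A surge in demand becomes an API call, not a purchase order
pool = ResourcePool(vcpus=512, storage_gb=100_000, bandwidth_mbps=40_000)
pool.provision("order-entry", vcpus=16, storage_gb=500, bandwidth_mbps=200)
pool.provision("order-entry", vcpus=32, bandwidth_mbps=400)   # scale up later
pool.provision("nightly-backup", bandwidth_mbps=2_000)        # only when needed
print(pool.free)
```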

It’s a compelling vision, but is it real?

We’ll answer that question in part two, predicting how soon SDDCs may become a reality, what obstacles are holding them back today, and identifying a few of the vendors to watch as SDDC gains traction in the marketplace.

Posted under Storage, Virtualisation

SaaS and the cloud: the channel should look to service provider models


Kevin Collins, Director Cloud Practice (EMEA)

SaaS (Software as a Service) and the cloud are in a state of flux at the moment, with traditional hardware vendors delivering converged solutions to service providers who, in turn, are looking to grow their IaaS (Infrastructure as a Service) offerings. At the same time, independent software vendors are looking to move their products to SaaS to meet increasing customer demands.


Gartner recently reported that 71 percent of organisations in 10 countries have been using SaaS for less than three years. According to the research, investments in SaaS are expected to increase across all regions; 77 percent of respondents expected to increase spending, while 17 percent planned to keep it stable.

Licence to SaaS

The interest in SaaS comes from end users seeing the benefit of not holding onto a lot of software licences for long periods of time. They now want to be able to use the right quantity of licences ‘on demand.’ If they’re increasing the size of their workforce, they want to be able to scale up their licences easily. The industry is beginning to look to a ‘pay-as-you-go’ approach through SaaS – something service providers have been doing for years and have had plenty of success implementing.


Dispelling SaaS myths

Implementing the SaaS model is not necessarily a ‘cheaper’ solution, but what it can do is focus finances exactly where they’re needed. A company can reduce licence expenditure in ineffective areas of the business while trying out a new system more cost-effectively: it doesn’t have to fork out lump sums up front. This dynamic approach to software usage allows a company to try out whole new business models using specific software solutions for, say, six months, and still keep a grip on IT expenditure.
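As a back-of-the-envelope illustration of that point, the short calculation below compares an upfront licence purchase with running the same system as SaaS for a six-month trial. Every figure is a made-up placeholder rather than a real vendor price; it is the shape of the comparison, not the numbers, that matters:

```python
# Illustrative cost comparison only; all prices are hypothetical placeholders.
users = 50
months = 6

perpetual_licence = 400          # one-off cost per user (invented)
annual_maintenance_rate = 0.20   # 20% of licence cost per year (invented)
saas_per_user_month = 25         # subscription price per user (invented)

upfront = users * perpetual_licence * (1 + annual_maintenance_rate * months / 12)
pay_as_you_go = users * saas_per_user_month * months

print(f"Upfront purchase, first {months} months: £{upfront:,.0f}")
print(f"SaaS subscription over {months} months:  £{pay_as_you_go:,.0f}")
```

If the trial fails, the SaaS spend simply stops; the upfront purchase is a sunk cost either way.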


I believe SaaS is on a journey right now. We started with IaaS, which is becoming more mature; customers are running it quite happily in a number of key areas such as test and dev, DR and big data analysis. This will lead to a larger PaaS (Platform as a Service) requirement as partners start to write their own software that end users will consume as SaaS. Business partners are beginning to evolve their business strategies: while offering IaaS, they are becoming aware that the real opportunity is in applications and application integration. Some partners will switch their focus to building applications in the cloud, while others will integrate these disparate applications into a cohesive business solution; it is this area that is largely untapped and very interesting.


Look at LinkedIn, Facebook, Google Mail and Twitter – they’re all managed independently and all have their own interfaces. Now here is the challenge: could you synchronise all the contacts across these applications? Probably yes. Could you segregate them so LinkedIn and Facebook entries were kept apart but were all in your Gmail? With tags, maybe. Would it be robust and reliable? I doubt it. Could it be done reliably by everyone, without any IT skill? Realistically, no.
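To show why the answer is ‘probably, but not reliably’, here is a hypothetical Python sketch of the very first job any such connector faces: normalising contacts from different services into one common record before synchronisation can even begin. The field names and the fetch_* functions are invented placeholders; each real service has its own API, authentication and rate limits, and that gap is exactly where the integration work lies:

```python
# Hypothetical sketch: normalising contacts from different SaaS silos into a
# common record. The fetch_* calls referenced below are placeholders, not
# real APIs; every real service spells the same information differently.
from dataclasses import dataclass

@dataclass(frozen=True)
class Contact:
    name: str
    email: str
    source: str   # e.g. "linkedin", "facebook", "gmail", "twitter"

def normalise(raw: dict, source: str) -> Contact:
    # Different services use different field names for the same data
    name = raw.get("fullName") or raw.get("display_name") or raw.get("name", "")
    email = (raw.get("emailAddress") or raw.get("email") or "").lower()
    return Contact(name=name.strip(), email=email, source=source)

def merge(contact_lists):
    """De-duplicate by email, remembering which silos each contact lives in."""
    merged = {}
    for contacts in contact_lists:
        for c in contacts:
            if c.email:
                merged.setdefault(c.email, []).append(c)
    return merged

# Usage sketch (fetch_linkedin_contacts / fetch_gmail_contacts are imaginary):
# merged = merge([
#     [normalise(r, "linkedin") for r in fetch_linkedin_contacts()],
#     [normalise(r, "gmail") for r in fetch_gmail_contacts()],
# ])
```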


Moving forward in the big data business world, we really can’t work in those silos. Therefore, as SaaS matures, there is going to be a great opportunity for business partners to integrate those applications and build connectors that will enable data to flow more freely and securely between them. This won’t be a short-term process – it will play out over the next five to ten years.


Key benefits

Essentially, with SaaS, management overheads decrease considerably as the interface must be intuitive and therefore less complicated to use. It should be as easy as setting up a personal Gmail or Yahoo account and managing that on a daily basis. You can offload the software complexity to experts who manage it for you and the interface is something you just ‘plug in’ and use.


Key challenges

A key challenge is security. Let’s take your Google account again: what are the consequences of letting the world see your personal calendar, as a friend of mine did? No big deal, until I pointed out that I could see a party he had held three months ago, which included his address on the invite, and that he was telling the world he was about to go on holiday for two weeks – “so help yourself”. Now map that mistake onto your business, multiply it by your staff count, overlay your compliance obligations, and you can see the risk you are taking. This is where your company’s IT staff will deliver their value in the future. Far from being redundant in this new SaaS world, they will be there to work with the business partner not only to deliver the SaaS the business is demanding, but also to ensure it is done in a compliant, secure and efficient manner.


Learning from service providers

In the future it will become more important to move away from the traditional ‘solution sell’ – in other words, selling a solution, moving on to the next piece of business and returning in two years in the hope of repeating the process.


It’s a ‘hands-on’ approach now, which is new to some partners in the channel. It’s about being in touch with the customer every single month and being flexible enough to meet customer needs. Similar to the way a service provider works, resellers are dealing with smaller amounts of money up front, but over the long term continuous billing can be rolled out to reduce customer churn and improve loyalty. Resellers offer the scalability service providers need. This isn’t going to be a quick win. Resellers need to work with trusted and experienced partners to help them migrate gradually from traditional solution sales today to an annuity-based XaaS business in the future.


Read our second blog in the SaaS series, coming next week, for top tips on selling SaaS.


Posted under Cloud Computing

Healthcare – it’s time for the channel to make a difference



Peter Blythe, Solutions Development Manager at Avnet Technology Solutions UK

Recently, Gartner announced that it believed 2012 would be a year in which channel leaders would redefine how IT operates and how it is employed by enterprises. A strong IT system is essential for an innovative and amplified organisation. Recent research indicates that the healthcare sector is willing to spend more and more of its budget on IT solutions and technology in order to improve patient care, as IT is increasingly used to strengthen the customer experience, as well as to drive operational automation and control.

This is a great opportunity for the channel as long as it delivers good-value solutions, allowing clinicians and IT departments to improve patient care (which is of course of the utmost importance) while also achieving more with less. Healthcare organisations are experiencing a record surge in unstructured data, whilst IT departments and clinical data centres are struggling to leverage the inherent value of clinical data through increased analytics while managing the many medical applications demanded by clinicians and regulators. One key reason data growth can spiral out of control is the way hospitals and trusts are funded. For example, a new medical scanner may be bought using the hospital trust’s charity budget, while the system itself is managed by the IT department and the data is the responsibility of the radiography department. This type of situation further exacerbates uncontrolled data growth and leads to further inefficiencies.

Data is of course essential to any organisation, especially in the healthcare industry, now that everything has become digitised and such high volumes of personal and valuable information are being managed on a daily basis. Recent NHS data showed that 17,000 British males were admitted to hospital between 2009 and 2012 for obstetric appointments (related to childbirth) – the official figures are available at www.HESonline.nhs.uk. Data management errors clearly need to be solved, and reducing data input errors is a good starting point, as illustrated by the new University College Hospital Macmillan Cancer Centre in London, open since April of this year. The centre allows patients to check in at kiosks (like at the airport!) and bar-code technology permits automatic check-in. To make the process even simpler for patients and reduce data errors, a list of all future appointments even appears on screen.

In addition, by bringing clinical and patient data together with technologies like the Hitachi Clinical Repository, clinicians are able to link all of a patient’s records, leading to a more accurate and efficient diagnosis.

I believe that there’s now a chance for the channel to deliver solutions that provide productive and easily accessible data, especially as patient data has to be retained for longer than the technology that holds it.

The increasing use of scanning equipment for diagnosis and treatment shows that the value of technology in patient care is already clear to healthcare organisations and clinicians. The aptly-named “surgery by mouse” is becoming commonplace, with technology providing the ability to carry out intricate keyhole surgery, leading to improved recovery times. However, with the NHS facing the Nicholson Challenge of delivering £20 billion in efficiency savings by 2015, with savings of 4% year-on-year, the channel will need to prove that new solutions will save money and improve patient care. For a long time now healthcare organisations have understood the advantages of technology, so I can’t help but ask: is now the time for the IT channel to catch up and make a difference?

Posted under HealthPath

Cloud Computing: Forget the hype, what are the roles the channel can play?


Stephen Ennis, Director of Services, EMEA

A real opportunity

Having worked with cloud services for some time now, I’ve seen plenty of confusion around what’s reality and what’s hype where the Cloud is concerned. But with well-respected analysts like Gartner predicting that cloud computing services will become a £95.7 billion ($150 billion) market in the next two years, resellers do need to act now to seize a slice of the opportunity.

In my experience, resellers across EMEA want to know how they can step up to take real and practical cloud service propositions to market, but it can be pretty daunting to work out a strategy if they’re not sure how to deal with the Cloud in general. One thing’s for sure: they need to look at achieving high ROI in their local markets.

So… how do they do this? Here are a few tips:

  • Look at expanding your product portfolio to include the most in-demand cloud solutions for your market, quickly developing the technology expertise needed to drive business growth
  • Work with a well-known IT partner who has the necessary skills and who possesses local market knowledge to enable you to develop strong know-how in cloud services
  • Recognise that cloud is not dissimilar to other technologies in that customers need it to be positioned, evaluated, implemented and integrated into their IT environment. Also, feel assured that the channel can fulfil these key customer requirements very well

The roles of the channel:

To work out which cloud strategy will serve you and your customers best, consider the five roles below. You can fit into one or more of these roles, and you may take on different roles depending on the customer engagement.

  1. Cloud Advisors: If you help customers demystify and understand the Cloud and provide advice on key decisions, this is your role. If an IT manager is considering outsourcing some applications, you, as the Advisor, are there to offer counsel on which functions to move to the Cloud, outline the pros and cons, create the migration strategy and complete a risk-versus-reward analysis. Building customer trust and establishing early credibility is the key to being successful as a Cloud Advisor.
  2. Cloud Builders: If you build private Cloud infrastructures, either on- or off-premise, for your customers, you fit into this category. You will deliver cloud solutions, often turn-key, designed and built for your individual customer’s requirements. If you’re a Cloud Builder reseller, you don’t generally own or operate the resulting cloud solution.
  3. Cloud Providers: Your role is similar to that of a Cloud Builder – you create cloud infrastructures. Where you differ is that you’ll deploy this “as a Service” (XaaS) and will host it yourself, making it available to your customers. As a Provider, you need to take more of an advisory role. You help your customers understand business transformation and how to evaluate the financial and technical merits of an off-premise cloud solution.
  4. Cloud Resellers: Quite simply, you sell cloud services from another organisation or a supplier. You help your end user select the correct cloud service(s) and evaluate which solution out of your portfolio best suits an organisation’s needs. For this role, you require in-depth knowledge of your customers’ businesses, which can often be a challenge if you’re not working with the right partner to show customers how to implement the Cloud.
  5. Cloud Integrators: You construct ‘the glue’ between private and public clouds or between traditional IT and other cloud infrastructures. You help take away much of the complexity of cloud solutions by providing customers with fully integrated multi-dimensional solutions whilst incorporating the best of traditional IT and cloud.

The reality of Cloud Computing for the channel

Resellers can take on many roles. If we consider a traditional IT landscape today, you may have a private cloud being built on-premise (Cloud Builder role), which may also include some external cloud services from a third party (Cloud Reseller role), and you would be integrating those together (Cloud Integrator role). In that scenario the reseller would be fulfilling three of the aforementioned channel roles.

Start with defining your role

Using these categories to define your own role in the Cloud Computing phenomenon is a great place to begin. Working with an experienced partner can reassure you about which roles you can fulfil and, more importantly, ensure you’re getting the in-depth training you need to advise customers proficiently. This will enable you to help customers make more informed business decisions for today’s and tomorrow’s cloud requirements.

Posted under Cloud Computing