Riding the Information Wave: IoT and Big Data Analytics

Max Chan, Vice President, Global Information Solutions, Avnet Technology Solutions Asia Pacific

IoT and its uses in action

There have been a number of exciting developments around the Internet of Things at Avnet lately. First, we kicked off calendar year 2016 by announcing the newly created role of vice president, Internet of Things, filled by Eric Williams, who will steer the company’s global IoT strategy. The previous year, Tim FitzGerald, who had led Avnet’s Cloud Solutions business, was appointed vice president of digital transformation. These key appointments underscore our steadfast commitment to investing in the right resources to capitalise on the opportunities in the rapidly growing IoT market.


Posted under Big data, Internet of Things (IoT), IT infrastructure

The Software-Defined Data Centre Explained: Part I of II

Steve Phillips, CIO, Avnet Inc.

The traditional data centre is being challenged harder than ever to keep up with the pace of change in business and technology.

Three recent megatrends—the growth of data and data analytics, the rise of cloud computing and the increasing criticality of technology to the operations of many businesses—have shown that legacy infrastructures are too inflexible, too inefficient and too expensive for a growing number of businesses.

In this first of two posts on this topic, we’ll briefly recap the evolution of the data centre to this point, examining where and why current architectures are falling short of many businesses’ needs.

THE TRADITIONAL HARDWARE-CENTRIC MODEL

From the beginning, data centres have been built around a hardware-centric model of expansion. Years ago, if your company wanted to launch a new application, the IT team would purchase and provision dedicated servers to handle the computing duties, pairing them with a dedicated storage unit to manage the application database and backup needs.

To ensure that each platform could handle surges in demand, IT would provision enough computing power and storage to meet the ‘high water mark’ of each application’s forecast demand. As a result, much of that server and storage capacity spent most of its time running at a fraction of its potential: as little as 8%, according to a 2008 McKinsey study.
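To put that utilisation figure in context, here is a quick, purely illustrative calculation (the demand numbers below are hypothetical, not taken from the McKinsey study) showing how sizing for the peak leaves capacity idle:

```python
# Hypothetical sizing example: capacity is bought for the forecast peak,
# but the application spends most of its life near its average load.
peak_demand_cores = 48     # cores provisioned to cover the 'high water mark'
average_demand_cores = 4   # cores actually busy in a typical hour

utilisation = average_demand_cores / peak_demand_cores
print(f"Average utilisation: {utilisation:.0%}")  # -> Average utilisation: 8%
```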

To make matters worse, these server and storage pairs used a dedicated high-capacity network backbone that kept each platform in its own distinct silo. As a result, data centres were overbuilt by design, overly expensive, and slow to adapt to the changing needs of the business.


THE VIRTUALISED DATA CENTRE

The adoption of server and storage virtualisation technologies in the enterprise about 10 years ago addressed some of these issues by allowing one physical server or storage array to do the job of many.

Provisioning a portion of a server or storage unit for a new application was faster and less expensive than buying new hardware, and it went some way towards reducing the issue of underutilisation…but not far enough. According to a Gartner report, utilisation rates had only grown to 12% by 2012.


However, increased computing densities allowed the data centre to provide significantly more computing horsepower to the business, without the need to expand the centre’s square footage in the process.

THE DATA CENTRE’S PERFECT STORM

But over the last five years, data centres have been buffeted by three megatrends that have pushed current virtualisation technologies to their limits:

  1. The exponential growth of data and the rise of data analytics have exceeded the most aggressive planning scenarios for many storage and networking infrastructures.
  2. The adoption of cloud computing and the “hybrid cloud” model means the company data centre now shares computing and storage responsibilities with third-party cloud vendors in remote locations.
  3. The increasing reliance of many businesses on always-on technology requires IT teams to provision and scale IT resources rapidly to accommodate new initiatives and business opportunities.

Cloud computing as a whole has also increased the rate of innovation and the expectations of the business, leaving IT teams and their data centres working hard to keep up.

One answer to these challenges is the software-defined data centre, or SDDC: a new architecture that allows IT to deliver applications and network capacity with speed, agility and an eye on the bottom line.

THE SOFTWARE-DEFINED DATA CENTRE

In a software-defined data centre (SDDC), the hardware-centric silos are removed. All the essential elements of a computing platform (computing, storage, networking and even security) are pooled, virtualised and implemented through a common set of application programming interfaces, or APIs.

With all of the data centre’s hardware resources pooled together, the computing, storage, networking and security needs of the business can be monitored and provisioned much more rapidly.

Applications experiencing a surge in demand can have more computing power allocated in an instant. New applications can be provisioned just as quickly, speeding deployment and time-to-value. Data-rich analytics reports and backup routines receive the bandwidth they need, only when they need it.
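As a purely illustrative sketch of what provisioning through a common API can look like, the snippet below models a simple SDDC control plane in Python. The class, method and field names are hypothetical and do not correspond to any specific vendor’s product; the point is only that compute, storage, networking and security are requested declaratively from a shared pool rather than by installing dedicated hardware.

```python
from dataclasses import dataclass, field

@dataclass
class SDDCControlPlane:
    """Hypothetical stand-in for an SDDC's API layer (illustrative only)."""
    deployments: dict = field(default_factory=dict)

    def provision(self, spec: dict) -> str:
        # Allocate pooled compute, storage, network and security for an application.
        self.deployments[spec["name"]] = dict(spec)
        return spec["name"]

    def scale(self, name: str, vcpus: int) -> None:
        # Grow an application's compute allocation in software, with no new hardware.
        self.deployments[name]["compute"]["vcpus"] = vcpus

sddc = SDDCControlPlane()

app_spec = {
    "name": "order-analytics",
    "compute": {"vcpus": 8, "memory_gb": 32},
    "storage": {"capacity_gb": 500, "tier": "ssd", "backup": "nightly"},
    "network": {"bandwidth_mbps": 1000, "isolated": True},
    "security": {"firewall_profile": "internal-only"},
}

deployment = sddc.provision(app_spec)  # drawn from the shared resource pool
sddc.scale(deployment, vcpus=32)       # respond to a surge in demand in an instant
```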

It’s a compelling vision, but is it real?

We’ll answer that question in part two, predicting how soon SDDCs may become a reality, examining the obstacles holding them back today, and identifying a few of the vendors to watch as the SDDC gains traction in the marketplace.

Posted under Storage, Virtualisation

Waking up to the value in the IT estate: Are you sitting on a gold mine?


Simon Ellis, Director Estate Management Solutions and Services EMEA

Imagine a world where your sales force receives real-time customer opportunities derived from your own data. Where those opportunities not only improve your top and bottom line but also lead to new service and product business, improved renewal rates and deeper insight into customer estates. A world that allows you to grow your business whilst making you even more valuable to your customers. Interested? Thought so.

Managing and governing an IT estate using CMDBs (configuration management databases) and ITAM (IT asset management) tools and processes provides today’s organisations with a wealth of untapped potential: opportunities for cost optimisation, cross-sell, up-sell and new services. Exploiting this potential gold mine allows the channel to deliver enhanced levels of service excellence and deeper value to internal and external customers, suppliers and partners. The challenge for the majority of players in the IT channel is how to enable and then unleash that potential and cash in on the data gold mine.

Recent years have seen the rise of data analytics and business intelligence (BI), with organisations moving their IT spend, and their focus, towards sweating their investments and gaining business advantage by leveraging their business data. It could be argued that this is not the case in the channel, with its various (and traditionally spreadsheet-based) new business, renewal management and customer asset management solutions.

Many of today’s asset management solutions remain siloed: contracts managed in one system, renewals in another, procurement in a third, CMDBs and discovery databases elsewhere still… the list goes on.

This is understandable given how the channel, and indeed the end customer, has traditionally operated: with a separation of interests between procurement, IT and, of course, the business itself. Moreover, market dynamics within the channel very much drive a focus on transactional net new business.

It has therefore never been more important to get a single view of your own and your customers’ estates, across contracts, renewals and installed base, as they are sold, renewed or managed. I believe end-to-end Estate Management with intelligent mining of data is where the channel needs to evolve in order to remain competitive and to prosper.

The proliferation of big data shows what is possible when you bring different data sets together to augment each other: they add dimensionality to one another, and with deep insight real value can be derived.

When transactional data is brought together with historical trends, new sales with renewals, and sales-out data with vendor cross-sell and up-sell rules, new insights and opportunities can be delivered. Joining up the informational supply chain (vendor, distribution, reseller and end customer) allows ‘real-time’ opportunities to be uncovered.
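As a minimal, hypothetical sketch of that joining-up (the customers, products, column names and rules below are invented purely for illustration), merging an installed-base extract with vendor cross-sell rules and filtering on upcoming renewal dates is enough to surface a simple opportunity list:

```python
import pandas as pd

# Invented, simplified data sets; names and columns are illustrative only.
installed_base = pd.DataFrame({
    "customer": ["Acme", "Globex", "Initech"],
    "product": ["Storage Array X", "Server Y", "Storage Array X"],
    "support_expires": pd.to_datetime(["2016-09-30", "2017-03-31", "2016-07-31"]),
})

cross_sell_rules = pd.DataFrame({
    "product": ["Storage Array X", "Server Y"],
    "suggested": ["Backup Appliance Z", "Virtualisation Suite V"],
})

# Join the install base to the vendor rules, then flag renewals due this quarter:
# two data sets that individually say little become an actionable opportunity list.
opportunities = installed_base.merge(cross_sell_rules, on="product", how="left")
due_soon = opportunities[opportunities["support_expires"] < "2016-10-01"]
print(due_soon[["customer", "suggested", "support_expires"]])
```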

Estate Management differs from pure ITAM by delivering insight into opportunities for all elements in the channel: the reseller, vendor, distributor and, ultimately, the end customer.

Why does the channel need to wake up to this now?

  • Barriers to entry – some key players are doing this already; if you don’t act, they will be able to take food from your table
  • Differentiation – the data you hold allows you to differentiate your service offerings from others in the marketplace, helping you evolve from being just another commodity player
  • Deeper value – your customers will see greater value from you, because you generate insight, opportunity and value not just for yourself but also for them
  • Smart partnering – channel players joining up to share data and intelligence in order to enter and disrupt new markets is going to become more commonplace. Who are your partners?

In short, new players are coming up behind you and will continue to enter this market, using data to derive new opportunities for themselves and their partners. Market conditions remain tough, and the ability to identify new, value-creating opportunities automatically from data you already hold is surely something we all need to wake up to.

For more information, please visit http://www.ts.avnet.com/uk/value_added_services/partner_services/estate_management/ or email estatemanagement@avnet.com.

Posted under Estate Management