Cloud Computing Simply Isn’t That Scary Anymore? Who Says?

Are You Scared of Cloud Technology?

Cloud computing just isn’t as scary as it once was to companies and their CIOs. A new survey of 785 companies finds a meager 3% considering it to be too risky — down from 11% last year. Only 12% say the cloud platform is too immature, and that’s down from 26% a year ago. Furthermore, 50% of the survey respondents now say they have “complete confidence” in the cloud — up from 13% a year ago.

Of course, looking at it another way, that means 50% aren’t quite comfortable. But still, cloud is finding its way into day-to-day business. Hold the press! Who took this survey anyway? We all know surveys can make subjects lean one way or another based on how the questions are phrased... well, ahem... whaddya know, it’s a venture capital firm that holds many cloud computing investments... hmmm, no bias there! No reason for them to care one way or another whether cloud service providers (most of whom they have invested heavily in) go under or not! No way!

So let’s get this straight: a company that invests billions (with a BIG B) in cloud service providers such as Amazon and others decides to create a totally honest and unbiased survey and foist it onto the market as fact that “cloud technology is no longer scary to a majority of companies and their CIOs”... IMAGINE THAT 🙂

They claim you’re no longer scared of cloud technology

These are the findings of a new survey (or should we call it a marketing push instead of a “survey”?) conducted by North Bridge Venture Partners. The survey had a lot of industry support behind it (read: influential people who have invested heavily in cloud technology), sponsored by 39 companies, including Amazon Web Services, Rackspace, Eucalyptus, and Glasshouse, to name a few.

They claim that “the bottom line is cloud is now just considered the normal way to implement software solutions”.
Give me a break. The numbers are nowhere near 50% who are okay with cloud technology, much less 50% who even understand “cloud technology,” according to most unbiased surveys. If you take an unbiased and honest look, or happen to work in the industry as I do (take a look at IDC, Gartner and others who conduct yearly surveys on almost everything in technology, including cloud technology), you’ll quickly find that acceptance levels, and even understanding of what cloud technology is all about, are somewhere in the 25%–30% range and nowhere near the 50% these jokers claim.

So, is a survey underwritten by 39 cloud companies likely to say anything but good stuff about cloud? They had to make it look real, so they threw in a few things that in reality are killing the cloud market right now (like security, identity management, data tracking and compliance, which have no solutions currently, so people are rightly concerned!). Their survey claims: “There are still concerns that came up in the data. Security remains the primary inhibitor to adoption in the burgeoning cloud marketplace, with 55% of respondents identifying it as a concern. The implications of regulatory compliance (38%) also loom large, as do concerns about vendor lock-in (32%). Interestingly, pricing and expenses are way at the bottom of the deal-breaker list.”

Anyway folks, if you believe this survey is ANYTHING BUT MARKETING HYPE, I have some nice ocean-front property in Arizona to sell you….cheap!


If you did not click the links above, you can find the survey here:

Designing Big Data Storage Infrastructures

Big Data Storage Infrastructures

Big Data turns the traditional storage problem upside down. Up until now the growth in storage consumption has been looked at as a negative, something to be dealt with. Storage growth and the need to retrieve information quickly is a continual challenge and has been handled by begrudgingly adding more capacity and by encouraging policies that would have users delete or move unnecessary information to different and less expensive storage tiers. The concept of Big Data takes a different perspective. Growth in data is no longer being viewed as more bytes to store but as more information to mine for value. It assumes that if information is a weapon then the more of it that’s available the better.

What is Big Data?

Big Data is essentially a combination of processes and infrastructure that allow organizations to capture, analyze and manipulate large sets of structured (information in databases) and unstructured (file) data with the eventual goal of extracting value from that data. The particular value will differ from organization to organization, but it may be assembling the right information to more accurately forecast the weather in the near term, or using similar data to predict when climate shifts may occur over the long term. Large data sets may also be used to identify changes within trading patterns that an investment firm can use to guide investment strategies, or to spot business trends that a manufacturer or web services provider can take advantage of to build a new ‘got-to-have’ product sooner.

Data Growth is the Common Denominator

While the above examples cross industries and data types, the common thread of Big Data is the need to initially capture the complete set of raw data. Often there is no advance warning of which data will need to be accessed at any given point in time. In many cases much of the data set needs to be online so that all the stored data points can be queried. As the rationale goes, the more data there is, the more accurate the predictions drawn from that data can be. This ‘most of the data, most of the time’ requirement puts new demands on the storage infrastructure that supports Big Data.

Requirements for Big Data Storage Infrastructure

Legacy storage infrastructures are not well suited for the ‘no limits’ storage strategy that Big Data imposes. While these capacities can be purchased upfront, doing so leads to tremendous unused capacity in the initial stages of deployment; it is more cost effective to add capacity as needed. A Big Data infrastructure needs to scale capacity while maintaining performance to provide rapid answers when it is queried. It also needs to be simple to manage. Old paradigms that assign numbers of storage administrators to terabytes owned will no longer be economically feasible, as a single storage administrator will need to manage petabytes of information in order to keep the cost equation in line. With these requirements, the ideal infrastructure for Big Data may be scale-out storage.

The Need For Scale

The ability to scale is potentially the most critical aspect of a Big Data storage infrastructure. Big Data capacities are measured in hundreds of terabytes (TBs) to start with and are growing to tens of petabytes (PBs). The challenge that Big Data places on storage infrastructures is not necessarily what the starting or ending capacity will be, but the rate of growth within that capacity range. In the legacy scale-up storage system, where the ability to support a given capacity is purchased upfront, the start-up cost of a system capable of scaling into the PBs would often be too great to justify. It’s simply unreasonable to expect the standard scale-up infrastructure to cover this range in capacity demands.

Finally, the fundamental architectures that are the foundation of traditional scale-up storage were designed decades ago with much smaller capacities in mind. As a result, the typical 16TB volume size limits are minuscule compared to the needs of Big Data. Even when those size limits are increased by a factor of five, they will come up short of what typical Big Data needs will be. Scale-out NAS systems like Isilon’s are much better suited to this type of Big Data environment. Scale-out NAS is a file-based storage system comprised of a group of servers clustered together as a single system. Each file server, called a “node,” can be added to the cluster in real time without taking the system down. Each time a node is added to the system it increases the storage capacity, processing performance and network I/O bandwidth, allowing the system to grow these three parameters in unison.
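The scale-out idea can be sketched in a few lines of code. This is a purely illustrative model, not Isilon’s actual implementation, and the per-node figures are made-up round numbers: the point is only that adding a node grows capacity, processing power and network bandwidth together rather than independently.

```python
# Hypothetical model of a scale-out NAS cluster. Each node contributes
# capacity, CPU and network bandwidth, so aggregate performance grows
# in step with aggregate capacity. All numbers are illustrative.

class Node:
    def __init__(self, capacity_tb=36, cpu_ghz=8, net_gbps=4):
        self.capacity_tb = capacity_tb
        self.cpu_ghz = cpu_ghz
        self.net_gbps = net_gbps

class ScaleOutCluster:
    def __init__(self):
        self.nodes = []

    def add_node(self, node):
        # Nodes join the cluster online; downtime is not modeled here.
        self.nodes.append(node)

    @property
    def capacity_tb(self):
        return sum(n.capacity_tb for n in self.nodes)

    @property
    def cpu_ghz(self):
        return sum(n.cpu_ghz for n in self.nodes)

    @property
    def net_gbps(self):
        return sum(n.net_gbps for n in self.nodes)

cluster = ScaleOutCluster()
for _ in range(4):
    cluster.add_node(Node())

print(cluster.capacity_tb)  # 144 TB across four nodes
print(cluster.net_gbps)     # 16 Gbps aggregate network I/O
```

Contrast this with a scale-up array, where the controller’s processing and I/O are fixed at purchase time and only the disk shelves grow.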

The Need For Performance

In addition to scaling capacity, a Big Data storage infrastructure needs to scale performance, measured by storage processing power, disk I/O and network I/O. Queries on Big Data could span billions of discrete files or billions of rows within databases, and a fast response time can mean meeting a production deadline, warning thousands of people of an impending disaster or cutting months off a medical breakthrough. The challenge is that in legacy storage architectures the right amount of storage processing power, disk I/O and network bandwidth has to be planned for and purchased upfront. With Big Data, accurately predicting long-term storage growth and, more importantly, affording the upfront cost of that required performance is impractical. Once again scale-out NAS provides an answer, since it can be purchased in a relatively small capacity upfront, 50TBs for example, and then scaled into the PBs if needed as more data comes in and as more data mining is performed. Each node brings the additional processing and I/O required to support that growth. In other words, the scale-out NAS becomes faster as its capacity increases. That additional power can be used to manage data protection processes and storage optimization techniques like cloning, snapshots and automated tiering. The ability to scale network I/O is equally critical, since most Big Data initiatives support big compute infrastructures that consist of dozens, hundreds or thousands of servers, all generating potentially thousands of requests.

The Need For “One-ness”

Big Data also has a preference that all the data assets appear to be on a single volume, which is a challenge for traditional file systems and NAS solutions. While some of these file systems have just recently broken the 16TB limit, many are still bound to it. Very few can support anything more than 100TB volume sizes, and most have unclear roadmaps for how they will expand beyond 100TB. Volume size limitations like these will quickly become a problem in the Big Data infrastructure, if they have not already, and the Big Data storage administrator will be forced to manually chain multiple volumes of information together or write complex queries that do. Randomness of queries is one of the essential components of Big Data, and the storage infrastructure chosen for it should support a single volume of virtually limitless capacity, allowing all the data to be accessed in a single location. There is also the reality of administration. Multiple volumes lead not only to complex queries but also to complexity in traditional IT processes, like capacity allocation and data protection. Without the single volume, storage administrators will be forced to manually inspect each volume to decide which is the most suitable candidate for a new load of data. Also, since new data sets can come in large chunks, no single volume may have enough free space to handle the new inbound data. This will lead to data being migrated back and forth between volumes to clear out enough space. In similar fashion, the data protection process becomes more complicated.

As new storage is added, because current volumes cannot support inbound data sets, this storage has to be assigned to new volumes. It’s very easy for the backup administrator to miss the addition of these new volumes, and data sets can go unprotected for weeks until the oversight is discovered. While many categories of storage can easily have additional capacity plugged into them, very few can actually have that capacity automatically assigned and available to the applications that need it. With a multi-volume approach, capacity allocation has to stop until a storage administrator decides which of the new capacity should be added to which volume. Often this is not a very scientific process, and accuracy of assignment is sacrificed for speed of capacity delivery. With the single-volume approach, all capacity is instantly added to one volume and instantly available to the application; decisions don’t have to be made and accuracy is not sacrificed. In fact, most storage managers will find that capacity utilization goes up dramatically with the one-volume approach, since no capacity is being held captive on less active volumes.
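The stranded-capacity problem is easy to demonstrate with a toy example (the volume sizes here are invented for illustration): three traditional volumes can hold plenty of free space in total, yet a large inbound data set may fit none of them individually, while a single pooled volume would absorb it without any migration.

```python
# Illustrative only: why a single large volume avoids stranded capacity.
# Three fixed volumes each have 6 TB free (18 TB free in total), but an
# inbound 10 TB data set does not fit on any one of them.

volumes_free_tb = [6, 6, 6]   # free space per traditional volume
inbound_tb = 10               # size of the new data set to load

# Multi-volume approach: the data set must land on one volume.
fits_some_volume = any(free >= inbound_tb for free in volumes_free_tb)

# Single-volume approach: all free space is one pool.
fits_single_pool = sum(volumes_free_tb) >= inbound_tb

print(fits_some_volume)  # False: no individual volume can take it
print(fits_single_pool)  # True: the pooled capacity can
```

In the multi-volume case the administrator would have to shuffle existing data between volumes to open up a 10TB hole; in the single-volume case the load simply proceeds.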

The Need For Economies

Finally, the entire Big Data storage infrastructure has to be affordable. Once again the scale-out architecture has an advantage here, because capacity and performance can be added in a near-linear fashion. “Pay as you grow,” which has been the hallmark of scale-out NAS, turns into “earn as you grow” in the Big Data context, since typically the more data that’s available to the application and the business line owners, the better and more informed the decisions can be. One of the ways to maximize these investments is to leverage storage tiering like that available within Isilon’s OneFS operating environment. This allows sections of the Big Data asset to be stored on less expensive SATA storage and then promoted to higher-performance SAS, Fibre Channel or even solid-state storage as query activity justifies it.
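A tiering policy of this kind boils down to a simple rule. The sketch below is a hypothetical policy, not OneFS’s actual logic (the threshold and tier names are assumptions): data starts on inexpensive capacity storage and is promoted once its query activity crosses a threshold.

```python
# Hypothetical sketch of query-driven tiering: cold data sits on cheap
# SATA; once a data set is queried often enough, it is promoted to a
# faster tier. The threshold and tier names are made up for illustration.

PROMOTE_AFTER = 3  # assumed policy: promote after 3 queries

def place_tier(query_count):
    """Return the storage tier a data set belongs on."""
    if query_count >= PROMOTE_AFTER:
        return "performance (SAS/SSD)"
    return "capacity (SATA)"

datasets = {"clickstream-archive": 1, "trades-2012": 7}
placement = {name: place_tier(hits) for name, hits in datasets.items()}
print(placement)
# rarely queried data stays on SATA; hot data moves to the fast tier
```

A real implementation would also demote data as it cools and would move it in the background, but the economics are the same: only the actively queried slice of the Big Data asset pays for premium storage.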

Big Data is more than just the domain of scientists looking for cures or internet companies mining search logs to identify new services to monetize. It’s also becoming a key resource for businesses looking to optimize business processes, improve production cycles and increase profitability. These organizations are coming to understand that information is indeed power, and it’s becoming the responsibility of the data storage infrastructure to house and deliver that power. A Big Data storage infrastructure is a unique storage project compared to most other environments. While the requirements of Big Data, such as the ability to scale capacity and performance while maintaining reliability and cost effectiveness, are similar to those of other data center storage projects, what makes Big Data unique is the extreme to which it takes these requirements. Speed to scale, flexibility to scale and a watchful eye on cost are all critical, and it’s these extreme requirements for which scale-out NAS is best suited.

By George Crump, Senior Analyst


Cloud Marketing Hype 2012

2012 Hype Cycle for Emerging Technologies Identifies “Tipping Point” Technologies That Will Unlock Long-Awaited Technology Scenarios

2012 Hype Cycle Special Report Evaluates the Maturity of More Than 1,900 Technologies

STAMFORD, Conn., August 16, 2012—

Big data, 3D printing, activity streams, Internet TV, Near Field Communication (NFC) payment, cloud computing and media tablets are some of the fastest-moving technologies identified in Gartner Inc.’s 2012 Hype Cycle for Emerging Technologies.

Gartner analysts said that these technologies have moved noticeably along the Hype Cycle since 2011, while consumerization is now expected to reach the Plateau of Productivity in two to five years, down from five to 10 years in 2011. Bring your own device (BYOD), 3D printing and social analytics are some of the technologies identified at the Peak of Inflated Expectations in this year’s Emerging Technologies Hype Cycle (see Figure 1).

Gartner’s 2012 Hype Cycle Special Report provides strategists and planners with an assessment of the maturity, business benefit and future direction of more than 1,900 technologies, grouped into 92 areas. New Hype Cycles this year include big data, the Internet of Things, in-memory computing and strategic business capabilities.

The Hype Cycle graphic has been used by Gartner since 1995 to highlight the common pattern of overenthusiasm, disillusionment and eventual realism that accompanies each new technology and innovation. The Hype Cycle Special Report is updated annually to track technologies along this cycle and provide guidance on when and where organizations should adopt them for maximum impact and value.

The Hype Cycle for Emerging Technologies report is the longest-running annual Hype Cycle, providing a cross-industry perspective on the technologies and trends that senior executives, CIOs, strategists, innovators, business developers and technology planners should consider in developing emerging-technology portfolios.

“Gartner’s Hype Cycle for Emerging Technologies targets strategic planning, innovation and emerging technology professionals by highlighting a set of technologies that will have broad-ranging impact across the business,” said Jackie Fenn, vice president and Gartner fellow. “It is the broadest aggregate Gartner Hype Cycle, featuring technologies that are the focus of attention because of particularly high levels of hype, or those that Gartner believes have the potential for significant impact.”

“The theme of this year’s Hype Cycle is the concept of ‘tipping points.’ We are at an interesting moment, a time when many of the scenarios we’ve been talking about for a long time are almost becoming reality,” said Hung LeHong, research vice president at Gartner. “The smarter smartphone is a case in point. It’s now possible to look at a smartphone and unlock it via facial recognition, and then talk to it to ask it to find the nearest bank ATM. However, at the same time, we see that the technology is not quite there yet. We might have to remove our glasses for the facial recognition to work, our smartphones don’t always understand us when we speak, and the location-sensing technology sometimes has trouble finding us.”

Figure 1. Hype Cycle for Emerging Technologies, 2012

Source: Gartner (August 2012)

Although the Hype Cycle presents technologies individually, Gartner encourages enterprises to consider the technologies in sets or groupings, because so many new capabilities and trends involve multiple technologies working together. Often, one or two technologies that are not quite ready can limit the true potential of what is possible. Gartner refers to these technologies as “tipping point technologies” because, once they mature, the scenario can come together from a technology perspective.

Some of the more significant scenarios, and the tipping point technologies that need to mature so that enterprises and governments can deliver new value and experiences to customers and citizens, include:

Any Channel, Any Device, Anywhere — Bring Your Own Everything

The technology industry has long talked about scenarios in which any service or function is available on any device, at any time and anywhere. This scenario is being fueled by the consumerization trend that is making it acceptable for enterprise employees to bring their own personal devices into the work environment. The technologies and trends featured on this Hype Cycle that are part of this scenario include BYOD, hosted virtual desktops, HTML5, the various forms of cloud computing, silicon anode batteries and media tablets. Although all these technologies and trends need to mature for the scenario to become the norm, HTML5, hosted virtual networks and silicon anode batteries are particularly strong tipping point candidates.

Smarter Things

A world in which things are smart and connected to the Internet has been in the works for more than a decade. Once connected and made smart, things will help people in every facet of their consumer, citizen and employee lives. There are many enabling technologies and trends required to make this scenario a reality. On the 2012 Hype Cycle, Gartner has included autonomous vehicles, mobile robots, Internet of Things, big data, wireless power, complex-event processing, Internet TV, activity streams, machine-to-machine communication services, mesh networks: sensor, home health monitoring and consumer telematics. The technologies and trends that are the tipping points to success include machine-to-machine communication services, mesh networks: sensor, big data, complex-event processing and activity streams.

Big Data and Global Scale Computing at Small Prices

This broad scenario portrays a world in which analytic insight and computing power are nearly infinite and cost-effectively scalable. Once enterprises gain access to these resources, many improved capabilities are possible, such as better understanding customers or better fraud reduction. The enabling technologies and trends on the 2012 Hype Cycle include quantum computing, the various forms of cloud computing, big data, complex-event processing, social analytics, in-memory database management systems, in-memory analytics, text analytics and predictive analytics. The tipping point technologies that will make this scenario accessible to enterprises, governments and consumers include cloud computing, big data and in-memory database management systems.

The Human Way to Interact With Technology

This scenario describes a world in which people interact much more naturally with technology. The technologies on the Hype Cycle that make this possible include human augmentation, volumetric and holographic displays, automatic content recognition, natural-language question answering, speech-to-speech translation, big data, gamification, augmented reality, cloud computing, NFC, gesture control, virtual worlds, biometric authentication methods and speech recognition. Many of these technologies have been “emerging” for multiple years and are starting to become commonplace; however, a few stand out as tipping point technologies, including natural-language question answering and NFC.

What Payment Could Really Become

This scenario envisions a cashless world in which every transaction is an electronic one. This will provide enterprises with efficiency and traceability, and consumers with convenience and security. The technologies on the 2012 Hype Cycle that will enable parts of this scenario include NFC payment, mobile over the air (OTA) payment and biometric authentication methods. Related technologies will also impact the payment landscape, albeit more indirectly. These include the Internet of Things, mobile application stores and automatic content recognition. The tipping point will be surpassed when NFC payment and mobile OTA payment technologies mature.

The Voice of the Customer Is on File

Humans are social by nature, which drives a need to share — often publicly. This creates a future in which the “voice of customers” is stored somewhere in the cloud and can be accessed and analyzed to provide better insight into them. The 2012 Hype Cycle features the following enabling technologies and trends: automatic content recognition, crowdsourcing, big data, social analytics, activity streams, cloud computing, audio mining/speech analytics and text analytics. Gartner believes that the tipping point technologies are privacy backlash and big data.

3D Print It at Home

In this scenario, 3D printing allows consumers to print physical objects, such as toys or housewares, at home, just as they print digital photos today. Combined with 3D scanning, it may be possible to scan certain objects with a smartphone and print a near-duplicate. Analysts predict that 3D printing will take more than five years to mature beyond the niche market.

Additional information is available in “Gartner’s Hype Cycle for Emerging Technologies, 2012.” The Special Report includes a video in which Ms. Fenn provides more details regarding this year’s Hype Cycles, as well as links to the 92 Hype Cycle reports.

Mr. LeHong and Ms. Fenn will provide additional analysis during the Gartner webinar “Emerging Technologies Hype Cycle: What’s Hot for 2012 to 2013” today, August 16, at 10 a.m. EDT and 1 p.m. EDT. To register for one of these complimentary webinars, please visit Gartner’s website.

Worldwide IT Outsourcing Services Spending on Pace to Surpass $251 Billion in 2012

Cloud Computing Growth

Key Issues Facing ITO Industry to Be Examined at Gartner Outsourcing & Strategic Partnerships Summits 2012, September 10-12 in Orlando and October 8-9 in London

STAMFORD, Conn., August 7, 2012— Worldwide spending for IT outsourcing (ITO) services is on pace to reach $251.7 billion in 2012, a 2.1 percent increase from 2011 spending of $246.6 billion, according to the latest outlook by Gartner, Inc.

The fastest-growing segment within the ITO market is cloud compute services, which is part of the cloud-based infrastructure as a service (IaaS) segment. Cloud compute services are expected to grow 48.7 percent in 2012 to $5.0 billion, up from $3.4 billion in 2011.

“Today, cloud compute services primarily provide automation of basic functions. As next-generation business applications come to market and existing applications are migrated to use automated operations and monitoring, increased value in terms of service consistency, agility and personnel reduction will be delivered,” said Gregor Petri, research director at Gartner. “Continued privacy and compliance concerns may, however, negatively impact growth in some regions, especially if providers are slow in bringing localized solutions to market.”

Data center outsourcing (DCO), a mature segment of the ITO market, represented 34.5 percent of the market in 2011, but it will decline 1 percent in 2012. “The data center outsourcing market is at a major tipping point, where various data center processing systems will gradually be replaced by new delivery models through 2016. These new services enable providers to address new categories of clients, extending DCO from traditional large organizations into small or midsize businesses,” said Bryan Britz, research director at Gartner.

The application outsourcing (AO) segment is expected to reach $40.7 billion, a 2 percent increase from 2011 spending of $39.9 billion. This growth reflects enterprises’ needs to manage extensive legacy application environments and their commercial off-the-shelf packages that run the business.

“Change is afoot in the AO market. The burdens of managing the legacy portfolio, along with the limitations of IT budgets, have shifted the enterprise buyers to be cautious and favor a more evolutionary approach to other application services, such as software as a service (SaaS),” said Mr. Britz. “New applications will largely be packaged and/or SaaS-deployed in order to extend and modernize the portfolio in an incremental manner. While custom applications will remain ‘core’ for many organizations, the trend in the next few years to SaaS enablement in the cloud will reflect in the growth of the AO outlook.”

While there will be some impact from the ongoing business slowdown due to sovereign-debt issues in Europe and slowing exports in China, Gartner expects the ITO market in the emerging Asia/Pacific region to represent the highest growth of all regions.

Spending on ITO in the Asia/Pacific region will grow 1 percent in U.S. dollars in 2012 and exceed 2.5 percent growth in 2013. With the exception of Japan, Australia, New Zealand, and to a lesser degree, Singapore and Hong Kong, the countries in Asia/Pacific are quite new in terms of outsourcing usage, understanding and sophistication. The growth is being driven by the large inflow of capital into Asia over the past three to five years, leading to the need among global and regional businesses to scale up their operations.

In North America, Gartner expects that buyers will seek to transition more IT work to annuity-managed service relationships for cost take-out and more predictable IT costs. This will keep ITO growing through 2016. Enterprises’ reluctance to hire or make large capital purchases, as well as their pursuit of asset-light IT strategies, continues to push clients toward consuming externally provided services.

A challenging economic scenario that worsened in late 2011 continues to affect the government policies and end-user sentiment in many key European countries, resulting in a forecast for Western Europe ITO growth decline of 1.9 percent in U.S. dollars during 2012. Reinvigorated economic pressure is delaying the willingness of many commercial organizations to focus on enhancing competitiveness rather than cost reduction. In addition, the European public sector will continue to see a cautious budget environment throughout 2012. This will force many central and local government entities to concentrate on outsourcing initiatives aimed at reducing IT cost through IT efficiencies and rationalization.

Additional information is available in the report “Forecast Analysis: IT Outsourcing, Worldwide, 2010-2016, 2Q12 Update,” which is available on Gartner’s website.

About the Gartner Outsourcing & Strategic Partnerships Summits 2012

The Gartner Outsourcing & Strategic Partnerships Summit series provides an in-depth exploration of the significant developments and trends shaping vendor and strategic sourcing management practices, as well as the sourcing marketplace.

For additional details about the Gartner Outsourcing & Strategic Partnerships Summit 2012, taking place September 10-12 in Orlando, Florida, please visit the event website. Members of the media can register by contacting Janessa Rivera.

The Gartner Outsourcing & Strategic Partnerships Summit 2012 in London will be held October 8-9. More information is available on the event website. Members of the media can register by contacting Laurence Goasduff.

Additional information from the event will be shared on Twitter using #GartnerOUT.

Analysis of Cloud Computing for 2012

Figure 1. Cloud Computing Agenda Overview

Cloud computing allows new relationships between those that provide solutions based on technology and those that consume them. As cloud computing matures and adoption grows, businesses continue to explore its potential. But as cloud computing has become more real for the enterprise, there has been massive confusion as to which options are most appropriate for use, and when.

Cloud computing heralds an evolution of business — no less influential than the era of e-business — in potentially positive and negative ways. Virtualization, service orientation and the Internet have converged to sponsor a phenomenon that enables individuals and businesses to choose how they’ll acquire or deliver IT services, with reduced emphasis on the constraints of traditional software and hardware licensing models.

A continuing trend in business computing is the use of outsourcing to shift work from inside an organization to a responsible service provider. This trend is part of the appeal of cloud computing, as businesses seek to divest themselves of computer resources, but retain (or enhance) the value associated with the use of these resources.

Cloud computing affects more than just businesses. The use of the cloud has become commonplace for individual consumers as they acquire solutions based on technology with a reduced need for IT specialists to assist in this effort. The use of cloud solutions brought into the enterprise by individuals is shifting the way IT organizations respond to the demands of their users. This leads to a shift in the technologies these IT organizations buy from, and a shift in which vendors they depend on. Thus, cloud computing is changing user expectations, business reactions, and the vendors and markets that supply them.

Key Issues

· How should enterprises exploit cloud computing?

· How will architectures and techniques evolve to support the many flavors of cloud computing?

· How will cloud computing evolve?

· What vendors, markets and industries will be transformed by cloud computing?

How should enterprises exploit cloud computing?

Cloud computing implementations are becoming real, and we are learning lessons from the field. Cloud computing will shift the way purchasers of IT products and services contract with vendors (as well as the way those vendors deliver their wares). IT has the ability not only to consume services, but also to provide cloud services by leveraging cloud-enabled technology. IT can increasingly position itself as a broker between internal and external services. Using distributed computing resources, global-class design, new data models, and Web-centric architectures and languages, internal and external organizations can provide cloud computing services, and can potentially offer platforms for building and delivering new applications.

Given the economics of the cloud and the new business models emerging around the delivery of cloud-based services, organizations could create and deliver these new applications at a lower cost, compared with conventional approaches. However, many technological and business models are, as yet, unproven. While there are many areas in which IT can leverage the cloud, there are challenges in adapting an enterprise’s culture, skills, management, integration and vendor management strategies. Anyone, regardless of perspective or role (user or vendor, consumer or provider), may end up being a consumer, provider or broker of cloud services. IT shops in particular need to define their actions at all three levels (especially as brokers between their customers and external rather than internal providers). Those providing private cloud services need to begin to think of themselves as vendors, even if the customers they sell to are very constrained (internal).

How will architectures and techniques evolve to support the many flavors of cloud computing?

Companies can exploit cloud-based services in a variety of ways to develop an application or a solution. The least disruptive approach is to continue using traditional tools and techniques, and to exploit a virtualized pool of compute and storage services to host the application. A more sophisticated model is to build a program that will uniquely exploit cloud-centric distributed and parallel processing capabilities, and run the resulting program in the cloud. Developers can also create and execute applications internally, and simply access external applications, information or process services via a mashup model. All these approaches demand new skills and techniques to build, deploy, manage and maintain applications.

Hybrid cloud computing refers to the combination of external public cloud computing services and internal resources in a coordinated fashion to assemble solutions. Hybrid cloud computing implies significant integration or coordination between internal and external environments. Hybrid cloud computing can take a number of forms, including cloudbursting, where an application is dynamically extended from a private cloud platform to an external public cloud service based on the need for additional resources. More ambitious approaches define a solution as a series of granular services, each of which can run in whole or in part on a private cloud platform or on a number of external cloud platforms, with execution dynamically determined based on changing technical, financial and business conditions.
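The cloudbursting idea described above can be sketched as a simple placement decision: run on private capacity while it lasts, and burst the remainder to a public provider. The function, units and thresholds below are purely illustrative, not any real provider's API:

```python
# Hypothetical sketch of a cloudbursting placement decision.
# "Units" stand in for whatever resource is being scheduled (VMs, cores).

def place_workload(requested_units, private_capacity, private_in_use):
    """Run on the private cloud while capacity remains; burst the
    remainder to a public cloud service when demand exceeds it."""
    private_free = private_capacity - private_in_use
    to_private = min(requested_units, private_free)
    to_public = requested_units - to_private  # the "burst" portion
    return {"private": to_private, "public": to_public}

print(place_workload(30, private_capacity=100, private_in_use=85))
# 15 units fit locally; the other 15 burst to the public cloud
```

A real implementation would make this decision dynamically and continuously, weighing the changing technical, financial and business conditions the paragraph above mentions, rather than a single static capacity number.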

How will cloud computing evolve?

What sets cloud computing apart from traditional outsourcing and hosting approaches is the focus on both outcome and consumption model. Behind the scenes, providers use particular design models, architectures, technologies and best practices to instantiate and support the delivery of an elastically scalable, service-based environment serving multiple constituents. Providers may have created custom hardware, software and/or processes to deliver the service.

Enterprise IT users have long desired a more agile, flexible and service-based environment for the delivery of internal applications and services. As cloud computing gains momentum, and as particular approaches prove it is possible to lower costs and provide greater flexibility, there is the potential to apply the lessons to internal systems, as well as to leverage external systems.

What vendors, markets and industries will be transformed by cloud computing?

Cloud computing is a disruptive force. The impact will be huge for IT vendors. As new business models evolve and spread beyond consumer markets, much will change. Consumer-focused vendors are the most mature in delivering a “global-class” offering from technology and community perspectives; most investment in recent years has occurred in consumer services. Business-focused vendors have rich business services and robust technologies. While, at times, they are very mature in selling these offerings, they also face wholesale changes as a result of the disruption.

Many vendors from different perspectives (traditional IT vendors, Web-centric vendors or vendors from other businesses) that have not been technology providers will play an important role in the overall cloud market. Vendors will comprise those that provide cloud services directly, those that provide them through intermediaries (as cloud services brokers) and those that provide cloud-enabling technology that providers can use (whether public or private). Cloud computing is affecting industries in varying ways. Some highly regulated industries are somewhat limited in what they can do today, while others are leading the charge in cloud adoption.

Related Priorities

Key Initiatives address significant business opportunities and threats, and typically have defined objectives, substantial financial implications and high organizational visibility. They are typically implemented by a designated team with clear roles, responsibilities and defined performance objectives. Table 1 shows the related priorities.

Table 1. Related Priorities

Application Development: Methods and practices for developing, deploying and maintaining custom software applications.
Application and Integration Platforms: Application infrastructure is essential software (middleware) that executes and integrates business applications on-premises and in the cloud.
Cloud Computing: A style of computing where scalable and elastic IT-related capabilities are provided as a service to customers, using Internet technologies.
SOA and Application Architecture: Modern application architectures, such as service-oriented architecture (SOA), event-driven architecture (EDA) and representational state transfer (REST), help create agile business applications.
Portal and Web Strategies: Web computing comprises diverse technologies and business solutions, ranging from simple, traditional websites to sophisticated enterprise portals and mobile applications.
Virtualization: Virtualization detaches workloads and data from the functional side of physical infrastructure, enabling unprecedented flexibility and agility for storage, servers and desktops.

Source: Gartner (February 2012)

By David Mitchell Smith | David W. Cearley

Gartner Says Consumers Will Spend $2.1 Trillion on Technology Products and Services Worldwide in 2012

Internet Services Spending 2012

Technology Spending 2012

Gartner Says Consumers Will Spend $2.1 Trillion on Technology Products and Services Worldwide in 2012

Spending will continue to grow at a faster rate than in the past, at around $130 billion a year, to reach $2.7 trillion by the end of 2016.

STAMFORD, Conn., July 26, 2012— Consumers will spend $2.1 trillion worldwide on digital information and entertainment products and services in 2012, according to Gartner, Inc. This amounts to a $114 billion global increase compared with 2011, and spending will continue to grow at a faster rate than in the past, at around $130 billion a year, to reach $2.7 trillion by the end of 2016.
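The projection is easy to sanity-check: starting from $2.1 trillion in 2012 and adding roughly $130 billion each year lands close to the forecast figure by the end of 2016 (a rough check only; Gartner's own model is more detailed than a flat annual increment):

```python
# Rough sanity check of the forecast: $2.1T in 2012, growing by
# about $0.13T per year through the end of 2016.
spend = 2.1  # trillions of dollars, 2012
for year in range(2013, 2017):
    spend += 0.13
print(round(spend, 2))  # ~2.62, in line with the ~$2.7T figure
```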

The $2.1 trillion consists of what consumers will spend on mobile phones; computing, entertainment, media and other smart devices; the services required to connect these devices to the appropriate networks; and the software and media content consumed via these devices.

“The three largest segments of the consumer technology market are, and will continue to be, mobile services, mobile phones and entertainment services,” said Amanda Sabia, principal research analyst at Gartner. “There are two product classes, which in terms of absolute dollars are significantly smaller, but offer tremendous growth by 2016. These are mobile apps stores and e-text content. We fully expect consumers to more than triple their spending in these latter two categories by 2016.”

Mobile services are expected to generate 37 percent of total worldwide consumer technology spending in 2012 — that is $0.8 trillion — rising to almost $1 trillion by 2016. Mobile phones will account for 10 percent of total spending in 2012 — that is $222 billion — rising to almost $300 billion by 2016. Similarly, entertainment services — cable, satellite, IPTV and online gaming — will account for 10 percent of total consumer spending on technology products and services in 2012, at $210 billion, rising to almost $290 billion in 2016.

Gartner predicts that consumer spending on mobile apps stores and content will rise from $18 billion in 2012 to $61 billion by 2016, and that spending on e-text content (e-books, online news, magazines and information services) will rise from $5 billion in 2012 to $16 billion by 2016.

“Our research consistently shows that consumers are willing to pay for content they deem ‘worth it,’” Ms. Sabia said. “However, our research has also found that consumers are willing to tolerate an ad-supported business model in exchange for free functions and content such as personal cloud storage, social networking, information searching, email, IM, person-to-person (P2P) voice (Skype and mobile voice over IP [VoIP]), streaming/downloading video and musical content when accessing the Internet.”

The inter-relationships among the various segments are becoming more critical. For example, new multidevice rate plans being announced by U.S. mobile carriers are enabling consumers to get more from their devices. These persistent connections to more phones, tablets and mobile PCs will increase the value of the entire ecosystem and will drive hardware sales. Partnerships among vendors in different segments are needed to build the bridges among the various platforms and deliver simpler solutions.

Additional information is available in the Gartner report “Market Trends: Worldwide Consumer Tech Spending,” available on Gartner’s website.

Cloud Security

Seven cloud-computing security risks you should be aware of!

Cloud Security

Cloud Security

Cloud computing is picking up traction with businesses, but before you jump into the cloud, you should know the unique security risks it entails.

Cloud computing is fraught with security risks, according to Gartner. Smart customers will ask tough questions and consider getting a security assessment from a neutral third party before committing to a cloud vendor, the firm advises in a June report titled “Assessing the Security Risks of Cloud Computing.”

Cloud computing has “unique attributes that require risk assessment in areas such as data integrity, recovery, and privacy, and an evaluation of legal issues in areas such as e-discovery, regulatory compliance, and auditing,” the report says.

Amazon’s EC2 service and Google’s Google App Engine are examples of cloud computing, which is a type of computing in which “massively scalable IT-enabled capabilities are delivered ‘as a service’ to external customers using Internet technologies.”

[Learn more about what cloud computing really means and the new breed of utility computing and platform-as-a-service offerings. Give us a call at 888-498-4333 and we can help you understand everything there is to know about your Cloud choices]

Customers must demand transparency, avoiding vendors that refuse to provide detailed information on security programs. Ask questions related to the qualifications of policy makers, architects, coders and operators; risk-control processes and technical mechanisms; and the level of testing that’s been done to verify that service and control processes are functioning as intended, and that vendors can identify unanticipated vulnerabilities.

Here are seven of the specific security issues customers should raise with vendors before selecting a cloud vendor.

1. Privileged user access. Sensitive data processed outside the enterprise brings with it an inherent level of risk, because outsourced services bypass the “physical, logical and personnel controls” IT shops exert over in-house programs. Get as much information as you can about the people who manage your data. “Ask providers to supply specific information on the hiring and oversight of privileged administrators, and the controls over their access.”

2. Regulatory compliance. Customers are ultimately responsible for the security and integrity of their own data, even when it is held by a service provider. Traditional service providers are subject to external audits and security certifications. Cloud computing providers who refuse to undergo this scrutiny are “signaling that customers can only use them for the most trivial functions.”

3. Data location. When you use the cloud, you probably won’t know exactly where your data is hosted. In fact, you might not even know what country it will be stored in. Ask providers if they will commit to storing and processing data in specific jurisdictions, and whether they will make a contractual commitment to obey local privacy requirements on behalf of their customers.

4. Data segregation. Data in the cloud is typically in a shared environment alongside data from other customers. Encryption is effective but isn’t a cure-all. “Find out what is done to segregate data at rest.” The cloud provider should provide evidence that encryption schemes were designed and tested by experienced specialists. “Encryption accidents can make data totally unusable, and even normal encryption can complicate availability.”

5. Recovery. Even if you don’t know where your data is, a cloud provider should tell you what will happen to your data and service in case of a disaster. “Any offering that does not replicate the data and application infrastructure across multiple sites is vulnerable to a total failure”. Ask your provider if it has “the ability to do a complete restoration, and how long it will take.”

6. Investigative support. Investigating inappropriate or illegal activity may be impossible in cloud computing. “Cloud services are especially difficult to investigate, because logging and data for multiple customers may be co-located and may also be spread across an ever-changing set of hosts and data centers. If you cannot get a contractual commitment to support specific forms of investigation, along with evidence that the vendor has already successfully supported such activities, then your only safe assumption is that investigation and discovery requests will be impossible.”

7. Long-term viability. Ideally, your cloud computing provider will never go broke or get acquired and swallowed up by a larger company. But you must be sure your data will remain available even after such an event. “Ask potential providers how you would get your data back and if it would be in a format that you could import into a replacement application.”
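Question 4 above asks what a provider does to segregate data at rest. One common mechanism behind a good answer is deriving a distinct data-at-rest key per customer from a master key, so one tenant's key can never decrypt another tenant's data. The sketch below illustrates only the key-separation idea (it is not production cryptography, and the master key is an illustrative placeholder standing in for one held in an HSM):

```python
# Minimal sketch of per-tenant key separation (not production crypto).
import hashlib
import hmac

MASTER_KEY = b"example-master-key-from-an-hsm"  # illustrative placeholder

def tenant_key(tenant_id: str) -> bytes:
    # HMAC-SHA256 derivation yields an independent 32-byte key per tenant;
    # knowing one tenant's key reveals nothing about another's.
    return hmac.new(MASTER_KEY, tenant_id.encode(), hashlib.sha256).digest()

assert tenant_key("customer-a") != tenant_key("customer-b")
```

A provider's real answer would also cover where the master key lives, who can use it, and how the scheme was reviewed — exactly the "designed and tested by experienced specialists" evidence the list asks for.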

Securing and Managing Cloud Computing


In our most recent Hype Cycle on cloud computing (“Hype Cycle for Cloud Computing, 2011”), Gartner positioned cloud computing firmly atop the Peak of Inflated Expectations — nearly identical to where we had it in 2010. While there certainly is an enormous amount of hype around cloud computing overall, Gartner has seen rapid adoption of software as a service (SaaS), early-adopter clients testing infrastructure as a service (IaaS) and platform as a service (PaaS), and strong growth in cloud delivery of security services. For these reasons, we’ve given this Spotlight on securing and managing cloud a “how to” focus, rather than the “What does it mean?” focus we took in our 2010 cloud security Spotlight.

Figure 1 shows why we think this is an important time to be thinking through how you will secure your business’s use of cloud computing. Gartner has identified the typical stages that enterprises will go through, from data center virtualization, to private cloud, to hybrid cloud use, where “cloudbursting” to external cloud resources augments local data center/private cloud capacity. In Gartner’s most recent survey of data center managers, close to half believe they will be using hybrid cloud (see “Design Your Private Cloud With Hybrid in Mind”) by 2015, and almost one-third believe they will be delivering private cloud capabilities. In order to do either of these things, IT architectures and processes will need to change and extend — and the same is true for security capabilities. Now is the time to be planning how to ensure company and customer data can be protected when public and private cloud services are used, and also how security policies and architectures can take advantage of cloud delivery to actually increase levels of security.

Figure 1. Progress Toward Virtualization

An important input to Gartner research is the constant stream of inquiries from Gartner clients, both directly to Gartner analysts and via searches. In “Top 10 Gartner Client Inquiries in Cloud Security,” we look at the trends that data reveals. Gartner clients have moved from asking tutorial questions about cloud security to looking for ways to evaluate the relative levels of security of various cloud offerings. The demographics of the search data show early interest by the services and education industries, as well as small businesses.

Gartner also performs primary research, surveying industry IT decision makers on a wide variety of topics. In January 2012, we conducted a survey of information risk management professionals in the U.S., Canada, the U.K. and Germany. The survey population was 425 respondents with security or risk responsibility at firms with at least 500 employees and $500 million in revenue. “Survey Analysis: Assessment Practices for Cloud, SaaS and Partner Risks, 2012” shows the results of that survey compared to previous years. One of the key findings shows that organizations of all sizes are increasingly willing to place their data in the cloud, and while the percentage of those that have formalized processes for the assessment of the associated risk has increased, 50% still do not have such formalized processes in place.

One of the reasons it is important for enterprises to assess the risk of using cloud-based services is that they will always retain the ultimate liability for loss or exposure of customer information in the event of a security incident involving those cloud services. The only way to limit your liability is to have the appropriate clauses in your contracts with cloud service providers. “IT Procurement Best Practice: Nine Contractual Terms to Reduce Risk in Cloud Contracts” provides guidance on the critical contract language to include. “Best Practices for Limiting Data Protection Exposure in the Cloud” provides Gartner’s recommendations for addressing liability around the loss of access to sensitive data stored or processed in cloud services.

The best way to limit your liability, of course, is to avoid security incidents by minimizing vulnerabilities and using effective security controls. Many organizations have made significant progress in increasing the security levels of their own infrastructures, and will need to extend those security processes out to cloud services. “Application Security Testing of Cloud Services Is a Must” lays out several models for assuring that cloud-based applications are fully tested for application-level vulnerabilities prior to operational use. Of course, there is no such thing as software that stays vulnerability-free over time, and patching of software will always be required. “Extending Patch and Vulnerability Management to the Cloud” details several scenarios for ensuring that SaaS, IaaS and PaaS offerings meet the requirements for patch management.

While vulnerabilities can enable external attackers to compromise systems and inflict severe business damage, accidental data disclosure attacks have been a constant problem. The cloud represents another potential path for unintended data leakage or loss. “Data Security Monitoring for the Cloud: Challenges and Solutions” describes the major issues around maintaining data-aware security controls when cloud services are used, and details recommended approaches for minimizing the risks.

Most enterprises will end up using a mix of cloud services across SaaS, PaaS and IaaS, along with a continuing inventory of internally hosted applications. Extending identity and access management across this array of services will require new security processes and new delivery mechanisms for authentication and authorization controls. When new employees join the company, they will now need to be provisioned into external cloud-based applications as well as into internal applications. More importantly, when their privileges change or when they leave the company, they will need to have their access rights modified or deleted across that increasingly complex array of applications. “A Guide to Making the Right Choices in the Expanding IDaaS Market” details Gartner’s view of how identity and access management services will need to evolve to deal with this, while the recently published “Magic Quadrant for SOA Governance Technologies” compares products that are increasingly playing a role in supported identity federation across business partners and cloud services.
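The joiner/leaver problem described above can be pictured as a loop over every connected application, internal and cloud alike. The application names and the entitlement store here are hypothetical; a real IDaaS product would call each application's provisioning API instead:

```python
# Sketch of deprovisioning a departing user across a mixed portfolio of
# internal and cloud applications. Names are illustrative only.

CONNECTED_APPS = ["internal-erp", "saas-crm", "iaas-console", "email"]

def deprovision(user, grants):
    """Remove the user's entitlements from every connected application."""
    for app in CONNECTED_APPS:
        grants.get(app, {}).pop(user, None)  # revoke if present, skip if not
    return grants

grants = {"saas-crm": {"alice": "admin"}, "email": {"alice": "user", "bob": "user"}}
print(deprovision("alice", grants))  # alice removed everywhere; bob untouched
```

The hard part in practice is not the loop but the connectors: each SaaS, PaaS and IaaS offering exposes its own API and its own notion of an entitlement, which is exactly why the research above treats IDaaS and identity federation as their own market.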

Every time IT adds new delivery mechanisms for applications, security needs to quickly add those same delivery mechanisms. This was true when we moved from mainframe to client/server, client/server to Internet, and so on — and it will be true as the move to adding cloud-based delivery grows. Cloud-based delivery of security will be required for most security controls — identity and access management as a service is an example of this. “The Growing Adoption of Cloud-Based Security Services” provides Gartner’s projections for how rapidly other areas of security will add cloud-based delivery.

We have focused the majority of this Spotlight on securing the use of public cloud services, but private cloud services also require security processes and controls to evolve and adapt. For most organizations, private cloud will happen before any use of public IaaS or PaaS, so the security and management decisions made for private cloud can ease the path for secure use of public cloud. “Make Optimizing Security Protection In Virtualized Environments a Priority” provides guidelines for focusing on the best ways to roll out private cloud services as securely as possible. “How to Build An Enterprise Cloud Service Architecture” points out that both private and hybrid cloud services will require cultural and political changes inside the IT organization to enable the automation of predefined planning, policies, service levels and automated actions on the run-time environment, as opposed to the manual initiation of scripts or workflows.
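The shift described above — from manually initiated scripts to predefined policies acted on automatically by the run-time environment — can be pictured with a toy rule. The threshold and action name are illustrative only:

```python
# Toy illustration of policy-driven automation: the policy is defined once,
# and the runtime evaluates it continuously instead of waiting for an
# operator to run a script by hand.

POLICY = {"max_cpu_pct": 80, "action": "add_instance"}

def evaluate(metrics):
    """Return the predefined action when the policy threshold is breached."""
    if metrics["cpu_pct"] > POLICY["max_cpu_pct"]:
        return POLICY["action"]
    return None

print(evaluate({"cpu_pct": 91}))  # add_instance
print(evaluate({"cpu_pct": 40}))  # None
```

The cultural and political change the text refers to is agreeing up front on what POLICY says — and then letting the system act on it without a human in the loop.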

To tie many of these concepts together, we’ve included a case study of the process a financial organization went through to implement hybrid cloud. “Case Study: Securing the Cloud” describes the security features that enabled a high-value application running on internal data centers to be “cloudburst” to a public IaaS provider to support business demands for global, elastic service delivery.

By John Pescatore @ Gartner, Inc.

Startup Skyera Aims to Place Affordable SSD in Every Data Center

Skyera Skyhawk Storage Appliance

Skyera Skyhawk Storage Appliance

By Jarrett Neil Ridlinghafer

Skyera, a solid-state drive (SSD) storage startup, has started offering its new Skyhawk family of rackable SSD storage solutions at an affordable $3 per gigabyte.

Using consumer-grade multi-level SSD components has enabled Skyera to offer what it calls the first affordable solid-state storage on the market, and the company hopes to compete with traditional storage solutions for your IT dollars.

Skyera is offering the Skyhawk at three capacity points: 12, 22, and 44 terabytes, priced at $48,000, $77,000 and $131,000, respectively, according to StorageReview. The Skyhawk is expected to ship in early 2013.
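Taking the quoted tier prices at face value (and using decimal units, 1 TB = 1,000 GB — the vendor's own math may differ), a quick check shows that only the largest configuration actually approaches the headline $3/GB figure:

```python
# Price per gigabyte for the three Skyhawk tiers quoted above
# (capacities in terabytes, prices in dollars; decimal TB -> GB).
tiers = {12: 48_000, 22: 77_000, 44: 131_000}
for tb, price in tiers.items():
    print(f"{tb} TB: ${price / (tb * 1000):.2f}/GB")
# 12 TB: $4.00/GB
# 22 TB: $3.50/GB
# 44 TB: $2.98/GB
```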

You can find out more at the company’s website.