Until recently, enterprise cloud infrastructure was used primarily by Dev/Ops and marketing organizations, because internal IT could not provision and set up a complete compute platform in a timely manner. Since there were few Cloud Service Providers (CSPs) at the time, AWS captured the mindshare for public cloud workloads. Today the cloud market is extremely competitive, flooded with companies trying to gain market share.
This proliferation of CSPs, combined with Hybrid Cloud, Multi-Cloud, Private Cloud, and SaaS Cloud applications, has created a complex cloud landscape. On average, companies are using three different public clouds and three different private clouds. This raises two questions: why so many clouds, and how can enterprises best utilize these different clouds for their various workloads?
Multi-cloud adoption is the result of enterprises utilizing different clouds for different applications. No single cloud fits all workloads; certain applications simply run better in one cloud environment than another. For example, Big Data applications need high-performance compute with high Input/Output Operations Per Second (IOPS) to accomplish massive analytical calculations. This type of application is better supported in hyperscale cloud server environments: hyperscale servers are designed for distributed workloads that require high-performance CPU, RAM, and IOPS to crunch massive amounts of data quickly. Choosing the wrong cloud provider for this type of workload can be detrimental to the application's performance.
According to Cloud Spectators' 2016 report, the best-performing hyperscale cloud service, Microsoft Azure's new D2 series, outperformed all other cloud providers tested, while SoftLayer ranked at the bottom for hyperscale performance. For normal web-tier application workloads, however, a cloud provider with lower server build requirements at a lower price point would be sufficient.
While public cloud continues to be geared toward web-based applications and test/dev, there is still a need to support legacy applications and highly sensitive enterprise data. This is why private cloud continues to grow in the enterprise: according to the RightScale survey, private cloud adoption increased from 63% in 2015 to 77% this year. The private cloud fits well for critical applications like SAP, SQL Server, and Oracle applications that need higher performance, security, and availability.
Nor can we ignore the older Unix and mainframe systems that enterprises still run in private data center facilities, which represent a significant share of enterprise applications. These legacy applications are not easily migrated to the cloud; to do so, they need to be rewritten and re-architected. This is why the hybrid model continues to grow.
As more applications move to the cloud, we will have to manage, monitor, secure, and deliver those applications at the highest performance across the public internet. This is challenging because of the lack of control and visibility. The continued growth of hybrid cloud and multi-cloud strategies adds further complexity for IT operations to manage, monitor, and maintain. Enterprises must implement new solutions for monitoring, managing, securing, and orchestrating across multiple cloud assets to ensure the highest levels of application performance, visibility, and availability.
Nate Brown is a Solution Sales Engineer for Dyn, a cloud-based Internet Performance company that helps companies monitor, control, and optimize online infrastructure for an exceptional end-user experience.