Cloud Computing: Expect Headaches with Multi-Cloud Applications

By Dick Weisinger

Cloud latency refers to the slowdowns that occur when sending and receiving data across the cloud. Apps running within a single provider, like AWS or Azure, typically have no need to use networks outside that provider and can run with little to virtually no latency.

A number of parameters can affect cloud latency (a simple way to compare them is sketched after this list). These include:

  • The design and topology used by a cloud provider
  • Region pairs for intra-cloud links
  • Distance between the client and the cloud region
  • The ISP used for cloud access
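To get a feel for how these parameters play out for a particular application, one rough approach is to time connections from the client to each endpoint it depends on. Below is a minimal Python sketch of that idea; the hostnames are placeholders, not real services, and TCP connect time is only a coarse proxy for request latency.

    # Minimal sketch: compare round-trip connect times to cloud endpoints.
    # The hostnames below are placeholders -- substitute the regional
    # endpoints your application actually talks to.
    import socket
    import time

    ENDPOINTS = {
        "same-region service": ("service.internal.example.com", 443),
        "cross-cloud service": ("db.other-cloud.example.com", 443),
    }

    def connect_latency_ms(host: str, port: int, samples: int = 5) -> float:
        """Average TCP connect time in milliseconds over several samples."""
        total = 0.0
        for _ in range(samples):
            start = time.perf_counter()
            with socket.create_connection((host, port), timeout=5):
                pass
            total += time.perf_counter() - start
        return (total / samples) * 1000

    if __name__ == "__main__":
        for label, (host, port) in ENDPOINTS.items():
            try:
                print(f"{label}: {connect_latency_ms(host, port):.1f} ms")
            except OSError as err:
                print(f"{label}: unreachable ({err})")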

But what about designing applications built on resources located in different cloud vendors' data centers? The idea that might motivate such a design is to use best-of-breed components: for example, running an Oracle database on Oracle's cloud while the application server, written with Microsoft technology, runs on Azure.

In general, an inter-cloud application design is a bad idea. First, components that need to communicate with each other but run in data centers operated by different vendors will pay latency costs for every exchange that crosses the public internet.
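To see why this matters, consider a chatty request path: each database round trip that crosses the internet pays the inter-cloud round-trip time. The sketch below uses illustrative assumed figures, not benchmarks, to show how that cost compounds.

    # Back-of-the-envelope sketch: how per-call latency compounds for a
    # chatty workload. The RTT values are illustrative assumptions.
    INTRA_CLOUD_RTT_MS = 1.0    # app server and database in the same provider/region
    CROSS_CLOUD_RTT_MS = 30.0   # app server and database in different providers

    def request_latency_ms(rtt_ms: float, db_round_trips: int) -> float:
        """Network time added to one user request by its database round trips."""
        return rtt_ms * db_round_trips

    for round_trips in (1, 10, 50):
        same = request_latency_ms(INTRA_CLOUD_RTT_MS, round_trips)
        cross = request_latency_ms(CROSS_CLOUD_RTT_MS, round_trips)
        print(f"{round_trips:>2} DB calls: same-cloud {same:5.0f} ms, cross-cloud {cross:5.0f} ms")

With these assumed numbers, a request that makes 50 database calls adds about 1.5 seconds of network time cross-cloud versus 50 milliseconds within one cloud.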

Beyond inter-cloud latency, the design significantly adds to the complexity of the solution, and it will be less resilient: using multiple cloud vendors increases exposure to the outages and disruptions that can occur in each provider's infrastructure.

So there is good reason not to build multi-cloud applications. But there are always exceptions. While keeping the possible issues in mind, consider what Mark Carleton, Chief Operating Officer at MESTEC, said about an application designed to run across Oracle and Microsoft cloud data centers:

“We did a proof of concept that basically had three bits to it. We ported our web application to Web App Services (MSFT), which was a breeze. We ported our database to ATP (Oracle). Then there was just a little bit of working with Microsoft and Oracle to make sure that it worked okay over public cloud. If we split those things apart, what’s that going to do for latency and performance? To our relief, and our surprise, we found that it behaved better over public internet than our legacy infrastructure had in a single data center. The benefits of the best of breed PaaS in those two environments far outweighs the disadvantages of having those things on different clouds.”
