That's it? No. Even when your data is stored somewhere, it might not be at your disposal 24/7, because of a risk called "Loss of Data Access": a copy of the data exists somewhere, but it will take time to make it usable. No one knows how long restoration or recovery will take, and during that time it may seem as though the data was lost. In September and July this year, Gmail and related services were offline for 40 minutes; similar outages occurred in April and December last year. The list goes on: Netflix services went offline due to an operational failure at Amazon Web Services last year, and in 2011 Amazon EC2 suffered an outage as the result of a network configuration change run amok…
The value of using an alternate cloud-based storage provider for backups is that there is NO guarantee the location in which you currently store your data will be online tomorrow, or will stay online long enough for you to get your data transferred. The prime example is the recent announcement of the Nirvanix closure. The company gave its customers two weeks to move their data elsewhere, but even a customer commanding the majority of the available bandwidth would need days to transfer its data; with that bandwidth shared among every other customer trying to evacuate within the same fixed window, it could take far longer. It is much better to already have the data elsewhere, and not have to dump it all out under deadline.
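To make the bandwidth constraint concrete, here is a back-of-the-envelope calculation; the figures are illustrative assumptions, not Nirvanix's actual numbers. Even with a dedicated 1 Gbps link running flat-out, moving 100 TB takes more than nine days:

```python
def transfer_days(data_terabytes: float, link_gbps: float) -> float:
    """Estimate days needed to move data over a sustained link.

    Assumes the link runs at full rate with no protocol overhead,
    so real-world transfers will take longer.
    """
    bits = data_terabytes * 1e12 * 8          # decimal TB -> bits
    seconds = bits / (link_gbps * 1e9)        # bits / (bits per second)
    return seconds / 86_400                   # seconds -> days

# Illustrative numbers: 100 TB over a dedicated 1 Gbps link.
print(round(transfer_days(100, 1.0), 1))  # ~9.3 days -- with the whole
                                          # pipe to yourself
```

Share that pipe with every other customer rushing for the exit, and a two-week deadline starts to look very tight indeed.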
So what factors do you need to consider when selecting a second cloud-based provider? Generally they should be similar to the factors considered when selecting your primary cloud provider, but they also depend on how the secondary data is being stored. If it is stored as backup data that would need to be restored to the primary service, then you may find that performance considerations are less significant than they were for online access. On the other hand, you might opt to configure this secondary storage as a potential "online" source (many applications can easily be reconfigured to point to a different data source, restarted, and kept operating). Doing so requires a couple of extra considerations: first, the performance requirements become equal to those of the primary; second, you need to be able to replicate data from the primary source to the secondary source in near real time.
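One common way to implement the "reconfigure and restart" approach is to resolve the active data source from configuration at startup, falling back to the replica when the primary fails a health check. A minimal sketch, in which the endpoint URLs and the health-check function are hypothetical stand-ins:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class DataSourceConfig:
    primary: str    # hypothetical primary provider endpoint
    secondary: str  # near-real-time replica at a second provider

def resolve_endpoint(cfg: DataSourceConfig,
                     is_healthy: Callable[[str], bool]) -> str:
    """Return the endpoint the application should point at on startup.

    The replica is only a safe fallback if it is replicated in near
    real time; a stale backup copy would first need a restore step.
    """
    if is_healthy(cfg.primary):
        return cfg.primary
    return cfg.secondary

cfg = DataSourceConfig(primary="https://primary.example.com/data",
                       secondary="https://replica.example.com/data")
# Simulate the primary provider being unreachable:
print(resolve_endpoint(cfg, is_healthy=lambda url: "replica" in url))
```

The design choice this illustrates is why the secondary must match the primary's performance: the moment the application restarts against the replica, the replica *is* the production data source.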
Business continuity requires access to business data. The premise of the cloud is that such data is available anytime/anywhere -- until you can't access it. So, in addition to backing up that data to a different cloud-based storage provider, DO always back it up to a LOCAL resource, or keep a LOCAL copy of the data in the physical possession of the data owner.
The primary motivation has always been business continuity, or at least some form thereof. Cloud-based data storage is only as good as the network connectivity available to reach it. If the circuit(s) are cut by a backhoe, or an upstream provider's capacity is restricted or completely offline, and the only available copy of the data is "in the cloud", then no business can be conducted until connectivity is restored. Always remember -- local data storage is a good alternative for business continuity.
Be aware that the entire backup process is likely to be a bit more complex than it would be if everything were on premises in your own data center. The lack of available tools is precisely why the process is "more complex" in certain scenarios. The inability to easily retrieve data from cloud-based application service providers adds further complexity, especially compared with the straightforward process of using native database server backup tools or built-in operating system backup tools. Some backups may need to be run from the applications themselves, particularly those that are SaaS-based, and some providers offer tools specifically for this purpose. For example, NetSuite provides a tool expressly designed to back up NetSuite cloud-based data into an on-premises database instance.
Most importantly, this effort will likely involve a fair amount of home-grown work. In researching this article, Lawrence Garvin could not find a single provider focused on backing up cloud-based services or storage.
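In the absence of an off-the-shelf product, that home-grown work often amounts to a scheduled export job: page through whatever export interface the SaaS provider exposes and land each batch of records in local (or second-cloud) storage. A minimal sketch, with the page-fetching function standing in for a real provider API (the record layout and filename here are invented for illustration):

```python
import json
from pathlib import Path
from typing import Callable

def backup_records(fetch_page: Callable[[int], list],
                   dest: Path) -> int:
    """Page through a (hypothetical) export API, write JSON lines.

    `fetch_page(n)` returns the n-th page of records, or an empty
    list when the export is exhausted. Returns the record count.
    """
    count = 0
    with dest.open("w", encoding="utf-8") as out:
        page = 0
        while records := fetch_page(page):
            for record in records:
                out.write(json.dumps(record) + "\n")
                count += 1
            page += 1
    return count

# Stand-in for a real provider API: two pages of lead data, then done.
pages = [[{"lead": "Acme"}, {"lead": "Globex"}], [{"lead": "Initech"}], []]
n = backup_records(lambda i: pages[i], Path("leads_backup.jsonl"))
print(n)  # 3 records written locally
```

Run from cron or a task scheduler, a job like this is the "home-grown" second copy: crude, but it exists when the provider does not.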
"The problem with no service provider yet offering a way to back up CloudDataSource 'A' to CloudDataSource 'B' is that the end user and the data owner must unfortunately fend for themselves. Ideally this would be a consideration prior to subscribing to a cloud-based service, but the truth is that very few organizations have actually considered what complications arise in trying to get their Lead & Prospect data out of Salesforce.com. In fact, they have probably not even recognized that having a second copy of that data might be a good thing," Lawrence comments.
Lawrence Garvin is a Head Geek and technical product marketing manager at SolarWinds, a Microsoft Certified IT Professional (MCITP), and a nine-time consecutive recipient of the Microsoft MVP award in recognition of his contributions to the Microsoft TechNet WSUS forum.