Data portability is now one of the main issues companies face when using cloud services, especially with a hybrid cloud strategy.
Several factors create this situation. First, some cloud providers use proprietary data formats, meaning that information can’t easily be moved to an alternative provider. Second, the volumes of data involved in such a move may be considerable, making high-bandwidth connectivity essential if downtime is to be kept to a minimum.

This is on top of the perennial issues around data, including sovereignty, stewardship, and data protection laws. These factors tend to militate against moving to a new provider, even if prices or service might be better elsewhere.
This leads to data portability becoming an important consideration. If a cloud provider cannot meet its customers’ expectations, or worse, if it goes bust, those firms need to be able to move their data quickly, easily and with an absolute minimum of downtime.
This issue is especially critical in a hybrid strategy, where several different platforms are all in the mix.
“As people move to hybrid cloud they want to use multiple clouds,” said Martin Warren, cloud solutions marketing manager, EMEA, NetApp. “A company might start out with a private cloud, then it might consume some public cloud services, then it might use multiple cloud providers. It may also want to use hyperscalers like Amazon, Microsoft or IBM.
“The issue is that it’s fine turning compute services on, it’s easy to turn on and off, but data is often left behind. You don’t own the network, the compute or the storage, but you do own the data, it remains your responsibility.”
Warren explained that a new approach is to place a firm’s data “next to the cloud”, rather than in it.
“You can store the data in a co-location facility, and that data would then be linked to the compute of a cloud provider, which could be Amazon or Microsoft or others. The customer then gets total security, as no data is actually in the cloud, but instead it’s next to it. They still get all the benefits of the cloud, but fewer risks.”
Warren added that he still hears from customers that they worry about security and vendor lock-in when they approach cloud suppliers. Indeed this chimes with Computing’s own research. In 2014, Computing surveyed its readers and asked: “What concerns, if any, do you have about using cloud applications, platforms and/or infrastructures?”
The results were conclusive: 64 per cent of respondents cited data security as their biggest worry, and 54 per cent chose “fears of lock-in or over-reliance on an external company” (respondents could select multiple answers – other less popular choices included “data latency/accessibility” and “data sovereignty/ownership”).
“Our customers tell us as they move into the cloud they’re concerned about vendor lock-in and loss of control,” continued Warren. “And security is a big part of that. They like the cloud, they see the benefits, but they have concerns around security, it’s a question of confidence from the customer’s perspective. They need to be totally convinced before they put important data into those environments.”
Warren concluded on an encouraging note, suggesting that industry fears are generally being dispelled as more firms have positive experiences with cloud services.
“Fears are starting to dissipate, as people get more into the cloud and get more experience with it, they find it’s not quite as they feared. That confidence spreads by word of mouth, as people talk to their peers,” he added.
This issue will be explored in depth in Computing’s upcoming free web-seminar ‘Data portability – the missing piece in the hybrid cloud puzzle’, broadcast live on Wednesday, 29 April at 11am BST.
