The consensus on the best way to manage IoT device data has flipped. Advocates of keeping the information on in-house computers have lost their majority in a massive swing of opinion. The conventional wisdom about cloud management has changed from ‘why?’ to ‘why not?’, writes Nick Booth.
The in-house party isn’t yet so small that it could lose its deposit – but it’s heading that way.
Hima Mukkamala, the senior vice president and general manager of IoT Cloud Services at Arm, has an explanation. The public cloud deploys faster and offers low entry costs, minimises the customer’s need for a data centre infrastructure and changes the financing from the rigid formality of capital investment to a flexible operational cost.
Enterprises that run their own systems on-premises, on the other hand, have greater control and governance of their data, says Mukkamala. “This supports faster decision-making, tighter security and a system that runs even where there’s no internet coverage. So a water company that’s governed by strict regulations on where its data can go and who can access it is better served by on-premises deployments.”
In this niche of the Internet of Things (IoT) the cloud doesn’t fit every data manager’s philosophy, and localised control can take priority. Machine-to-machine (M2M) communications run to a different rhythm because, as automatons, neither end is constrained by the fatigue, capacity or input-speed limitations of humans. That gives machines a voracious appetite for the consumption – and therefore the transport and storage – of data, whether in greater volumes, at a fraction of the delay, or both.
Data protection and data management are much easier in the cloud, says Druva’s chief technologist, W. Curtis Preston. Why? Because the likes of Amazon Web Services (AWS) and Microsoft, with Azure, have built their entire business on honing these specialist skills. “So there is no way you can do it better or cheaper than they can. There are no savings to be made trying to build your own infrastructure,” says Preston.
You can’t get better security than AWS, which has the most closely vetted data centres in the world, according to Preston. “The cloud isn’t the wild west any more so nobody needs to lay their own roads or build their own bank,” he says.
There are political reasons why you should never run your data management systems on your own premises, Preston says. These are rooted in human foibles rather than machine failings, so there is no technical fix.
Backup and recovery is a vital job, but it is incredibly boring and unrewarding. It’s a low-status job that nobody wants, so it invariably defaults to the office junior. That is a fatal mistake, given that data is the lifeblood of the company: emergency transfusions should not depend on the operational skills of the hospital’s work-experience trainee.
“I’ve been doing this job 25 years and never seen anyone want to do the backup and recovery,” says Preston. This means that on-premises backup and recovery dangerously compromises performance, the likelihood of recovery and security.
By contrast, software as a service (SaaS) for data protection is fine-tuned by almost perfect competitive market conditions and a skills base matured by decades of experience.
The only circumstance in which data protection is not compromised by DIY savings is when the amassed data is static, unchanging and in such vast volumes that the economies of scale outweigh the cost of paying a service provider – which is rare, since service providers do everything more efficiently. It would take something like a single data centre holding tens or hundreds of petabytes of uniform data. On the whole, IoT is unlikely to generate static data; IoT deployments are dynamic by nature.
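The break-even point described here can be sketched with a rough, purely illustrative calculation. Every price and volume below is an assumption invented for this sketch, not a quoted tariff from any provider:

```python
# Back-of-envelope storage cost comparison at large, static scale.
# ALL prices and volumes are hypothetical, for illustration only.

GB_PER_PB = 1_000_000  # gigabytes in a petabyte (decimal)

def monthly_storage_cost(petabytes: float, price_per_gb_month: float) -> float:
    """Monthly storage bill for a given volume at a flat per-GB price."""
    return petabytes * GB_PER_PB * price_per_gb_month

# Assumed prices: cloud cold storage vs. fully amortised on-premises
# hardware, power and staff -- the latter only plausible at huge scale.
CLOUD_PRICE = 0.004    # $/GB-month (assumed)
ON_PREM_PRICE = 0.002  # $/GB-month (assumed, large-scale amortisation)

volume_pb = 100  # a single data centre with ~100 PB of uniform, static data
cloud = monthly_storage_cost(volume_pb, CLOUD_PRICE)
on_prem = monthly_storage_cost(volume_pb, ON_PREM_PRICE)
print(f"cloud: ${cloud:,.0f}/month, on-prem: ${on_prem:,.0f}/month")
```

At these assumed figures the on-premises bill only undercuts the cloud because the fixed costs are spread across an enormous, unchanging volume; at typical IoT scales the arithmetic reverses.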
Far more likely are circumstances in which thousands of devices each give off tiny records that are unlikely to be subject to rapid, constant examination – for example, devices that measure temperature and humidity in an environment that rarely changes.
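To see just how tiny those records are, here is a minimal sketch of one such reading as a JSON payload. The field names and values are invented for illustration and not taken from any particular platform’s schema:

```python
import json

# A hypothetical temperature/humidity reading; field names are
# assumptions, not any real device's schema.
record = {
    "device_id": "th-0042",
    "ts": 1700000000,  # Unix timestamp of the reading
    "temp_c": 21.4,    # temperature, degrees Celsius
    "rh_pct": 48.2,    # relative humidity, percent
}

# Compact JSON encoding, as a device might transmit it.
payload = json.dumps(record, separators=(",", ":")).encode("utf-8")
print(len(payload), "bytes per reading")

# At one reading per minute, a device's daily output is still tiny.
daily_bytes = len(payload) * 60 * 24
print(f"{daily_bytes / 1024:.1f} KiB per device per day")
```

Even at a reading every minute, each device produces well under 100 KiB a day – the sort of trickle that rarely justifies building dedicated infrastructure to protect it.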
There are two options for protecting your data in the cloud. You either buy software and run it on your own virtual machines or you get a software service provider to do it for you.
“Running your own software on your own property gives you problems you have to own,” says Preston. The most expensive problem is that this forces you to expertly guess the future – which is near impossible in any aspect of computing, let alone something as volatile as the IoT. Nobody can correctly predict their future needs for bandwidth, storage and processing power. If you could predict the future, you wouldn’t need all those measuring devices out there monitoring all the variables. Even backup and recovery are unpredictable.
If you buy a service to do all this, it will run on the SaaS provider’s systems so they take responsibility. The SaaS provider can optimise for the cloud and coax the best possible performance with totally predictable costs.
One problem with the cloud is that the pricing plans are often tricky, says Martino Corbeli, chief product officer at integration platform maker SpinR. “It’s a bit like going into a car showroom knowing what you want to get but being shifted into various pricing plans none of which quite fit.”
Despite that one area of uncertainty, Corbeli says we are definitely going into a cloud first world because the savings made – by not having to worry about the peaks and troughs of variable data – will compensate.
There is no one size fits all solution. In circumstances where there is a wide diversity of needs, the versatility of a service provider becomes their most attractive quality, because it means they can tailor their supply to match your demand.
So your best bet is to look for providers who can manage all three options – cloud, on-premises and hybrid – concludes Arm’s Mukkamala.