Top 10 Cloud myths busted, part 1

From TalentLMS by Nikos Andriotis

Here at Epignosis, the company behind TalentLMS, we view the Cloud as one of the pillars of modern computing in general and educational technology in particular, both now and for the foreseeable future. Until direct-to-brain interfaces become popular and training material is implanted directly into our memories, that is.

Cloud computing remains a mystery for many enterprise departments and businesses, and there are plenty of myths built up around deploying and running applications on it.

Let’s put on our myth-buster suits and bust those myths once and for all.

Myth #1: Everything works better in the cloud

Having a successful new technology perceived as a silver bullet is one of the oldest cultural problems in IT, and the Cloud has been no exception.

Truth is, the cloud might not be right for all of your IT infrastructure needs.

And even when it is, it might be a private cloud (which offers more integration options and raw performance) that fits your needs better, as opposed to the public cloud.

Depending on your particular use cases, the right overall solution is often a combination of the available options (public and private cloud, dedicated hosting, and even good, old-fashioned native applications).

Myth #2: The cloud is not secure

Security concerns are the main barrier to cloud adoption, and they really shouldn’t be.

The key insight regarding security is to understand that no system is ever 100% secure (just ask the CIA or NASA), and that’s why it’s important to assess the relative risk.

Are your local computers, internal networks, and company servers better protected than cloud-based assets? Does your IT team know how to properly install and secure any third-party server product you ask them to deploy?

In most cases the answer is no. And relying on “security through obscurity” is prone to fail and is actually discouraged by security advisors.

The major cloud providers invest more heavily in security (and data safety) than the average business, and can afford to have top notch administrators and security gurus on their team — people who know their Cloud offering inside out.

Myth #3: Cloud is expensive. Or cheap.

It is not always cheaper to run on the cloud, but it can often be more cost-efficient. As with all business expenses, it’s not just the month-to-month costs that matter but the “total cost of ownership” (TCO).

The up-front costs of a cloud migration can often be sizeable, but the savings over time typically offset the initial expenditure. Plus, switching to an operational expenditure model rather than a capital expenditure one can be beneficial for many businesses.
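To make the TCO point concrete, here is a minimal sketch of the comparison. All the figures below are hypothetical, invented purely for illustration — your own hardware, operations, migration, and subscription costs will differ:

```python
# Illustrative only: hypothetical numbers comparing the total cost of
# ownership (TCO) of on-premises infrastructure versus a cloud subscription.

def tco_on_prem(years: int, hardware_capex: float, annual_ops: float) -> float:
    """Up-front capital expenditure plus yearly operating costs."""
    return hardware_capex + annual_ops * years

def tco_cloud(years: int, migration_cost: float, monthly_fee: float) -> float:
    """One-off migration cost plus a recurring subscription."""
    return migration_cost + monthly_fee * 12 * years

# Hypothetical figures for a small deployment, over a five-year horizon:
on_prem = tco_on_prem(years=5, hardware_capex=60_000, annual_ops=15_000)
cloud = tco_cloud(years=5, migration_cost=20_000, monthly_fee=1_500)

print(on_prem)  # 135000 — CapEx-heavy up front
print(cloud)    # 110000 — higher recurring fees, lower total here
```

Note that in this made-up scenario the cloud’s monthly fees exceed the on-prem yearly operating rate early on, yet it still wins on TCO over five years — which is exactly why month-to-month cost alone is a misleading metric.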

Then there’s another important metric, well known to economists and business owners: opportunity cost.

Is building and managing your own infrastructure the best use of your company’s energy and budget? For most businesses that would be like building your own office furniture or making your own printing paper from tree pulp. It just doesn’t make sense.

Myth #4: There’s only one cloud

Words can be misleading in subtle ways. The fact that the word cloud (as in “the Cloud”) is singular doesn’t really help.

There are, in fact, countless clouds, both public and private.

Consumer cloud hosting providers do not store all their data in the same box.

As for the major players – Google, Facebook, Amazon, IBM, Apple, Microsoft, and so on – each runs its own cloud.

This is obviously good: it means that you have options, and that by adopting the Cloud you’re not tied to a particular vendor unless you choose to be (e.g. to take advantage of proprietary APIs and services they might offer).

But it also means you have to do your research, as not all clouds are created equal.

Myth #5: The cloud is a fad

Nothing could be further from the truth.

If anything, it was the isolated personal computer that proved to be a fad.

You see, the idea that computing should be organized like a public utility goes as far back as 1961.

And for the first few decades of commercial computing, computers were in fact just like that: central behemoths serving anywhere from tens to thousands of users through dumb terminals.

Then, in the eighties and early nineties, came the short era of non-connected personal computers whose only option was to run programs natively – a period that ended with the emergence of the commercial internet and the world wide web.

Today our browsers are once again those “dumb terminals”, whose intelligence comes from being connected to massive cloud services such as Google, Facebook, Wikipedia, YouTube, Flickr, and the like.

While native applications such as Word and Excel are all well and good (and even preferable for CPU-demanding tasks, such as video editing and games), there’s simply no going back to the pre-internet, pre-cloud era.

In fact, Gartner predicts that companies will spend $788 billion on public cloud services over the next four years, while McKinsey forecasts that cloud technology could have an economic impact of $1.7 to $6.2 trillion a year by 2025.

Those clouds aren’t dissipating any time soon.
