Many people would have you believe that modern-day Cloud Computing is not a new concept.
As Wikipedia points out:
“The underlying concept of cloud computing dates back to the 1950s, when large-scale mainframe computers became available in academia and corporations, accessible via thin clients / terminal computers, often referred to as ‘dumb terminals’, because they were used for communications but had no internal computational capacities.”
I get the following kinds of reaction at conferences whenever I talk about working in the Cloud and the benefits of Cloud Computing:
“Seen it all before”, “Nothing new”, “Fad”
But these reactions are missing the point. Things are very different now.
Historically, Cloud Computing was still effectively point-to-point computing. A mainframe provided a series of apps to a client terminal or computer. As a user you were limited to connecting with whatever apps were available via the mainframe or other computing service. The principal benefit was one of sharing resources.
Today the benefits of sharing or outsourcing computing resources are still considerable, but the real value comes from sharing and implementing new ideas.
Historically, companies relied solely on their IT team to develop apps that could service their business and operations. Most companies still have an IT team but the concept of an IT team delivering apps and information services is rapidly becoming archaic.
Companies today need to be leaner, faster, more flexible and prepared to adapt far more effectively than ever before. Business models are being broken down and reinvented. Traditional IT teams can’t cope with these massive upheavals.
Cloud Computing is the only salvation. The ability to select new providers, innovate new solutions, test, fail, improve and break new ground – these are the needs that Cloud Computing has embraced.
If you still think of Cloud Computing as time-sliced computing then you’re missing the point.
The game has changed, forever.