Friday, February 24, 2012

What the Heck is Cloud Computing? - reblog

It is fascinating to me how 21st Century ideas are such knock-offs of common 20th Century stuff. Take Cloud Computing. Its roots lie in the mainframe computer world of the 1950s and 60s. The ENIAC computer of John Mauchly and Presper Eckert at the University of Pennsylvania was a physically huge research tool, built to calculate artillery firing tables for the Army; its successor, UNIVAC, went on to tackle Bureau of the Census compilation challenges. It was centralized due to size, cost, and its uniqueness. As that technology developed, connections to it moved farther and farther from Philadelphia as remote users wanted to purchase “shares” of its CPU cycles and to store small quantities of data on its discs and tapes. Even though centralization was a necessity, extending computing capability out to remote users increased utilization, revenue, and the natural proliferation of new devices and innovation. Compared with the common research and commerce practices of the day, those machines offered tremendous computing capability and efficiency. Over time, greater access to that technology, capability, and efficiency stimulated demand and produced the technology revolution that followed through the end of the 20th Century.

The resulting developments produced minicomputers from companies like IBM, HP, and DEC. They were smaller, more portable, and less expensive to operate. Rather than handling large computational tasks, they were built to supplant manual accounting and transaction processing systems. The days of the coin box and ledger passed in businesses and offices. Later came the Apple/PC microcomputer introductions of the early 80s, when the computer arrived in the household. Interestingly, the micro PC was NOT connected to the outside world at first. Instead it became a household (or office) appliance that was mostly self-contained. Eventually, these small, inexpensive devices were connected first to each other and then to larger systems in the same fashion as the early mainframes and minis. The difference was that now the microcomputer itself could compute on its own or share complex tasks with the other remote machines to which it connected.
Then came the Internet in the 80s and the emergence of thousands of hubs that now house all varieties of mediated information. The Internet itself guaranteed the permanent connections among computing devices that we now take for granted.

But the Cloud. Come on! In my honest opinion, the term is such a silly marketing notion. That is because it really isn’t a new thing or even worthy of being branded. It is akin to branding the stars (the heavenly ones, so as not to confuse) as analogous to space travel. Of course there’s Star Trek to confuse the issue. But confusion reigns in the marketing world.

Rather, the Cloud has become synonymous with centralized enterprise utility computing of the Amazon, Google, Facebook, or Sony PlayStation flavor. We consumers can’t really see or appreciate it because the physical facilities where it happens – data centers and utility providers – are usually faceless buildings without windows where we are not invited or permitted access. Even the so-called public cloud facilities are accessible to us only via the Internet. They are often located in other countries, or even in multiple locations throughout the world.

So when you see the term Cloud Computing, it usually implies one of the following, for the uninitiated consumer:
A place to store your files or photos
A branded service to back-up your computer
A social network provider like Facebook or LinkedIn
A search utility like Google or Yahoo or Bing or Ask or Dogpile
Online applications like Google Apps or TurboTax

Thus the “Cloud” has become a convenient single word for marketers to describe what will eventually become a better-defined range of products that we consumers can identify and use – for a price, obviously – as time passes.

But aren’t most of them “free”? More on Internet revenue, and who actually pays, in an upcoming post.