Cloud computing. It’s a term that has become so pervasive that it’s easy to imagine it as the next logical, progressive step in computing. I, however, find myself agreeing with Richard Stallman more and more. Cloud computing is, perhaps, the least needed, least thought out and potentially most dangerous “improvement” in modern computing history. I’m also aware that I am in the minority among tech-savvy users when it comes to this position. With that in mind, I must acknowledge the potential benefits. Moving applications and data storage onto servers has its advantages. The operating system is no longer a barrier; a person no longer has to choose software based, in large part, upon their operating system. Data storage becomes more convenient as online storage solutions such as Amazon’s S3 service enable ordinary users to operate what amounts to their own, mostly hassle-free, web servers. Even seemingly innocuous services like web e-mail and hosted blogging illustrate the ability of “cloud computing” to make previously complicated services simple. In principle, anyone can run their own internet-connected file server, but only a few have the technical knowledge or the desire to do so successfully. So why don’t I like cloud computing?
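To give a sense of how low that barrier has become, here is a minimal sketch of serving a file through Amazon’s S3 using the boto3 library. The bucket and file names are hypothetical, and the snippet assumes AWS credentials are already configured:

    import boto3

    s3 = boto3.client("s3")  # credentials are read from the environment/AWS config

    # Upload a local file and make it publicly readable over HTTP.
    s3.upload_file(
        "vacation-photos.zip",      # local file (hypothetical)
        "my-example-bucket",        # bucket name (hypothetical)
        "vacation-photos.zip",      # key under which it is stored
        ExtraArgs={"ACL": "public-read"},
    )

    # The file is then reachable at a URL along the lines of:
    # https://my-example-bucket.s3.amazonaws.com/vacation-photos.zip

No web server software, no patching, no uptime worries for the user; that convenience is precisely the draw.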

There are three main reasons why I’m lukewarm on cloud computing. First, it requires the user to depend on a machine, run by someone else, whose only connection to him is the internet. Second, it means sending data to, and storing it on, a server you don’t control and whose security measures you cannot verify. Third, it has the potential to be less secure than traditional desktop computing.

Reliability

Reliability is an essential aspect of any system. A computer is only useful as long as it continues to function; a website is only useful as long as it stays up and has bandwidth and resources available. When you consider cloud computing against these simple requirements, it ought to seem obvious that computing in the cloud will be less reliable, all else being equal. A desktop application depends on a single workstation having enough resources available and, at a lower level, functioning hardware. A cloud application requires the same thing from a server, which is most likely being shared by multiple users, plus a functioning internet connection with sufficient bandwidth. Increasing the number of points of failure inevitably increases the chance that the service will be degraded or fail outright.
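A back-of-the-envelope calculation makes the point. The availability figures below are invented purely for illustration:

    # Each dependency multiplies in another chance of failure.
    workstation = 0.99          # desktop app: one point of failure
    connection  = 0.99          # cloud app adds the network...
    server      = 0.99          # ...and the remote server

    desktop_availability = workstation                          # 99.0%
    cloud_availability   = workstation * connection * server    # ~97.0%

    print(f"desktop: {desktop_availability:.1%}")
    print(f"cloud:   {cloud_availability:.1%}")

With three links in the chain instead of one, even optimistic per-component numbers leave the cloud path failing roughly three times as often.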

But what of hardware failures? Isn’t it true that a business will use better hardware and employ people who are more knowledgeable than the average consumer? Won’t these considerations tilt the scales toward the cloud? Simply? No. Hardware performance and quality have been increasing while cost has been decreasing, and this trend allows the average consumer’s desktop to be more capable than ever before. It is true that in the rare case of hardware failure a cloud provider will have built-in redundancy and people capable of fixing such problems, which prevents most breaks in service. Yet this is not a strong argument. Most consumers have multiple computers, a situation that will only become more common as hardware prices fall. That, coupled with a prudent back-up plan, would allow most consumers to avoid serious disruptions.
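For what it’s worth, a “prudent back-up plan” does not have to be elaborate. A minimal sketch, assuming a second disk mounted at a hypothetical path:

    import shutil
    from datetime import date
    from pathlib import Path

    source = Path.home() / "Documents"                          # what to protect
    backup = Path("/mnt/backup") / f"documents-{date.today()}"  # hypothetical mount

    shutil.copytree(source, backup)   # raises an error if the target already exists
    print(f"Backed up {source} to {backup}")

Run on a schedule, something this simple covers most consumers against the hardware failures the cloud is supposed to save them from.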

Control

I was originally planning to focus exclusively on the privacy implications of moving and manipulating data on a “foreign” server. The truth is that privacy seems to matter only to a select few people, myself included, and the consequences of cloud computing really come down to concerns over control. Cloud computing leads to a fundamental loss of control. Our data is stored on someone else’s servers, in someone else’s building. By doing work through a cloud application, the user places undeserved trust in the honesty of the application’s owner and its employees. In every case the user is put into a situation where he lacks actual physical control over his data.

Why does this matter? Data is malleable: it can be easily changed, and it can be easily copied. In this situation, someone else can more easily copy or modify your data and monitor you. It also opens up the further possibility, suggested by recent events, that you could even be locked out of access to your own data and applications. Some of these risks can be mitigated through the use of encryption. Encryption is no panacea, though. Most people choose weak keys or passwords, which makes the encryption much weaker in practice. Many application providers also offer secondary means of access, often in the form of “security questions”, which are usually even weaker than the key or password in use.
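One way to claw back some control is to encrypt data before it ever leaves your machine, so the provider only ever holds ciphertext. A minimal sketch using the third-party cryptography library; the key handling is deliberately simplified, and in practice a key derived from a weak password inherits that weakness:

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # must be kept safe locally, never stored in the cloud
    f = Fernet(key)

    ciphertext = f.encrypt(b"my private notes")   # upload this, not the plaintext
    plaintext  = f.decrypt(ciphertext)            # only possible with the key

Note, though, that this only protects stored data; anything the application must process in the clear is still exposed.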

Security

Cloud computing, currently, is potentially less secure than traditional desktop computing. Web applications are typically available at any time of day, on any day of the week. That means they are exposed to attempted exploits at any time, and the database and data behind an application are continually available to anyone who manages to gain access. Contrast that with a desktop application. A person who gains access to a desktop has access to the data on that specific machine; if the network the computer sits on happens to be unsecured, the attacker could reach other machines as well. At its worst, though, the damage is limited to specific instances: specific machines. In other words, it is easier to contain the damage caused by a flaw in a desktop application than by one in a cloud-based application. A cloud application is, then, a bigger target than any individual user would be. That, combined with the vulnerability modern web applications have repeatedly shown to attacks by malicious users, ought to inspire caution.