
Category Archives: Technology

Courtesy of /. and the BBC, and fresh from the No-Duh desk, we have news that Apple is recommending that Mac users run an anti-virus program. Gizmodo reported that Apple has been recommending this since last year. The irony is thick when you remember the switcher ad in which Apple said viruses weren't a problem on a Mac. But the real question is, why is anyone surprised? Neither Macs nor *nix are immune to viruses, Trojan horses, or the like. They never have been.

Sure, a lot of people have the idea that those systems are either immune or almost immune to viruses. But, ironically, most people involved in computer security would point out that there's no reason to believe that. In fact, it's worth pointing out that even the current scarcity of viruses and other hostile software for these systems does not prove that they are necessarily any more secure than Windows. They might well be more secure than Windows, but that doesn't eliminate the need for anti-virus software.

So, like I said: Macs need anti-virus? Duh. Well, maybe more like "should have"…

One of the nice things about the open source operating system GNU/Linux is the breadth of choice available to users, most of it at absolutely no cost. This allows a user to choose the distribution that best matches his tastes. But there is one flaw in this glut of choice: how's a beginner to choose a distro? Okay, let's say you limit the choices to the "major" distros, like Fedora, openSUSE, Ubuntu, et cetera. Even then, there's no easy way for a newbie to pick. I feel that if we can change this situation, we will make it easier for new users to adopt Linux as an operating system and, as a result, spread free and open source software.

The question then arises: which distribution should be the "go-to" distribution for new Linux users? Well, if you've read the title of this post, you'll have guessed already: the distribution should be Ubuntu. Now, in all fairness, I do use and like Ubuntu, but it isn't the distro I use most often; openSUSE and Fedora are battling for that prize. Rather, Ubuntu was the first Linux distro that I used.

With that in mind, here are three good reasons why all Linux users should support Ubuntu as the Linux distro for new Linux users.

1:   Ubuntu's stated goal has always been to make a Linux for ordinary people, and it has usually succeeded in making its distro easy for novices to pick up. For that reason, Ubuntu is already a good distro for new users.

2:   While a generic Wubi is being created to work with any Linux distribution, Ubuntu is, for now, still the only distro that can install itself easily onto a Windows system and, just as easily, remove itself. This reduces the upfront cost in time and knowledge needed to install Linux, so new users will be more likely to try Linux and will encounter fewer roadblocks along the way.

3:   While choice is wonderful, having one distro that every Linux user can point to as the distro for people new to Linux makes life easier for advocates. An advocate won't have to compare different distros or explain any complex ideas. They can simply hand over a CD, tell the new user to choose "install in Windows," and the rest will be self-explanatory. Hiding the details from new, usually non-technical, users makes the whole experience better front to back and makes it more likely that they will stick with Linux.

That being said, let me know what you think.   Have I gone crazy, or does this seem to be a net benefit for FOSS?

I’m a programmer, though I’m really only an amateur right now.   I’ve written programs in C++, C, and Pascal.   My first language was Pascal, Turbo Pascal specifically.    I love the act of programming, but when I attempt to explain what programming is like, I often find myself at a loss for words.   What does a person do when they code?   My best metaphor has always been that coding is like writing or creating music.   It’s an act of creation…an art.

What does that mean? Art is often thought of as creation ex nihilo, creating something out of nothing. But that's not quite true. A writer uses a known language, with a known grammar. When she writes, she writes with an eye toward her genre. She might borrow from its conventions or go against them, but few good writers ignore them. A musician will tend to pick a certain key and a certain scale. He doesn't have to, but the alternative, composing in the chromatic scale (using all possible notes), is often less pleasing and more difficult to work in. He will also compose with the conventions of his genre in mind. He could use, ignore, or even self-consciously twist those conventions in an attempt to make the statement he wants to make with his music. In each case the artist is remixing, for lack of a better word, conventions and limitations to express his or her own statement. They are using a library of pre-built words, expressions, biases, and beliefs to achieve their goal. My question is: understanding all of that, how could you view coding as anything other than art?

When you code, you choose your limitations. Your language determines what you are capable of expressing. Coding in C++ is always different, and always produces a different result, from coding in Lisp or even assembly. The language, like a scale, limits your options and, by doing so, enables the coder to accomplish certain things more easily. It is interesting to consider that when you finally start to program, you will usually be writing an application that has been implemented before. Yet your code, and your final product, will inevitably differ from those earlier implementations. Or, to put it another way, it isn't just a coincidence that programmers will write the same program, or even the same function, in different ways, even when they use the same language. When I code a program, I'm expressing my own personal beliefs and biases about how that program should work. It might be better or worse than someone else's implementation. That doesn't matter, because unless I am attempting to mimic someone else's coding style, I will always code how I believe a thing should be coded. A more Zen-like way to think of the problem is this: I can only code as I would code, or code as I think others would code. I can never code as a different person codes, because doing so would require me to be that person.
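
To make that concrete, here is a minimal, purely illustrative sketch in C++ (the function names and stylistic choices are mine, invented for the example): two functions that compute exactly the same thing, the sum of a list of integers, written the way two different programmers plausibly might write them.

#include <cstddef>
#include <numeric>
#include <vector>

// One programmer spells every step out with an explicit index loop.
int sum_explicit(const std::vector<int>& values) {
    int total = 0;
    for (std::size_t i = 0; i < values.size(); ++i) {
        total += values[i];
    }
    return total;
}

// Another expresses the same idea through the standard library,
// trading step-by-step detail for a single declarative call.
int sum_declarative(const std::vector<int>& values) {
    return std::accumulate(values.begin(), values.end(), 0);
}

Both are correct and produce the same answer; they differ only in what each author believed the "best way" looked like.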

Thus coding is not only a form of art, but a form of personal expression.

I know that seems funny. But even in the most staid task, the coder cannot escape the fact that HE is always the one coding, and that the code will reflect either his own beliefs or what he believes his boss's beliefs are about the best way to implement the program. In each case (excluding the one where the programmer is essentially copying someone else's code), the programmer is the filter through which the code passes, and the programmer is the "designer" or "creator" of the code.

Cloud computing. It's a term that has become so pervasive that it's easy to imagine it as the next logical, progressive step in computing. I, however, find myself agreeing with Richard Stallman more and more. Cloud computing is, perhaps, the least needed, least thought out and, potentially, most dangerous "improvement" in modern computing history. I'm also aware that I am in the minority among tech-savvy users on this position. With that in mind, I must acknowledge the potential benefits. Moving applications and data storage onto servers has its advantages. The operating system is no longer a barrier; a person wouldn't have to choose software based, in large part, upon their operating system. Data storage becomes more convenient as online storage solutions such as Amazon's S3 service enable ordinary users to operate what are essentially their own, mostly hassle-free, web servers. Even seemingly innocuous services like web e-mail and hosted blogging illustrate the ability of "cloud computing" to make previously complicated services simple. Anyone can run their own internet-connected file server, but only a few have the technical knowledge or desire to do so successfully. So why don't I like cloud computing?

There are three main reasons why I'm lukewarm on cloud computing. Cloud computing requires the user to depend upon a machine, run by someone else, whose only connection to him is the internet. Cloud computing means sending data to, and storing it on, a server that you don't control and whose security measures you cannot verify. Cloud computing also has the potential to be less secure than traditional desktop computing.

Reliability

Reliability is an essential aspect of any system. A computer is only useful as long as it continues to function. A website is only useful as long as it continues to run and has bandwidth and resources available. When you consider cloud computing against these simple requirements, it ought to be obvious that computing in the cloud will be less reliable, everything else being equal. A desktop application depends upon a single workstation having enough resources available and, at a lower level, functioning hardware. Cloud computing requires the same thing from a server, which is most likely being accessed by multiple users, plus a functioning internet connection with sufficient bandwidth. Increasing the number of points of failure inevitably increases the chance that the service will be degraded or fail outright.
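
A rough, back-of-the-envelope way to see this: if each independent component is up with some probability, the chance that the whole chain works is the product of those probabilities. The figures below are invented purely for illustration, not measured uptimes.

P(desktop app usable) = P(workstation up) = 0.99
P(cloud app usable) = P(server up) × P(connection up) = 0.99 × 0.99 ≈ 0.98

All else being equal, every extra link in the chain can only lower the odds.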

But what of hardware failures? Isn't it true that a business will use better hardware and employ people more knowledgeable than the average consumer? Won't these considerations tilt the scales toward the cloud? Simply? No. Hardware performance and quality have been increasing while cost has been decreasing, and that trend allows the average consumer's desktop to be more capable than ever before. It is true that in the rare case of hardware failure a cloud application will have built-in redundancy and people capable of fixing the problem, which prevents most breaks in service. Yet this is not a strong argument. Most consumers have multiple computers, and that will become increasingly common as the cost of hardware falls. That, coupled with a prudent backup plan, would allow most consumers to avoid serious disruptions.

Control

I was originally planning to focus exclusively on the privacy implications of moving and manipulating data on a "foreign" server. The truth is that privacy seems to be important only to a select few people, myself included, and the consequences of cloud computing really come down to concerns over control. Cloud computing leads to a fundamental loss of control. Our data is stored on someone else's servers, in someone else's building. By doing work through a cloud application, the user is placing unearned trust in the honesty of the application's owner and its employees. In every case the user is put into a situation where he lacks actual physical control over his data.

Why does this matter? Data is malleable. It can be easily changed, and it is easily copied. In this situation, someone else can more easily copy or modify your data and monitor you. It also opens up the further possibility, suggested by recent events, that you could be locked out of access to your own data and applications. Some of these risks can be mitigated through the use of encryption, but encryption is no panacea. Most people choose weak keys or passwords, which makes the encryption much weaker in practice. Many application providers also offer secondary means of access, often in the form of "security questions," which are usually even weaker than the key or password in use.
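
To put rough numbers on the weak-password problem (these are standard back-of-the-envelope estimates, not a claim about any particular service): a password drawn from a small character set carries far less entropy than the keys a modern cipher is designed around.

8-character lowercase password: 26^8 ≈ 2.1 × 10^11 possibilities ≈ 37.6 bits of entropy
128-bit symmetric key: 2^128 ≈ 3.4 × 10^38 possibilities

So data protected by 128-bit encryption, but unlocked by an eight-character lowercase password, is really only about 38 bits strong.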

Security

Cloud computing, as it stands, is potentially less secure than traditional desktop computing. Web applications are typically available at any time of day, any day of the week, which means they are exposed to attempted exploits at any time. The database and data behind the application are continually available to anyone who manages to gain access. Contrast that with a desktop application. A person who gains access to a desktop will have access to the data on that specific machine; if the network the computer is on happens to be unsecured, the attacker might reach other machines on it as well. But at its worst, the damage is limited to specific instances, specific machines. In other words, it is easier to limit the damage caused by penetration through a flaw in a desktop application than it is with a cloud-based application. A cloud application is, then, a bigger target than any individual user would be. That, combined with how vulnerable modern web applications have proven to be to attacks by malicious users, ought to inspire caution.

The end of the world is nigh, or so they say. Tomorrow marks the beginning of the Large Hadron Collider's career and, more specifically, its search for the oft-mentioned Higgs boson, a particle so stupendously amazing that some jerk called it the "god particle." The Large Hadron Collider (LHC) is a particle accelerator, the biggest yet created. It spans the border between France and Switzerland and features a circular tunnel with a circumference of 27 km, buried at depths ranging between 50 and 175 m. The LHC is set apart from other existing particle accelerators by its power. Scientific American puts the LHC into perspective by describing just how much of a, if you'll pardon me, quantum leap this accelerator is when compared with even the most powerful accelerator to date.

Outline of the Large Hadron Collider, via Flickr

“It starts by producing proton beams of far higher energies than ever before. Its nearly 7,000 magnets, chilled by liquid helium to less than two kelvins to make them superconducting, will steer and focus two beams of protons traveling within a millionth of a percent of the speed of light. Each proton will have about 7 TeV of energy—7,000 times as much energy as a proton at rest has embodied in its mass, courtesy of Einstein’s E = mc². That is about seven times the energy of the reigning record holder, the Tevatron collider at Fermi National Accelerator Laboratory in Batavia, Ill. Equally important, the machine is designed to produce beams with 40 times the intensity, or luminosity, of the Tevatron’s beams. When it is fully loaded and at maximum energy, all the circulating particles will carry energy roughly equal to the kinetic energy of about 900 cars traveling at 100 kilometers per hour, or enough to heat the water for nearly 2,000 liters of coffee.”
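
As a rough sanity check on that comparison, using assumed round numbers (a car of about 1,800 kg, water heated by about 80 °C, and 100 km/h ≈ 27.8 m/s), the two everyday equivalents really do land in the same ballpark:

900 cars: E ≈ 900 × ½ × 1800 kg × (27.8 m/s)² ≈ 6.2 × 10^8 J
2,000 liters of coffee: E ≈ 2000 kg × 4186 J/(kg·K) × 80 K ≈ 6.7 × 10^8 J

Both work out to roughly 600-700 megajoules, consistent with the article's description of the stored beam energy.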

But while the LHC is easily in a league of its own in terms of power, it is also notable for its delays, as well as the fear it has created in the minds of some people. A small group of people have been railing against the LHC, believing it will cause the end of the universe. The folks at CERN, the organization in charge of the LHC, along with the rest of the scientific community, have attempted to allay those fears. Unfortunately, as the LHC approaches its start, some researchers have found themselves the targets of death threats.

Considering all of the hype, and fear, produced by the LHC, it is only natural to ask what the potential payoff is. After all, it's all well and good to parrot lines about the Higgs boson, but what does it really mean? In the end, the LHC offers us a unique opportunity to understand how our universe works at the most fundamental level we know. The honest answer is that no one is certain what will be found once the LHC is up and running at full capacity. The clearest target is, of course, the Higgs boson, but other targets include gravity and even dark matter. Most important of all, however, is the effect that the LHC might have upon the Standard Model of particle physics.

ATLAS particle detector, via Flickr

Any of the above findings would be enormous for confirming more of the Standard Model, especially finding the Higgs particle. Yet one can't help but consider what the lack of such a discovery might mean. If the Higgs particle isn't found, there is the potential to overturn the Standard Model of particle physics. As Stephen Hawking pointed out, "I think it will be much more exciting if we don't find the Higgs. That will show something is wrong, and we need to think again." It is those moments when things fail that are the most exciting. Failure indicates a deeper truth to be found and more to learn.

No matter how exciting it is to contemplate the potential advances which may come from the Large Hadron Collider, it's always useful to remember that science is slow. Tomorrow's start-up marks only the very beginning. The LHC will merely be testing out its equipment and calibrating. Even if things go well, results most likely won't be out for a while, possibly several months if equipment must be repaired. It's only the beginning, but who can help being excited at the prospects ahead for CERN's Large Hadron Collider?

Oh, and just in case you're worried, you can always check to make sure you're all right.

Scientific American, “Large Hadron Collider: The Discovery Machine”

Physorg.com, "Hawking bets CERN mega-machine won't find God particle"

Large Hadron Collider, http://lhc.web.cern.ch/lhc/

Great news from the EFF today! They've succeeded in lifting the gag order on the group of MIT security researchers and their MBTA/CharlieCard vulnerability research. The students were originally scheduled to present their findings at the security conference DEFCON. However, the MBTA sued the students, claiming that their report, which would have excluded the details needed to pull off the attack, violated the Computer Fraud and Abuse Act (a law passed primarily to prevent fraud using computers). As part of the lawsuit, a gag order was placed on the students. While this is certainly great news for the MIT students, and security researchers in general, the lawsuit still stands.

The case was interesting in its choice of the CFAA as a tool to prevent academic researchers from exposing gaping security holes in public infrastructure. The MBTA claimed that the students were going to be aiding others in defrauding the MBTA. The only problem with the claim is that the students gave the MBTA advance notice and told them that they would withhold details in order to prevent people from easily exploiting the problems. In the end it's difficult not to wonder how much better this whole experience would have gone if the MBTA had embraced its responsibility, taken the vulnerability seriously, and worked with the students, rather than abusing the CFAA in order to protect their butts. With that in mind, I'm going to end with a short quotation from the EFF press release (link below).

“The students have already voluntarily provided a 30-page security analysis to the MBTA and have offered to meet with the MBTA and walk the transit agency through the security vulnerability and the students’ suggestions for improvement.

“The only thing keeping the students and the MBTA from working together cooperatively to resolve the fare payment card security issues is the lawsuit itself,” said EFF Senior Staff Attorney Kurt Opsahl. “The MBTA would be far better off focusing on improving the MBTA’s fare payment security instead of pursuing needless litigation.””

EFF: "Judge Lifts Unconstitutional Gag Order Against MIT Students"

It's rare for me to find a weapon whose very concept scares me. Straight from the fine folks at Gizmodo (via Technabob, via the UK Daily Mail) we have the WASP knife. The WASP knife is, apparently, a hunting knife, but unlike most other knives it hides a special little secret: a canister of compressed air, which sounds like an odd thing for a knife to have. That is, until you realize it injects the compressed air into the victim, at which point the air expands rapidly, displacing internal organs. If that's not scary enough, the expanding air also freezes the skin and organs near the entry point. Oh, and the best part is that it's just as effective underwater as on dry land, and the WASP site lists its price as a measly $379.95. WHAT A STEAL!


For those interested in seeing this knife’s potential utility against watermelon enemies, check out the video from the WASP knife website.  This knife has to be the best combination of wickedly awesome gadgetry and terrifyingly deadly weaponry.  As if getting stabbed wasn’t bad enough…

What does it take to get a PC with XP?

really?

REALLY?

Apparently so. PC World, via Slashdot, features an article purporting to advise consumers on how to purchase a new computer with an old operating system. This would be interesting… well, no, it wouldn't be interesting. There are two major flaws with the very premise of this article. First, simply looking at the websites of the major vendors reveals that some systems offer Windows XP as a choice. It's rare, and not every vendor offers it, but the option exists. Second, anyone who purchases a copy of Windows Vista has the ability to "roll back" the operating system to either Windows XP or Windows 2000. Special thanks to Paul Thurrott for pointing this out. As Thurrott notes, this has been a long-standing feature of Windows operating systems. When you consider this, several thoughts naturally arise. Did this writer do any research? You would think he would have mentioned the ability of any consumer to roll back to XP or 2000 if he had. Why even write an article about getting an out-of-date operating system on a new computer?

After considering such thoughts, one might be led to suspect that this is the tech journalist's version of a fluff piece. He calls up a couple of companies about getting XP on a new computer and, boom, he has an article. Let's take apart some quotes, shall we?

“I won’t waste time rehashing the argument over whether Windows Vista is any good. The fact remains that lots of people prefer Windows XP, and they’ll go to great lengths to get it.”

Perhaps a better way to write this would be: "I know there's no good reason to write an article about how to get an old operating system on a new computer, but I can justify it by harping on the 'popular' demand." In fact, it's so popular that, as Thurrott pointed out in the same article above, "BTW: 210,000 signatures represents about 0.03 percent (three tenths of one percent) of all XP users, assuming there are 700 million XP users worldwide. It could actually be higher." Oh, and while we're at it, since "lots of people" would like to get Linux on their new computer, perhaps I should await this guy's "Guide to Getting Gentoo on a New PC." NEXT QUOTE!

“To find out how difficult it is to get a new XP machine these days, I asked the nine largest PC vendors in the United States–Dell, HP, Gateway, Toshiba, Acer, Fujitsu, Lenovo, Sony, and Asus–about the specifics of their downgrade policies. Then, to see how closely the official story synced up with the reality in the marketplace, I called sales representatives for each company and asked them whether I could purchase a new laptop equipped with XP from them.”

Investigative journalism at its best, people! So he called a bunch of corporate headquarters, then he called sales reps and pretended to order a PC. Too bad he didn't, say, call Microsoft. He could have saved eighteen calls with one, since Microsoft readily explains that its operating system includes this rollback option.

“The verdict?”

Guilty by reason of inanity?

“Downgrade policies are all over the map, and more than a few rank-and-file sales reps have a sketchy understanding of those policies.”

So let me get this straight. You called corporate headquarters, then the sales lines, and found out that employees don't always strictly follow a company's official policies and procedures. Wow. Just. I'm speechless. So what? You've described the situation at every single large corporation.

The writer proceeds to explain what he encountered, but we can stop right here, because all of this is meaningless: you can get Vista, XP, or 2000 simply by buying a copy of Vista or getting a copy of Vista with your new computer. Also, Vista is a more secure, more stable operating system, with more conveniences and features. If you are buying a new computer, please don't try to buy a system with XP, especially if you're doing so because of all the "problems" you've heard about. Vista had, repeat had, a problem with drivers, a problem caused by hardware vendors. There is no driver problem now. If you buy a new computer it will run Vista just fine; just make sure you have around 2 GB of RAM and at least 128-256 MB of video RAM to ensure you can use the Aero Glass feature and multitask with no slowdown.

For those not aware of it, yesterday was "Download Day" for the newest version of Firefox, Firefox 3.0. The goal of Download Day was to break the Guinness World Record for most downloads in a single day. While I can't find the official number, the attempt is still ongoing at this moment; there have currently been over 7,137,000 downloads of Firefox 3. If you haven't tried Firefox, now's your chance, and if you haven't downloaded the new version already, then get to it!

A little bit of a news round-up is in order. Two major items caught my attention during my perusal of the various feeds. First up, Apple's Worldwide Developers Conference started this Monday and featured CEO Steve Jobs' keynote speech. As per usual with Apple conferences, the internet was abuzz with rumors and hype surrounding the potential new products that might be unveiled. Despite my complete lack of a Mac, and perhaps due to my current interest in a MacBook Pro, I sat in on the Gizmodo and Engadget live coverage of the keynote. What ensued can only be described as a string of completely uninteresting revelations. There's going to be a new version of OS X eventually? Well, duh… And it's going to be called Snow Leopard? Okay. Look! iPhone games, medical apps, et cetera. Perhaps the strangest app of all was an eBay app (are you too impatient to just log in to the web-based eBay?). MobileMe (.Mac 2.0) seemed vaguely interesting, though it would make more sense to wait and see what free cloud-based apps show up to compete with it. I've personally voiced my own apathy toward cloud-based solutions, but I must admit I can see benefits to some of the apps shown off in MobileMe… though I can't help but ask whether Google couldn't do the same thing minus the cost. Finally, there was the big announcement of the day: a 3G (HSDPA-based) iPhone. I'm sure this, along with the announced price cut for the iPhone, excited quite a few people. But I must ask… so what? The iPhone is still locked to AT&T, which requires any non-AT&T customer to pay several hundred dollars just to get out of their old contract. Along the same lines, why would you buy an iPhone if you could get, for example, a new BlackBerry Bold, HTC Touch Pro, or any number of other smartphones without having to switch providers, especially if your current provider gives you good reception? In the end, the WWDC keynote fell flat. I think Paul Thurrott put it best when he described the speech as "Microsoftesque."

Going from the boring to the inane, we have the "Green Our Vaccines" rally. One of the organizers, the strangely named TACA, or Talk About Curing Autism (haven't we been doing that?), features a recap of the rally, which took place June 4th in Washington, D.C. The anti-vaccine, or anti-toxin, movement is certainly nothing new, but it has been aided by an influx of celebrity support. The rally itself featured such notables as Jim Carrey, Robert Kennedy Jr., and Jenny McCarthy. The movement has traditionally consisted of attacks on vaccine ingredients, such as thiomersal, a mercury-containing preservative, which they claim caused autism in their children. All of this sounds very scary, especially when you mention chemicals like mercury being in vaccines. The cold blast of reality is that there is no evidence that these chemicals cause autism.

In general, the movement tends to describe itself as being against any "toxins" in vaccines. I would ask, facetiously, who is FOR placing toxins in vaccines? That point aside, I imagine these people are well-meaning but mistaken in their beliefs. What's more troubling is that the movement seems more interested in eliminating vaccines in general than in removing specific non-essential ingredients.

David Gorski of the Science-Based Medicine blog has a great write-up about the event, the "controversy," and the truth about vaccine safety. What's interesting about this rally is how it shows the evolution away from the failing argument that mercury causes autism toward the more general anti-toxins argument. I don't think this is all that surprising. The anti-toxins argument is used throughout the pseudosciences; the claim is often that we're ingesting too many toxins and need to get rid of them in order to allow our bodies to heal naturally. As if that weren't bad enough, in some cases they're combining the anti-toxin message with a "green" or all-natural message. Consider a quote from a mother of two autistic children, featured in an article found via the Science-Based Medicine blog: "But Mason, who has two autistic children, warns that autism is on the rise, and that something has to change. 'ideally the legislators would enact legislation that would force companies to use natural ingredients', she argues. 'Not what they're using now.'" What is this ideal world she is positing? While I obviously feel for her and understand that she's sincere in her beliefs and desires, I can't help but point out that an ingredient being natural doesn't make it better. After all, mercury and lead are natural, and they are toxic to humans.

In the end it's hard not to feel for the troubles which many of these mothers have faced, including the heart-rending feeling that they might have been directly responsible for their child's autism. However, we shouldn't ignore that there are people trying to eliminate childhood vaccines, which save many lives from disease, with a campaign thoroughly lacking in real scientific evidence. The best scientific argument they had was that mercury caused autism, and that claim has been investigated and shown to lack any supporting evidence. After that, all that is left is faux green, faux consumer choice, and faux science.

Image borrowed from the Science-Based Medicine blog

On a lighter note, I'd like to submit Jim Carrey's analogy. When I read this, it simply floored me. I have a degree in literature and, really, this is probably one of the best analogies ever. "If on the way to a burning building a fire engine ran people over, we wouldn't stop using fire engines. We would just ask them to slow down a bit. Well it's time to tell the CDC and the AAP that it's time to slow the fire engine down. People are getting hurt on the way to the fire." Truer words have ne'er been spoke. Oh. And I'm being sarcastic.