TenFourFox taps disused resources of your aging PowerPC

Many of us geeks, especially those of us in the IT world, replace our hardware almost as often as fashionistas change wardrobes. For some reason, I have a thing for old hardware. While I certainly don’t use it on a daily basis, I have a hard time parting with my favorite gear. I see it as something akin to automotive enthusiasts who keep a prized vintage car in the garage. They won’t drive it to work every day, but that doesn’t mean they don’t love taking it for a spin on the weekends.

I have a similar relationship with some of my favorite machines. I still use a 15″ PowerBook G4 at home on a fairly regular basis (in fact, I’m typing this post on it right now). At its heart is a 1.67GHz PowerPC G4 processor, the last and fastest PPC Apple ever delivered in a laptop. By today’s standards, this single-core machine barely stays ahead of most netbooks. Its battery is dead, and it gets fairly hot when you push it, but I find it to be perfectly adequate for most things.

Apple’s support of the PPC architecture ended two OS revisions ago (Leopard, or 10.5, was the last release to run on it), and even the Mozilla Foundation dropped support from their Firefox browser after the 3.6.x series (and even that will stop receiving updates soon…). Needless to say, I was pleasantly surprised to find a project known as TenFourFox not only recompiling the latest Firefox from source, but actually making PPC-specific improvements as well! TenFourFox is basically Firefox, but separately optimized for the PowerPC G3, two flavors of G4, and the G5 CPU. That would be enough in itself, but the developers didn’t stop there. They’ve swapped out the old just-in-time JavaScript compiler for their recently completed nanojit for PPC, which beats even Apple’s native Safari browser (until recently the only browser with anything close to decent performance on this hardware). This is all possible because there are still people in the Open Source community interested in this platform who know how to leverage resources like AltiVec. Kudos to the TenFourFox team for keeping this aging platform relevant!

Computer security and the human factor

One of the most important things to remember about security is that it is a process, not a product.  All too often, people think of security as a specific problem with a specific solution.  Unfortunately for us, security (in almost any context) is a moving target.  What was a strong password yesterday is weak today.  There are good reasons to require users to change their passwords, but that’s only part of the solution.  Forcing users to change their passwords too often invariably results in weaker passwords that are easier to remember.

To understand why this is a problem, we first need to define what makes a password strong or weak.  Numeric passwords are the worst, as there are only 10^N possibilities, where N is the number of characters.  Using letters is a little better, as that’s 26^N for English.  Using both upper and lower case gives you 52^N, while a full alphanumeric set gives you 62^N.  Use of a full alphanumeric character set on passwords of 8 characters or more was considered strong enough until recently.  After all, 218 trillion possibilities seems pretty large, doesn’t it?  Considering some of today’s high-end graphics cards can perform over 2 trillion floating point operations per second, breaking even a completely random 8-character alphanumeric password is trivial.  Adding the full set of special characters on most English keyboards yields an additional 30 characters, for 92^N, or 5 quadrillion possibilities for an 8-character password.
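For the curious, here’s a quick back-of-the-envelope sketch in Python.  The character counts are my own rough assumptions for a typical US English keyboard, not any formal standard, but the output lines up with the 218 trillion and 5 quadrillion figures above.

    # Rough keyspace sizes for randomly chosen passwords of length N.
    # Character counts are assumptions for a typical US English keyboard.
    CHARSETS = {
        "digits only": 10,               # 0-9
        "lowercase letters": 26,         # a-z
        "upper + lower case": 52,
        "full alphanumeric": 62,
        "alphanumeric + ~30 specials": 92,
    }

    for name, size in CHARSETS.items():
        for length in (8, 12):
            print(f"{name}, {length} chars: {size ** length:,} possibilities")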

You’re probably noticing a problem; we’re out of usable characters.  From this point on, the only way to increase password security is to make passwords longer and longer.  Obviously we can only keep this up for so long before technology overtakes our ability to remember a secure password.  Clearly, simple password-based security is insufficient for protecting anything of real value.  What’s needed is a multifactor system that uses two or more separate components to authenticate a user’s credentials.
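To put rough numbers on that, here’s another small sketch.  The guess rate is an arbitrary round figure I picked purely for illustration (real cracking rigs vary wildly), and on average an attacker only has to search half the keyspace before hitting the right password.

    # How long a brute-force search takes at an assumed rate of one
    # billion guesses per second (an illustrative figure, not a benchmark).
    GUESSES_PER_SECOND = 1_000_000_000
    SECONDS_PER_YEAR = 60 * 60 * 24 * 365

    for length in (8, 10, 12, 16):
        keyspace = 92 ** length                      # full 92-character set
        avg_seconds = keyspace / 2 / GUESSES_PER_SECOND
        print(f"{length} chars: ~{avg_seconds / SECONDS_PER_YEAR:,.2f} years on average")

At that assumed rate, an 8-character password falls in about a month on average, while a 12-character one holds out for millions of years, right up until the guess rate climbs again and the only fix is yet more length.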

A multifactor authentication system could be as simple as the combination of a password and a physical token such as a smartcard.  To authenticate, the user must insert the smartcard and type his or her password.  Either factor by itself will be rejected.  The beauty of this system is that any data protected in this way is inaccessible without each piece of the authentication puzzle.
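In code, the basic shape could look something like the minimal sketch below.  Everything here is hypothetical and glosses over a lot: a real deployment would use a salted, deliberately slow password hash and a proper challenge-response exchange with the card’s hardware-protected key, not a shared secret sitting in a variable.

    import hashlib
    import hmac

    def verify_login(password: str, stored_password_hash: bytes,
                     card_secret: bytes, challenge: bytes, card_response: bytes) -> bool:
        # Factor 1: something you know -- the password must hash to the stored value.
        # (Real systems should use a salted, slow hash such as bcrypt, not bare SHA-256.)
        password_ok = hmac.compare_digest(
            hashlib.sha256(password.encode()).digest(), stored_password_hash)

        # Factor 2: something you have -- the card must answer the server's challenge
        # with an HMAC that only its secret could have produced.
        expected = hmac.new(card_secret, challenge, hashlib.sha256).digest()
        card_ok = hmac.compare_digest(expected, card_response)

        # Either factor alone is rejected; both must check out.
        return password_ok and card_ok

The point isn’t the specific crypto; it’s that stealing the password alone, or the card alone, gets an attacker nothing.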

A system is only as secure as its weakest link.  In many cases we humans are unfortunately that link.  We have limited memory and are vulnerable to social engineering attacks that get us to reveal sensitive information to complete strangers.  A strong password is useless if a user gives that password away or writes it down.  Adding a unique physical component to the equation raises the level of difficulty for an attacker significantly.

Apple’s just the latest punching bag…

It seems everywhere I go online, there’s another person spouting off on what’s now been dubbed Antennagate.  Most of the commentary is uninformed drivel regurgitated from the myriad sloppy reports circulating the net.  Some of it comes from the usual malcontents looking for something to gripe about, while the rest is just people having a good time making fun of what they see as a big corporate snafu.

All of the fuss has centered on what is commonly referred to as the “death grip,” where a user covers a large portion of the phone while gripping it tightly.  The weak spot, in the case of the iPhone 4, is a gap between the phone’s two antennas.  What’s known is that the conductivity of the average human hand is enough to affect reception when this gap is bridged (especially when that hand is moist).  Brian Klug and Anand Lal Shimpi of AnandTech.com explained it best in an article published on June 30th.  That was a full two weeks before Bloomberg’s July 15th article that claimed Apple was warned about the antenna design by senior engineer Ruben Caballero.  Apple CEO Steve Jobs called the Bloomberg article “a crock” and “total [BS].”  (Caballero has not come forward publicly to either confirm or deny the accusations.)

The article on AnandTech.com clearly backs up several claims made by Apple at their recent press conference (July 16th):

  • The iPhone 4 isn’t perfect
    • Signal can be affected by bridging the gap (worst case -24dB; see the note below for what that works out to)
  • Other phones experience the same kind of signal attenuation when gripped in certain ways
  • The worst-case signal loss doesn’t occur in normal use, even in poor coverage areas, and the exceptions are solved by using a case
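For a sense of scale, a -24dB drop works out to a factor of 10^(24/10), roughly 250, meaning the received signal power falls to about 1/250th of its original level.  That sounds dramatic, but remember it’s the contrived worst case, not what happens when you simply hold the phone.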

So what does all this mean?  Yes, Apple made a design decision that affected the performance of the iPhone 4 antenna.  Does it matter? No.  The new design is more sensitive overall and generally makes up for the possibility of attenuation.  I’ve confirmed Apple’s claims myself.  When you’re in an area with good coverage, it’s nearly impossible to disrupt the signal.  It’s only in areas of strong interference or poor coverage that the so-called death grip has any effect (the same goes for the single-finger bridging technique).  There is a problem here, but it’s not as big or important as some in the media want it to be.  Nothing to see here folks… move along.