Day 659: Rules Of Technological Thumb

originally published October 20, 2013

It’s a good rule of thumb to double-check your facts before accepting any rule of thumb. Even the term itself shelters its own little deceit. Like a resilient pimple that puffs out its bulbous little pus-nipple every so often, my email box inevitably oozes little “Didja Know” trivia messages from someone, quite frequently pointing out that ‘rule of thumb’ used to refer to the permissible width of a stick with which a man could beat his wife, according to the law.

Please, oh please allow us to smush the cigarette butt of truth upon this cold fried egg of an urban legend. “Rule of thumb” is most likely an old woodworking term; wife-beating has never been legal in the United States. “Correction” of a wife was allowed in England a long time ago, but no law book ever issued the thumb-width guideline.

This is why any rule of thumb needs to be held up to the light of scrutiny, poked with a stick and examined for more than its value as a catchphrase. Some simply become outdated or obsolete, or need to be modified with the times. Take, for example, the case of Moore’s Law.

Established by Intel co-founder Gordon E. Moore, this law observes that the number of transistors squished onto integrated circuits tends to double roughly every two years. This pattern could be seen when the 486 chip gave way to the Pentium, then when the Pentium evolved into the Pentium II, and so on. Moore first made the observation in 1965 (predicting a doubling every year, then revising it to every two years in 1975), and it has been proven correct every step of the way.

Until this year, anyway. Growth is finally slowing down, and in 2013 it’s expected that we’ll fail to hit the double-every-two-years mark for the first time in nearly a half-century. We’re still expected to double our processing power every three years, but it seems the rapid explosion of chip technology has tapered off a little. Good news for those of us who want our computer to feel a little less outdated after having owned it for six months.
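If you like seeing the arithmetic spelled out, here’s a quick sketch of how much that shift matters. The starting transistor count and the twelve-year window are numbers I’ve invented purely for illustration:

    # Back-of-the-envelope Moore's Law arithmetic.
    # The starting count and time span are made-up illustrative numbers.
    start_transistors = 1_000_000_000  # a made-up billion-transistor chip
    years = 12

    for doubling_period in (2, 3):  # the classic two-year pace vs. the newer three-year pace
        doublings = years / doubling_period
        final = start_transistors * 2 ** doublings
        print(f"Doubling every {doubling_period} years: "
              f"{final / start_transistors:.0f}x the transistors after {years} years")

Run it and the old pace gives you 64 times the transistors after twelve years, while the new pace gives you only 16 times. Still absurd, just slightly less so.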

In case you haven’t noticed – and I’m willing to bet a life-size paper gorilla stuffed with Tootsie Pops you haven’t – light-emitting diodes, or LEDs, have been following a similar trend. Haitz’s Law, named for scientist Roland Haitz, claims that every decade, the cost per lumen (that’s how much light the things spew unto the world) drops by a factor of ten, while the light generated per LED package rises by a factor of twenty. So while the numbers are a touch different, LEDs are roughly conforming to a Moore’s Law-ish principle.
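To put some very rough numbers on that – and these starting figures are placeholders I made up, not real industry data – the compounding looks like this:

    # Haitz's Law, roughly: per decade, cost per lumen drops 10x while
    # light output per LED package climbs 20x. Starting values are invented.
    cost_per_lumen = 1.0        # hypothetical dollars per lumen today
    lumens_per_package = 100    # hypothetical lumens per LED package today

    for decade in range(1, 4):
        cost_per_lumen /= 10
        lumens_per_package *= 20
        print(f"After {decade * 10} years: ${cost_per_lumen:.4f} per lumen, "
              f"{lumens_per_package:,} lumens per package")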

Except that we don’t really need LEDs to become that much more powerful, at least not in our daily lives. We’ll always want faster computers and greater performance from our graphics cards, but current LED technology can produce great light bulbs for home and commercial environments. Haitz’s Law has served the industry well, but it’s going to run its course once we can slap an LED light into every lighthouse down the coast.

Data streaming also has a place in this discussion, and a rule of thumb has been slapped into the books to explain it. Edholm’s Law (named for Nortel chief technology officer Phil Edholm) describes three types of data flow: wireless (that’d be our cellular data), nomadic (this is wi-fi; wireless but not fully portable) and wireline (plugged in and Ethernetted up). The more mobile you are, the slower your data will flow, of course, but Edholm’s Law points out that all three technologies have been moving in near lockstep, getting faster and faster every year.

Still, even as all three speeds climb ever upward, they aren’t running precisely parallel. In 2004, when this law was first proposed, it was believed there would come a time when all three speeds became equal. 2030 was the target date, though it’s hard to predict advances in technology that could royally screw with this theory. Even now, I’m looking at a nine-year-old article and a Wikipedia page that hasn’t seen anything but incidental updates since 2007. I couldn’t unearth any sources online that could tell me if Mr. Edholm’s theory holds any water by 2013 standards. This one is a solid maybe.
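Still, the convergence idea itself is easy enough to sketch. Using completely invented starting speeds and growth rates – not Edholm’s actual figures – a slower but faster-growing wireless curve will eventually catch the wireline one:

    import math

    # A toy illustration of Edholm's convergence idea. Every number here is
    # invented: wireless starts slower but grows faster, so the curves meet.
    wireless_2004 = 1.0       # Mbit/s, made up
    wireline_2004 = 100.0     # Mbit/s, made up
    wireless_growth = 1.8     # per-year multiplier, made up
    wireline_growth = 1.5     # per-year multiplier, made up

    years_to_converge = (math.log(wireline_2004 / wireless_2004)
                         / math.log(wireless_growth / wireline_growth))
    print(f"Toy crossover year: roughly {2004 + years_to_converge:.0f}")

With those made-up growth rates the crossover lands around 2029, which at least shows how a 2030-ish prediction could fall out of the math.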

Enter Niklaus Wirth, a computer scientist from Switzerland whose eponymous law is meant to take a touch of wind out of our sails of technological boastfulness. Yes, our computers are so unrelentingly mighty they could smite the WOPR computer from the 1983 film WarGames with nary a flick of a diode. But even in 1995, Wirth didn’t think this was enough of a game-changer. As swiftly as our processors can process their processes, our software is getting bigger and bulkier at a faster rate.

Think about it. Does opening Microsoft Word on your Windows 8 laptop take that much less time than it did on your Windows 98 tower? Wirth’s Law – specifically that software gets slower faster than hardware gets faster – is depressingly accurate. Sure, we no longer have to sit and watch photos of Yasmine Bleeth download one painstaking row of pixels at a time, but is our productivity really that much faster?

The law has been restated as Gates’ Law (ironically named for Bill Gates), which holds that the speed of software halves every 18 months, and as Page’s Law, after Larry Page, co-founder of Google. This one ain’t going away.
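For the morbidly curious, the depressing arithmetic behind that restatement looks something like this. I’m assuming hardware doubles every 18 months purely so it lines up with Gates’ 18-month figure; the real rates wobble around:

    # A tongue-in-cheek sketch of Gates' Law cancelling out Moore's Law.
    # Assumption: hardware speed doubles every 18 months while software
    # speed halves every 18 months (the Gates' Law figure quoted above).
    hardware_speed = 1.0
    software_efficiency = 1.0

    for period in range(1, 7):            # six 18-month periods, i.e. nine years
        hardware_speed *= 2               # chips get twice as fast
        software_efficiency /= 2          # software gets twice as slow
        perceived = hardware_speed * software_efficiency
        print(f"After {period * 18} months: perceived speed = {perceived:.1f}x")

Every line prints 1.0x. Nine years of progress, and the hourglass cursor spins just as long as it ever did.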

There are other laws that pepper the technological landscape, explaining why our toys have come so far in such a short time. Barry Hendy of Kodak Australia came up with the Pixels-per-Dollar measure, showing how the cost of pixels in a digital camera has steadily declined since the technology hit the market. Kryder’s Law is like Moore’s Law, except it talks about the cost of hard disk storage per unit of information. This one hasn’t slowed down though – it has actually sped up due to some snazzy breakthroughs in hard drive technology over the past few years.
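Kryder’s curve follows the same sort of exponential sketch. The starting price and halving period below are invented for illustration, not Kryder’s published figures:

    # Kryder's Law in the same spirit: cost per gigabyte of disk falls
    # exponentially. Starting price and halving period are made up.
    cost_per_gb = 1.00            # hypothetical dollars per gigabyte today
    halving_period_years = 1.5    # hypothetical halving period

    for years in (3, 6, 9):
        projected = cost_per_gb / 2 ** (years / halving_period_years)
        print(f"In {years} years: about ${projected:.3f} per gigabyte")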

Moore himself even came up with a second law, indicating (correctly) that the costs of research, development and testing have risen steadily alongside the increase in processing power. This seems like a natural consequence, I suppose. But then what do I know?

I’ll say this much – Moore’s Law, while it has slowed down considerably, may come to an end. This is where we get into the notion of the Singularity… that moment when technology surpasses the capabilities of the human mind and it can improve upon itself faster than we can improve it.

Then we’re all screwed.
