Category Archives: Technology

Murphy’s Computer Law

A long time ago, my family took a trip to Expo ’86 in Vancouver, with stop-offs in San Francisco and Los Angeles. In LA, we went on the Universal Studios tour, something which I basically have no memory of. I did get a memento, though: a poster entitled “Murphy’s Computer Law” with a bunch of humorous computing “laws” on it. This poster went up in my room, accompanied me to college, and has been in most of my offices at Microsoft. However, a few years ago, a corner ripped off in a move. Then, while it was sitting around waiting to be repaired, it got a bit stained. And then I realized just how dated and ratty the thing looked. So I figured it’s time to retire it. However, I would like to hang on to the “laws,” since some of them are still quite pertinent, even if some are quite outdated. So here they are, on my “permanent record”:

Murphy’s Computer Law:

  1. Murphy never would have used one.
  2. Murphy would have loved them.

Bove’s Theorem: The remaining work to finish in order to reach your goal increases as the deadline approaches.

Brooks’ Law: Adding manpower to a late software project makes it later.

Canada Bill Jones’ Motto: It’s morally wrong to allow naïve end users to keep their money.

Cann’s Axiom: When all else fails, read the instructions.

Clarke’s Third Law: Any sufficiently advanced technology is indistinguishable from magic.

Deadline-Dan’s Demo Demonstration: The higher the “higher-ups” are who’ve come to see your demo, the lower your chances are of giving a successful one.

Deadline-Dan’s Demon: Every task takes twice as long as you think it will take. If you double the time you think it will take, it will actually take four times as long.

Demian’s Observation: There is always one item on the screen menu that is mislabeled and should read “ABANDON HOPE ALL YE WHO ENTER HERE.”

Dr. Caligari’s Come-back: A bad sector disk error occurs only after you’ve done several hours of work without performing a backup.

Estridge’s Law: No matter how large and standardized the marketplace is, IBM can redefine it. [ed: later “Microsoft,” now “Apple,” I guess]

Finagle’s Rules:

  1. To study an application best, understand it thoroughly before you start.
  2. Always keep a record of data. It indicates you’ve been working.
  3. Always draw your curves, then plot the reading.
  4. In case of doubt, make it sound convincing.
  5. Program results should always be reproducible. They should all fail in the same way.
  6. Do not believe in miracles. Rely on them.

Franklin’s Rule: Blessed is the end user who expects nothing, for he/she will not be disappointed.

Gilb’s Laws of Unreliability:

  1. At the source of every error which is blamed on the computer you will find at least two human errors, including the error of blaming it on the computer.
  2. Any system which depends on human reliability is unreliable.
  3. Undetectable errors are infinite in variety, in contrast to detectable errors, which by definition are limited.
  4. Investment in reliability will increase until it exceeds the probable cost of errors, or until someone insists on getting some useful work done.

Gummidge’s Law: The amount of expertise varies in inverse proportion to the number of statements understood by the general public.

Harp’s Corollary to Estridge’s Law: Your “IBM PC-compatible” computer grows more incompatible with every passing moment.

Heller’s Law: The first myth of management is that it exists.

Hinds’ Law of Computer Programming:

  1. Any given program, when running, is obsolete.
  2. If a program is useful, it will have to be changed.
  3. If a program is useless, it will have to be documented.
  4. Any given program will expand to fill all available memory.
  5. The value of a program is proportional to the weight of its output.
  6. Program complexity grows until it exceeds the capability of the programmer who must maintain it.
  7. Make it possible for programmers to write programs in English, and you will find that programmers cannot write English.

Hoare’s Law of Large Programs: Inside every large program is a small program struggling to get out.

The Last One’s Law of Program Generators: A program generator creates programs that are more “buggy” than the program generator.

Meskimen’s Law: There’s never time to do it right, but always time to do it over.

Murphy’s Fourth Law: If there is a possibility of several things going wrong, the one that will cause the most damage will be the one to go wrong.

Murphy’s Law of Thermodynamics: Things get worse under pressure.

Ninety-Ninety Rule of Project Schedules: The first ninety percent of the task takes ninety percent of the time, and the last ten percent takes the other ninety percent. [ed: words to live by]

Nixon’s Theorem: The man who can smile when things go wrong has thought of someone he can blame it on.

Nolan’s Placebo: An ounce of image is worth a pound of performance.

Osborn’s Law: Variables won’t, constants aren’t.

O’Toole’s Commentary on Murphy’s Law: Murphy was an optimist.

Peer’s Law: The solution to a problem changes the problem.

Rhode’s Corollary to Hoare’s Law: Inside every complex and unworkable program is a useful routine struggling to be free.

Robert E. Lee’s Truce: Judgment comes from experience; experience comes from poor judgment.

Sattinger’s Law: It works better if you plug it in.

Shaw’s Principle: Build a system that even a fool can use, and only a fool will want to use it. [ed: also known as “Bob’s Law”]

SNAFU Equations:

  1. Given any problem containing N equations, there will be N+1 unknowns.
  2. An object or bit of information most needed will be least available.
  3. Any device requiring service or adjustment will be least accessible.
  4. Interchangeable devices won’t.
  5. In any human endeavor, once you have exhausted all possibilities and failed, there will be one solution, simple and obvious, highly visible to everyone else.
  6. Badness comes in waves.

Thoreau’s Theories of Adaptation:

  1. After months of training, when you finally understand all of a program’s commands, a revised version of the program arrives with an all-new command structure. [ed: also known as the “Office Principle”]
  2. After designing a useful routine that gets around a familiar “bug” in the system, the system is revised, the “bug” is taken away, and you’re left with a useless routine.
  3. Efforts in improving a program’s “user friendliness” invariably lead to work in improving users’ “computer literacy.”
  4. That’s not a “bug”, that’s a feature!

Weinberg’s Corollary: An expert is a person who avoids the small errors while sweeping on to the grand fallacy.

Weinberg’s Law: If builders built buildings the way programmers write programs, then the first woodpecker that came along would destroy civilization.

Zymurgy’s First Law of Evolving System Dynamics: Once you open a can of worms, the only way to recan them is to use a larger can.

Wood’s Axiom: As soon as a still-to-be-finished computer task becomes a life-or-death situation, the power fails.

The Five Levels of Incompetence

In my “Learning and Teaching” post last week, I talked about the different stages of learning, from “unconscious incompetence” up to “unconscious competence.” It occurred to me today, though, that there really are different levels within those levels and, in particular, there are some very distinct levels of incompetence that I’ve encountered in my nearly (yikes!) two decades of working in the industry. The reason why the levels of incompetence are somewhat more important than the various levels of competence, it seems to me, is that incompetent people are often a very real threat to the stability of teams that they work in, while competent people usually aren’t.

The five levels of incompetence are, in increasing order of danger:


Level 1: The N00b.

There’s not much to say about the n00b since, let’s face it, we’ve all been there. Hiring n00bs is unavoidable in most situations. Being a n00b is a basic fact of life.

Danger: Low, assuming that they are properly sandboxed or are experienced enough to sandbox themselves. (Otherwise, they’re likely to hit “launch” instead of “lunch” and then you’re really in trouble.)


Level 2: Out of their depth.

Typically, this is someone who’s not really incompetent in general but who’s just been pushed up to a level of responsibility beyond their capabilities (a.k.a. a victim of the Peter Principle). I’ve seen this happen most often in situations where a senior, experienced person leaves the team and the leadership decides that they have to put someone equally senior in their place, regardless of whether that person can, you know, actually do the job. So they pluck someone who’s senior but not as experienced and plop them into the departing person’s chair.

Danger: Moderate. Because the person isn’t completely incompetent, they tend to be able to give the appearance of competence and avoid leading the team totally off the rails but usually end up leading the team in circles. So the team doesn’t make any forward progress and people eventually wise up and leave.


Level 3: Dumb and Dumber.

Now we start getting into the fun levels. This person is just plain incompetent, someone placed into a totally inappropriate position for them (which, for the most part, is going to be any position). I’ve actually encountered very few instances of this in my career, and it usually happens when someone transfers between two wildly different kinds of jobs. That tends to mask, for a little while anyway, their complete lack of ability to actually do anything under the guise of just being a n00b.

Danger: High to moderate. It really depends on how fast everyone figures out just how incompetent the person is; usually, truly incompetent people get shunted aside as soon as everyone figures out what’s going on. If that takes too long, competent people tend to get pissed off and leave.


Level 4: Bozo.

The difference between a Bozo and a Dumb and Dumber is that a Bozo is a Dumb and Dumber who thinks he is competent. Bozos tend to believe that they are as good as or better than everyone else, that they deserve special treatment, and that their genius is being under-rewarded. And they ignore the fact that they have absolutely no idea what they are doing.

My best Bozo story is a Program Manager that I worked with a long time ago. I implemented a feature he specified. He entered a bug saying that the feature didn’t work correctly. I resolved the bug “by design” after I verified that the feature worked exactly the way he specified it. He then came to my office and started to argue with me that I shouldn’t have resolved the bug “by design,” because the feature didn’t work correctly. Finally, I pulled out a copy of his specification and pointed at the paragraph that said exactly how the feature should work. He then got totally exasperated with me and started ranting that I was supposed to implement the feature “the way he wanted the feature to work, not the way he specified it!”

Danger: High. Bozos are always on the lookout to get ahead (in line with their great abilities), so they often manage to worm their way into management positions. They then tend to lead the team the way Mr. Toad drives motorcars: careening all over the road until they finally end up in the ditch. Bozos are adept at bringing down even the most experienced team in a surprisingly short amount of time.


Level 5: Evil Genius.

I was debating whether this is even a level of incompetence at all, because in many ways Evil Geniuses are not incompetent people. Quite the contrary, they are often quite adept at many things, including manipulation, spin, intimidation, self-aggrandizement, and sucking up. But I think in a deep sense, Evil Geniuses are just a more highly evolved form of Bozos because the end result tends to be the same: the team blows up in a very spectacular way. However, while a Bozo usually does this in a totally oblivious way (“What happened?”), it’s often all part of an Evil Genius’s plan to use the force of the explosion to propel them ever higher. These are the kind of guys who end up running major corporations and then running them totally into the ground. And then jumping ship to run an even bigger corporation. But, at the core, I think that Evil Geniuses act this way because they couldn’t actually figure out how to do things in an above-board manner. Thankfully, I’ve met very few Evil Geniuses in my day. And those I have met, I’ve been able to largely avoid.

There are only three types of programmers in the world…

…and they are:

  1. Programmers who want to write an operating system
  2. Programmers who want to write a compiler
  3. Programmers who want to write a database

It’s not that every programmer ever actually works on one of these, just that every programmer seems to dream of doing one of these things. It’s the primary reason why things like Linux exist. Yes, open source, blah, blah, blah, OS choices, blah, blah, blah, evil Microsoft, blah, blah, blah. But I would bet my bottom dollar that 9 out of 10 of the people donating their valuable time to the Linux project do so not because they want an alternative to Windows but because they always dreamed of being OS hackers. It’s also why there are so many damn programming languages out there: all the people who sit around dreaming of being, I don’t know, James Gosling or something.

(I think with the advent of the Internet, it’s likely that there’s now a fourth kind of programmer who wants to write websites, but I’m not totally sure about that yet.)

The interesting thing about these categories is that the Venn diagram tends, in my experience, to be pretty distinct: most “data” guys aren’t also “language” guys, and most “language” guys aren’t also “OS” guys, and so on. My theory is that it’s like the parable of the blind men and the elephant: although we all grapple with basically the same set of problems, each kind of programmer grapples with a different aspect of it.

The blind men and the elephant

I say all this because although I started out working in databases, it’s clear to me that I’ve always been a “language” guy. In college, I did so-so in the OS course and never touched a database course (I’m not even sure they were offered), but my compiler course netted me a special letter of commendation from the professor (the only one I ever got). Anyway, now I’m back in the “data” world as an even more confirmed “language” guy, and the most interesting thing is how many of the problems are the same, while the way they’re conceptualized, handled, or even talked about is different from what I’ve been used to while working on programming languages. It’s kind of… refreshing to see things in a different light. More on that soon.

I Heart Beagle Brothers

Jeff Atwood’s little entry on cheatsheets sure brought back some memories… I loved Beagle Brothers. As a general measure of comparison, I think Beagle Brothers had more cool in one little tip/trick box than Google has ever had with their cute variations on the Google logo. Definitely one of the things I look back on with fondness…

I’ve also thought about trying to create a VB.NET language cheat sheet one of these days, but it’s on that list of “things to do when I have time.” Yeah, right…

Latin as a prerequisite for programming?

I’m catching up on my blog reading and just plowed my way through Joel’s curmudgeonly “old guy” rant about The Perils of JavaSchools. I don’t have a lot to say about the central thesis of his rant — I’ve always been of two minds about the efficacy of the Darwinian theory of weeding the weak out through hazing-type classes — but there was an analogy that caught my eye:

Heck, in 1900, Latin and Greek were required subjects in college, not because they served any purpose, but because they were sort of considered an obvious requirement for educated people. In some sense my argument is no different than the argument made by the pro-Latin people (all four of them). “[Latin] trains your mind. Trains your memory. Unraveling a Latin sentence is an excellent exercise in thought, a real intellectual puzzle, and a good introduction to logical thinking,” writes Scott Barker. But I can’t find a single university that requires Latin any more. Are pointers and recursion the Latin and Greek of Computer Science?

I actually took four years of Latin in high school because I had had such a horrible experience trying to learn to speak French in middle school that I was desperate for any language that I didn’t have to listen to or speak. The joke ended up being on me, though, because when I took an Italian class in college, I realized that — difficulties with French aside — Latin was much, much harder to learn than most modern Romance languages. After all, in most of them a noun tends to have just two aspects: gender and/or number. In Latin, though, you have declensions in which the noun changes form based on its role in the sentence. Just that alone made Latin quite a challenge. And a pleasure, I might add, due to the fact that I had an excellent teacher.

Interestingly, though, I think that Latin actually has helped me a lot with my current job. After all, pretty much all you do in Latin class is translate Latin to English and back again on paper (unless you work in the Vatican). And, if you think about it, pretty much all compilers do is sit around day after day translating one language into another. So a lot of the same concepts and methodologies that I learned translating Arma virumque cano, Troiae qui primus ab oris… map fairly well into translating something like If x = 5 Then y = 10. Sure, there are lots of differences between human languages and computer languages, but at some level language is language. So I guess I’m one of those four pro-Latin people and maybe the only pro-Latin person who thinks that learning Latin might help you later when you learn computer programming…
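
To make the analogy a bit more concrete, here is a minimal, purely hypothetical sketch in VB.NET; the module name, the regular expression, and the C-style output are my own inventions for illustration, not anything from a real compiler. It reads the one-line If statement above and re-emits the same meaning in a different surface syntax, which is, in miniature, what compilation amounts to.

    ' A toy, hypothetical "translator" (invented for illustration): it recognizes
    ' the single-line VB form "If <var> = <value> Then <var> = <value>" and
    ' re-emits the same meaning in a C-style syntax.
    Imports System.Text.RegularExpressions

    Module TinyTranslator
        Sub Main()
            Dim source = "If x = 5 Then y = 10"
            ' Match the four interesting pieces: condition variable and value,
            ' assignment variable and value.
            Dim m = Regex.Match(source, "^If (\w+) = (\w+) Then (\w+) = (\w+)$")
            If m.Success Then
                ' Emit the C-style equivalent: if (x == 5) { y = 10; }
                Console.WriteLine("if ({0} == {1}) {{ {2} = {3}; }}",
                                  m.Groups(1).Value, m.Groups(2).Value,
                                  m.Groups(3).Value, m.Groups(4).Value)
            End If
        End Sub
    End Module

A real compiler, of course, parses into a tree and reasons about scopes and types rather than pattern-matching strings, but the core activity of reading one language and writing another is the same.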

(I should also add that the real payoff of Latin is the opportunity to translate some of the really great masters of Roman literature. Translating the Catiline orations by Cicero gives you a chance to see a real master politician and orator at work in the midst of a pretty gripping political thriller. And Virgil’s Aeneid — at least, the parts we made it through in a year — was just wonderful. While watching the otherwise wretched Troy, I was able to keep myself awake by speculating whether Aeneas would show up with his father on his back when Troy finally burned; the fact that he did was pretty much the only thing that I liked about that movie.)

Sometimes it’s the little victories that matter the most…

I think Rico’s spot on when he says that the real way you win the performance war is 5% at a time. Actually, I think he’s being overly optimistic — a lot of the time, it seems like you win the performance war 1% at a time. It’s much more like trench warfare than blitzkrieg.

There’s also a larger idea at work here. Rico’s point is that in a mature product, you shouldn’t be able to come up with a huge performance win in most cases because, if you can, why didn’t somebody think of it before? The thing is, this applies to pretty much any aspect of a mature product. As we think about the future of Visual Basic, I can assure you that we all sit around dreaming of the revolutionary new feature that will return us to the days of explosive growth that the product experienced early in its life. And, hey, it’s always possible that we’re going to latch on to the next game-changing development methodology that will revolutionize how people write programs and cause an influx of another 30 million or so programmers. It just isn’t likely. After all, a lot of very smart people inside and outside of Microsoft have been looking at this problem for a very long time, and so far we haven’t gotten radically beyond many of the fundamental ideas that made VB so hot a decade ago.

It’s also why I really don’t envy the guys working on Office. After all, if you’re a developer in Word, what are you doing? It’s not like there’s some new radical paradigm for text editing out there — we’ve wrung most of the major gains that are to be had out of WYSIWYG. Same goes for Excel — the spreadsheet metaphor has reached a high level of maturity. So what do you do besides dreaming up newer and newer ways to arrange your toolbars and menus? Collect your paycheck and go home?

This is where we get back to trench warfare. Even though, yes, a lot of the “big ideas” have been pretty much mined out, it’s not like we’ve reached a state of perfection. Looking at Word and Excel and Visual Basic, there are still lots and lots and lots of little things that can be better. Refactoring isn’t going to revolutionize programming the way that a GUI builder did, but it’s still a nice, incremental improvement over what came before. It’s the 5% gain or the 1% gain instead of the 50% gain, and that’s in many ways just where we are as an industry.

Personally, I would love to be the guy who dreams up the next really big thing in the programming world, the one that’s going to put my name in the history books (or, at least, computer history books). And, who knows? Maybe I’ll win the lottery. It happens. But if that day never does come, I’d still be happy improving the lives of VB developers by 50% or 75% just by making those incremental improvements that make their lives easier one step at a time. It might not be enough to get us another 30 million developers in one shot, but in the long run, who knows?

Everything old is new again…

In one of the comments to the “Introducing LINQ” entry that I wrote, Unilynx wrote:

Sounds like what we’ve been doing for five years already 🙂

This was a comment that came up several times at the PDC from various sources: “What’s so revolutionary about this stuff? We’ve been doing this kind of thing for years!” On the one hand, what’s unique about LINQ is how it’s built, its openness and flexibility, and its unification of data querying across domains. But on the other hand, yeah, let’s be honest: as Newton would say, if we’re seeing further, it’s only because we’re standing on the shoulders of giants. My standard response to this line of thought is: there are really only 15 good ideas in computer science, and all of them were discovered thirty years ago or more. What happens is that the programming world just rediscovers them over and over and over again, each time pretending like the ideas are brand new.
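
For anyone who hasn’t run into it yet, here is a minimal sketch in VB.NET of what that unification looks like; the names and the data are invented for illustration. The same From/Where/Select query shape applies whether the data is an in-memory array, an XML document, or a database table sitting behind a LINQ provider.

    ' A minimal illustration (invented example): LINQ query syntax over a plain
    ' in-memory array. The same query shape works over XML or a database table
    ' when the appropriate LINQ provider is used.
    Module LinqSketch
        Sub Main()
            Dim numbers = {1, 2, 3, 4, 5, 6, 7, 8}

            ' Keep the even numbers and square them.
            Dim evenSquares = From n In numbers
                              Where n Mod 2 = 0
                              Select n * n

            For Each value In evenSquares
                Console.WriteLine(value)   ' 4, 16, 36, 64
            Next
        End Sub
    End Module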

Erik Meijer had a good comment in the languages panel that if you want to know what the next big thing in programming is going to be, all you have to do is look at what was hot twenty years ago. Because that tends to be the length of time it takes for the wheel to turn a full crank…