Quote of the week – second best

… distortions in one market … can justify government intervention in other, related markets. Second-best arguments have a dubious reputation in economics, because the right policy is always to eliminate the primary distortion, if you can.

Paul Krugman, The big green test

Before jumping into my commentary, I want to emphasize that this is not intended to be a political “quote of the week”, and any comments that are political rather than technical will not be approved. Actually, since I haven’t put up the page with my moderation policy yet, I will point out that any comments that are religious rather than technical will also be consigned to /dev/null. Of course, if Pope Francis or President Obama ever venture an opinion on lambdas vs. std::bind feel free to comment on that subject.

I suggest reading the whole piece, since I had to mangle the quote to get it into a reasonable form. The reason I am quoting Paul Krugman at all is that I think the economic distortions he is talking about have an equivalent in code. Something big is wrong (call it X) and can’t be changed (or someone with some authority believes it can’t be changed), so the team spends the rest of the project working around it, usually with increasingly fragile hacks.

I believe that the cost of not changing X is routinely underestimated. The cost doesn’t come in one big lump, it is spread out over weeks and months of having to work around X – a colleague of mine referred to these problems as “sharp corners”. They don’t actually stop you from doing something, but they make it unpleasant enough that you try to avoid the sharp corners. That leads to second-best solutions, which then lead to further sharp corners and more second-best solutions. Changing X costs you in the short term; not changing it results in smaller but cumulative costs over the long term.

You need to know why you can’t change X. The reasons are important: they might apply now, but they might not apply in 6 months. Take the long view – do you still want to be dealing with X in 2 years (or 5 years or 10 years)? Even if you can’t change X right now, perhaps there are incremental changes you can make so that it will be possible to change X at a future date.

Think about what X would look like if you could change it. How close can you get to that? You might be able to wrap X, although wrapping comes with costs of its own. I have worked on systems that wrapped almost everything and they were a pain to deal with.
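As a minimal sketch of the wrapping idea – assuming, purely for illustration, that X is a legacy logging function with magic severity numbers (legacy_log and Logger are invented names) – a thin wrapper can give callers the interface we wish X had, so that if X ever does become changeable, only the wrapper has to move:

    #include <cstdio>
    #include <string>

    // Stand-in for X: a legacy function we cannot change. It demands a
    // pre-formatted C string and a magic severity integer.
    void legacy_log(int severity, const char* message) {
        std::printf("[%d] %s\n", severity, message);
    }

    // Thin wrapper exposing the interface we wish X had. Callers depend
    // only on Logger, so the fragile details stay in one place.
    class Logger {
    public:
        void warn(const std::string& message)  { legacy_log(2, message.c_str()); }
        void error(const std::string& message) { legacy_log(3, message.c_str()); }
    };

    int main() {
        Logger log;
        log.warn("disk nearly full");  // callers never see the magic numbers
    }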

I have been on teams that failed to get to grips with their version of X, and I have been on teams where we did manage to handle X well. It is wonderful when you can take something that was causing constant pain and eliminate it – it’s like finally getting a toothache treated.

8 thoughts on “Quote of the week – second best”

  1. CarlD says:

    I like the analogy – have grappled with many X’s in the last 30+ years of software development.

    Editorial nits:
    “…you try _to_ avoid the sharp corners…”
    “…perhaps there _are_ incremental changes…”

    (proofreader hat is permanently attached).

  2. Daniel says:

    I definitely agree. I have had more than enough “We cannot fix bug XY because this would break existing code” arguments that led to years of pain.

    Sometimes you just need to force users to adjust.

  3. On the topic at hand, I’ve been wrestling with this lately in my transition from classic brute-force web scripts to heavily databound Silverlight thick clients, and now to server-side MVC. Each has “sharp corners” where it would be easier to do something using another technique. I guess that’s not a great example, because you can’t do what Krugman alludes to and eliminate the root issue.

    But really, can you ever truly “eliminate” any of these issues? All you can do is make different trade-offs. Like you mentioned, time horizons can be short or long, and each has a different schedule for paying you back. I know I will often code a “sharp corner” on purpose, but try to put an API over it, so that I can come back later and round that corner off without too much difficulty.

    For example, right now my MVC app caches a huge swath of the database, more than it should, but I can get away with it because the app is young. When it gets older I won’t be able to cache everything this way. But that’s okay, because all the “calling code” works independently of the cache; it’s not written to assume things are cached. So when it comes time to smooth that corner off, it’ll be okay. And it has been a sharp corner… every time I deploy, the cache is invalidated, and it seems like somebody gets hit with a null reference exception!
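    (A minimal sketch of that separation, with invented names – callers go through ProductRepository::get and never learn whether the result came from the cache, so the caching policy can be rounded off later without touching them:)

        #include <string>
        #include <unordered_map>

        struct Product { std::string name; };

        // Invented names for illustration. Callers use get() and never touch
        // the cache directly, so the caching policy can change without them.
        class ProductRepository {
        public:
            Product get(const std::string& id) {
                auto it = cache_.find(id);
                if (it != cache_.end()) return it->second;  // cache hit
                Product p = loadFromDatabase(id);           // miss: fetch for real
                cache_[id] = p;  // policy decision, hidden behind the API
                return p;
            }
        private:
            Product loadFromDatabase(const std::string& id) {
                return Product{"product-" + id};  // stand-in for a real query
            }
            std::unordered_map<std::string, Product> cache_;
        };

        int main() {
            ProductRepository repo;
            Product first = repo.get("42");  // miss: loaded from the "database"
            Product again = repo.get("42");  // hit: served from the cache
            (void)first; (void)again;
        }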

    Going meta… Kudos for venturing out into semi-uncharted quote territory. My own personal quote list http://johndehope.weebly.com/quotes.html includes both Gandhi and also Donald Rumsfeld. Even a stopped clock is right twice a day, eh? Can you tell us a bit later roughly how many more comments you had to reject than usual? I’m curious.

    • Bob says:

      But really, can you ever truly “eliminate” any of these issues?

      If you are being strict about definitions, no, you probably can’t eliminate the problems, but I have found it is often possible to reduce the pain from them enough that they are effectively eliminated. Your point about trade-offs is an important one – there are nearly always trade-offs, and it drives me crazy when developers refuse to admit there are trade-offs.

      Hiding things behind a good API is a great way to make sure that short term hacks are contained. Writing a good API is an art in itself – just after I posted this there was a discussion on Hacker News about “the worst API ever made” which has some interesting links to books and talks about designing APIs.

      Sadly I have worked on teams that didn’t believe in designing APIs at all – good or bad. Just make all the members public so that anyone can do anything they want. A senior developer on that team informed me that encapsulation was “a dogmatic OO technique”.

      I have not had to reject any comments, I suspect because I have few enough readers that I haven’t attracted trolls yet. Of course, my readers are also far too smart to get pulled into off topic discussions.

      • I’m going to try to dig up a nice article on API design. Not something overly academic, but more a rule-of-thumb sort of thing. One such rule I apply is “prefer fewer arguments”. Just because I think (or know) the client has some data doesn’t mean I let them provide it. I’ll look it up again within the implementation. It just feels cleaner to me.
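        (A tiny sketch of that rule, with invented names – the caller surely knows the user’s email, but the narrower signature looks it up internally instead of accepting it as an argument:)

            #include <cstdio>
            #include <string>

            struct User { std::string id; std::string email; };

            User findUser(const std::string& userId) {
                return User{userId, userId + "@example.com"};  // stand-in for a real lookup
            }

            // A wider alternative would be sendReceipt(userId, email), trusting
            // the caller to supply data the implementation can fetch itself.
            // The narrower signature takes one argument, so there is no chance
            // of a stale or mismatched email being passed in.
            void sendReceipt(const std::string& userId) {
                User user = findUser(userId);
                std::printf("sending receipt to %s\n", user.email.c_str());
            }

            int main() { sendReceipt("42"); }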

        “I have not had to reject any comments, I suspect because…” Also, the style of your writing attracts a certain kind of audience. Your posts don’t invite trolling. Sure, we’re smart and well dressed and good looking. But we’re also self-selecting for an appreciation of your tone and thoughtful critique of programming issues.

  4. John Payson says:

    I find it interesting how many design decisions that were made decades ago still plague programmers today, *even in cases where compatibility isn’t required*. For example, I don’t particularly fault K&R for using a leading zero to represent octal notation. Octal notation was needed often enough, and memory was sufficiently tight, that a single-character prefix was preferable to a two-character one, and C didn’t want to rely upon the availability of a character like $ or @ for the purpose. I further understand that because there exists a significant body of C code which uses leading-zero octal notation, it’s not possible to assign any other meaning to integer constants written with leading zeroes. If leading-zero notation had been deprecated at the same time as old-style function declarations were, programmers today wouldn’t have to worry about it except when importing really ancient code that compilers wouldn’t particularly like anyway. The failure to fix the issue, however, means that it continues to plague programmers decades later.
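    (A minimal C++ illustration of the trap: zero-padding constants for visual alignment silently turns them into octal.)

        #include <cstdio>

        int main() {
            // Zero-padded for alignment - but a leading zero means octal,
            // so 010 is 8 and 017 is 15, not 10 and 17.
            int offsets[] = {010, 017, 100};
            std::printf("%d %d %d\n", offsets[0], offsets[1], offsets[2]);  // prints: 8 15 100
        }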

    What I find most interesting, however, is that such notation seems not only to have infected a particular language, but also a culture, since a number of languages invented in the decades since continue that usage. Perhaps on a “last in, first out” basis, JavaScript/ECMAScript is trying to undo the damage (though IMHO it should have provided a new notation for octal, such as 0q123 or 8×123, since octal is occasionally useful). Perhaps the maintainers of that language are aware that the sooner the problem is nipped in the bud, the less total annoyance it will cause?
