
What Programmers Don’t Know – Dealing with Cognitive Bias

21 Dec, 2016

The first principle is that you must not fool yourself – and you are the easiest person to fool. – Richard Feynman

So imagine this scene.

You and your colleagues are sitting around a computer at work, staring at a log message. It’s an error. Only the whirr of the computer’s fans breaks the silence.

This is an unnatural situation, and it won’t last for long. Developers don’t stare at an error message in silence.

Someone will curse, or exclaim “But that’s impossible!”

Inevitably, a theory will be developed.

“We’ve had so many users losing their sessions lately.”

“But why would that be? It’s never happened before, and it’s not like we changed anything.”

“Nothing’s changed?”

“Nothing.”

“OK, but we don’t know what’s happening on the client side. We never hear from those guys.”

“I heard they pushed out a new version last week.”

“Really.”

“Yeah.”

“You wanna find out what they did?”

“OK, let’s check out their code.”

And so begins a long journey down a rabbit hole.

That little piece of dialog indicates all kinds of problems, but one of them is mistaking suppositions for knowledge.

Has it ever happened to you? You spent ages trying to work out a weird bug, only to find it wasn’t weird at all – just hiding in an unexpected place.

Today we’re going to find out what causes this peculiar blindness, and what we can do about it.


Lies programmers tell themselves.

But you know, it’s not just programmers.

Everyone, and I mean everyone, is affected by this. Mostly, it’s down to cognitive bias, or what I like to call the BCA – Butt Covering Autopilot.

If you check out the Wikipedia page for cognitive bias, you’ll find a whole menagerie of biases waiting to trip us up.

The two most interesting ones, for our discussion, are confirmation bias and belief bias.

Confirmation bias is where you favour whatever evidence conforms to your preconceptions. For example, “Those front-end JavaScript savages can’t code for nuts. The problem’s definitely in the front end.”

And belief bias? This is where we judge the strength of an argument by how believable its conclusion is, rather than by its logic. Like, the UX reviewer says major changes are required for a complex frontend component you just “finished”. At the end of a long day, whatever that guy says is going to sound bananas.

So, you can see why I call this the Butt Covering Autopilot, right?

We’ve got this kind of automatic protection in our brain that’s always working to make us feel good about ourselves and our decisions.

Sadly, there’s maybe one other person in the world with the same protective mechanism working on your behalf – and that’s your mother.

To be an effective problem solver, you’ll need to learn to see beyond these built-in biases.

And that takes one little thing.

Overcome bias to solve problems effectively

Here’s the thing – if you adopt the scientific standard for gaining knowledge, you’ll realise that you (and I, and everyone else) “know” very little.

Of all the declarations you hear people making day in and day out, how many have been verified by objective measurement, controlled studies, or independent replication?

Mostly none.

And you know, it’s no big deal.

Does it really matter whether episode five is the best of the Star Wars films, or whether your buddy at the next desk really had right of way in his latest fender-bender?

Not so much.

If we start insisting on independent scientific verification of every single tiny little thing someone says, we’ll start sounding like this guy:

[Image: Sheldon Cooper from The Big Bang Theory, in action]

And don’t get me wrong – intuition really is a thing, and some people are incredibly good at making informed guesses with limited information.

But when it comes to building complex systems in collaboration with other people – and mistakes cost a lot of time and money – then replacing guesses with knowledge matters.

So what’s the first step?

It’s easy – admit (to yourself at least) that you know nothing.

Actually, that’s not easy!

Sure, you’re an expert in your field, and you’re qualified to investigate difficult problems and fix them. But faced with a new problem, you and I are as ignorant as everyone else.

OK then, you don’t know. Where to from here?

Let’s see about going from ignorance to gaining some verifiable knowledge.

The first step to knowledge

Here’s what Kent Beck says in his article, ‘Mastering Programming’:

“Concrete hypotheses. When the program is misbehaving, articulate exactly what you think is wrong before making a change.”

I like this because:

  • It forces you to admit that what you think is wrong is a hypothesis.
  • You must make a prediction out loud, which leaves you nowhere to hide. You’re about to prove where the problem is, or not.

And you know what hypotheses are for, right? Testing.

So the way we move from ignorance of the cause of a problem to knowledge about it is to first form a hypothesis about the cause, and then devise a test that will prove or disprove that hypothesis.
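For instance, sticking with the lost-session mystery from the opening scene, a hypothesis like “requests arriving without a session cookie are silently given a brand-new session” can be written down as a test before anyone touches the code. Here’s a minimal sketch in Python – the function and all the names in it are hypothetical stand-ins, not code from any real system:

    import unittest
    from unittest.mock import Mock

    # Hypothesis: sessions are "lost" because a request that arrives
    # without a session cookie is silently given a brand-new session.
    # (All names here are hypothetical stand-ins.)

    def resolve_session(request, session_store):
        """The code under suspicion, reduced to its simplest form."""
        session_id = request.cookies.get("session_id")
        if session_id is None:
            return session_store.create()   # the suspect branch
        return session_store.get(session_id)

    class LostSessionHypothesis(unittest.TestCase):
        def test_request_without_cookie_gets_new_session(self):
            request = Mock()
            request.cookies = {}   # the client "forgot" the cookie
            store = Mock()
            resolve_session(request, store)
            # The prediction, stated before running the test: create()
            # is called, i.e. the user silently loses their old session.
            store.create.assert_called_once()
            store.get.assert_not_called()

    if __name__ == "__main__":
        unittest.main()

Run it, and the hypothesis either survives or dies – either way, you’ve swapped a guess for a result.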

A great thing to do at this point (also mentioned in Beck’s article) is to reduce the problem to its simplest form, and test that.

This can involve something as simple as putting some debug output in your logs, or re-implementing the code in the simplest way possible to make it more testable.
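If a full test feels like overkill, even a couple of targeted log lines can settle a hypothesis – as long as you state in advance what you expect them to show. A sketch using Python’s standard logging module, reusing the same hypothetical function as above:

    import logging

    logger = logging.getLogger(__name__)

    def resolve_session(request, session_store):
        session_id = request.cookies.get("session_id")
        # Prediction: for the affected users, this line will log
        # session_id=None, i.e. the cookie never reached the server.
        logger.debug("resolve_session: session_id=%r", session_id)
        if session_id is None:
            logger.warning("no session cookie; creating a new session")
            return session_store.create()
        return session_store.get(session_id)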

And of course, you can just step through the code in your debugger, which is a great thing to do when you run your code for the first time anyway.
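In Python, for example, the built-in breakpoint() call (available since Python 3.7) drops you straight into the debugger at the suspect line, with the live state in front of you:

    def resolve_session(request, session_store):
        session_id = request.cookies.get("session_id")
        breakpoint()  # pauses in pdb; inspect session_id, then step with "n"
        if session_id is None:
            return session_store.create()
        return session_store.get(session_id)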

If you’ve proven the hypothesis, great! Get on with fixing the problem (but still make predictions, out loud, about the effect your change will have).

If your test doesn’t prove the hypothesis, or is inconclusive, you need either another hypothesis, or a better test.

At this point you might want to try rubber ducking to introduce another point of view into your problem solving – another great way to make you leave your unconscious biases behind.

But wait, there’s more!

So, you put in the hard work to get some concrete results out of your debugging.

This carries two benefits – the first is that you’ve either conclusively proven that the problem isn’t in your code, or that it was in your code, and you’ve fixed it.

The second benefit is that this approach produces hard evidence. Remember, you’re not the only one with cognitive biases. If the results of your work are hard for other people to swallow, you’re now armed with actual proof of the problem and its solution.

Now that’s hard to argue with.

Wrapping up programmers’ cognitive bias

So here we are. I’m curious – what do you think about cognitive bias, and uncertainty about what you know?

I think that discovering and developing this peculiar ignorance is at first worrying, and finally liberating.

It’s worrying obviously because it eats at the foundations of what we believe about our value as knowledge workers.

Liberating?

Because once we free ourselves of the illusion of knowledge, we’re ready to really start learning.



2 Responses

  1. Laura says:

    Very well explained!

    I agree that the biggest blockage in the process of learning is you yourself. It’s really hard sometimes to accept the reality that what you already know is just a fraction of the total, despite the fact that you’re highly experienced in your field.

    Acceptance of ignorance is the key to success and then you can start your journey of gaining knowledge by unlocking the door!

    • Luke says:

      Thanks Laura!
      I absolutely agree, accepting ignorance is the first step to learning. A hard thing to do in an industry where ‘smarts’ are so highly valued!
