BA 42/52: The Monkey Wrench Called Optimism
We like to think of ourselves as rational creatures. We watch our backs, weigh the odds, pack an umbrella. But both neuroscience and social science suggest that we are more optimistic than realistic. On average, we expect things to turn out better than they wind up being.
- Tali Sharot (The Optimism Bias)
Every machine and system will eventually meet its monkey wrench: something that turns a sound system upside down.
A common refrain in software engineering is that more than half of projects run over budget and well behind schedule.
We tried to replace our well-crafted, detailed plans (waterfall) with ones that shifted with the arc of time and accommodated whatever reality might throw at us (agile). Yet the pattern of overruns and delays persists.
The issue is not one of mechanics or process; it is something more human — optimism.
As much as we see ourselves as rational creatures capable of weighing the pros, the cons, past experiences, and lessons learned, we still persist in hoping that this time things will be different, that they will be better.
This human trait manifests itself perfectly in software engineering.
“We expect it will only take us a sprint to deliver this feature.”
“We think a week worth of testing is fine.”
“Well, we are only planning a week between design hand-off and updating from eng UI spec to production UI spec. It will be fine, it’s just a paint job change.”
“We discussed this early; stub APIs for now and then we will switch over to final production APIs before deploying to PROD.”
“Testing shouldn’t need too much time; a day turnaround is not too much to expect.”
All of the above statements are innocuous at face value, but each is steeped in the optimism bias that everything will be alright. Happy thoughts.
“We expect it will only take us a sprint to deliver this feature.” — What if we encounter a more difficult issue in the final stretch, close to the deadline, and it pushes us past a sprint?
“We think a week worth of testing is fine.” — What if there is a sudden absence of testing resources, or testing finds so many failures that it can’t even get through a single end-to-end test pass within the allotted time?
“Well, we are only planning a week between design hand-off and updating from eng UI spec to production UI spec. It will be fine, it’s just a paint job change.” — What if the UI spec as designed differs from how the eng spec was built underneath? There was a miscommunication.
“We discussed this early; stub APIs for now and then we will switch over to final production APIs before deploying to PROD.” — What if integration testing reveals showstoppers that require re-architecting the API design?
“Testing shouldn’t need too much time; a day turnaround is not too much to expect.” — What if testing can’t turn something around in 24 hours?
You would be surprised how often optimism creeps into planning and decision-making. Why? Again, it is not mechanical but human. We hope for everything to be alright.
So, how do we fight, or at least account for, this human trait when charting out our glorious, ambitious features, products, and roadmaps? I have some ideas:
Pre-mortems — Shreyas Doshi advocates this marvelous technique where you game out, at the beginning of the project, “what if everything that can go wrong goes wrong, what do you do?”. The idea is simple, easy, and something you can “implement on Monday”, according to Shreyas.
Risk Register — Something I picked up from Adam and Brad’s book Risk Up Front and leveraged extensively when working on Android Things at Google. Similar to pre-mortems: as part of the planning phase, for every critical milestone or step, ask the team to imagine all possible risks that could be encountered and possible solutions to those risks. Note each risk and its solution down in a spreadsheet or document for reference. This way you are prepared with quick solutions to deploy if and when you hit a risk. You can be optimistic BUT prepared.
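To make the mechanics concrete, here is a minimal sketch of a risk register as code. The milestones, risks, and mitigations below are invented examples; in practice, as noted above, the register usually lives in a shared spreadsheet or document.

```python
from dataclasses import dataclass


@dataclass
class RiskEntry:
    milestone: str   # the critical step this risk is attached to
    risk: str        # what could go wrong
    mitigation: str  # the pre-agreed response

# Hypothetical entries for illustration only.
register = [
    RiskEntry("Feature complete", "Key engineer unavailable near deadline",
              "Pre-arrange a backup owner for each critical work item"),
    RiskEntry("Test pass", "End-to-end pass blocked by failures",
              "Reserve a second test window and triage blockers daily"),
]


def risks_for(milestone: str) -> list[RiskEntry]:
    """Quick lookup when a milestone starts to slip."""
    return [entry for entry in register if entry.milestone == milestone]
```

The point is the shape, not the code: every critical milestone carries its risks and pre-agreed mitigations, so when something slips, the response is a lookup, not a scramble.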
Pragmatic Decision Makers — There are two places where optimism bias is most common: planning and decision making. They are two sides of the same coin. If plans can be built on hopes, so can decisions, like “add more people to the problem and it will be fine” or “maybe just another week to finish it off” (when we all know it’s best to cancel the project). Whenever you are making a decision, check yourself: how much optimism is present in this decision I am about to make? Ruthless decision making is something I learned full on at Apple: for every yes, there are a thousand nos.
The Green Book Methodology — Oh, I love this one; you can learn wonderful things from unexpected places. This is especially important for consulting and outsourcing services companies dealing with clients. Her Majesty’s Treasury (soon to be His Majesty’s Treasury) publishes a book called The Green Book which, in simple terms, provides government agencies a data-driven approach to analyze, evaluate, and plan public projects. The basis of these guidelines and frameworks is analysis of past public projects to assess and find trends that can help adjust for potential overruns. There is a special chapter in there on… optimism bias in infrastructure projects, including IT projects. Over time, as you continue to take on more difficult and challenging projects, track the baseline-to-actual plan delta. Develop your own Green Book which you can use to add buffer or risk tolerance to your next client project.
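In code, the “track your own delta” idea might look like this minimal sketch. The project numbers are invented, and the calculation is my own illustration of the approach, not a formula lifted from The Green Book:

```python
# Past projects as (planned_weeks, actual_weeks) pairs -- invented numbers.
history = [(10, 13), (8, 9), (12, 17), (6, 7)]


def optimism_uplift(history: list[tuple[float, float]]) -> float:
    """Average plan-to-actual overrun, as a fraction of the original plan."""
    deltas = [(actual - planned) / planned for planned, actual in history]
    return sum(deltas) / len(deltas)

# Apply your historical overrun rate to the next baseline plan.
uplift = optimism_uplift(history)   # roughly 0.25 for the data above
adjusted_plan = 10 * (1 + uplift)   # a 10-week baseline becomes ~12.5 weeks
```

Your own history becomes the uplift factor: the more projects you log, the more honest the adjustment gets.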
Accountability &amp; Honesty — I cannot stress this enough: accountability and honesty make or break everything. I know everyone in the org chart has various pressures to deal with. However, when engineers give their leader a realistic plan, only for said leader, in an effort not to upset anyone or to play politics, to oversell it to senior leadership, no one wins. Put the DRI framework in place. Trust your experts and the people who know the code. Sometimes, all you need to do to counteract optimism is set the right expectation.
Adjustable Risk Tolerance Buffer — With every plan, assign a percentage confidence value to how likely you believe the project is to go over budget or over its planned time. Now, instead of adding a blind buffer to everything, derive a corresponding percentage value, the Risk Tolerance Buffer, which you use to add money to your budget or time to your project plan. As development proceeds, uncertainty in the project will drop and your confidence in the plan will rise; therefore you can continue to adjust that buffer down.
Why is this better? Rather than blindly adding a sprint here and there, or a few thousand dollars on this line and that, you take a more analytical approach. Coupled with the Green Book suggestion earlier, this gives you a data-driven way to develop a more robust planning and decision-making framework for your projects.
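As a sketch, assuming a simple linear mapping from overrun confidence to reserved buffer (the function and the `max_buffer` cap are my own illustration, not a standard formula):

```python
def risk_tolerance_buffer(base: float, overrun_confidence: float,
                          max_buffer: float = 0.5) -> float:
    """Amount of `base` (weeks or dollars) to hold in reserve.

    overrun_confidence: 0.0 (sure to hit plan) .. 1.0 (sure to overrun).
    max_buffer: largest fraction of the base you are willing to reserve.
    """
    return base * overrun_confidence * max_buffer

# At kickoff: 80% confident of an overrun on a 12-week plan.
kickoff_buffer = risk_tolerance_buffer(12, 0.80)  # about 4.8 weeks in reserve
# Mid-project: confidence of an overrun has dropped to 30%.
midway_buffer = risk_tolerance_buffer(12, 0.30)   # about 1.8 weeks
```

The exact mapping matters less than the habit: re-estimate the confidence at each milestone and release (or grow) the reserve accordingly, instead of carrying one blind buffer from kickoff to ship.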
This is not meant to be an exhaustive list, but something to get you started. I am confident many of you have recognized the sound of optimism bias in your projects. I am also super confident that many of you have developed your own frameworks and approaches to counteract it.
Now, a final thought — optimism is not a bad thing. The greatest of empires, the most revolutionary products, the game-changing companies, and the rebellions in space operas all have one thing in common: optimism.
Optimism and hope that a group of like-minded and crazy-enough people can achieve the impossible. The cautionary tale is when, blinded by said optimism, we throw caution to the wind only to be surprised when things don’t go according to plan.
It is not foolish or obstructionist to build a great plan, take pause, look at the team, and say: Okay, now, what if something goes wrong? What do we do then?
Until next time 👋!
How is this week’s newsletter?