
Data literacy and thinking in uncertain terms

September 06, 2022 | 3 minute read
James Arrow


Coping with change and transition

Wisdom is knowing when to update your beliefs and revise your original plans. That might sound straightforward, but it often goes against instinct. Most people take pride in their knowledge and expertise, resolutely staying true to their beliefs and opinions, and the traditional view of a leader is one who holds steadfast. That may have served us well in a stable world, but the real world today is not like that. We live in a transitory world where the only certainty is change, as I mention in part one of this series, Improving Project Management Performance Capabilities.

For many decades, convention within the project management world has been to describe projects in certain terms, developing and reporting against single-point cost and schedule estimates. This can be misleading, implying a level of confidence in a project plan that is simply not warranted. Consequently, some leaders or project sponsors may be overconfident, or subscribe to wishful thinking (more accurately, desirability bias), and question the need to add contingency money for unknowns. Too many organizations continue the outdated practice of adding a fixed contingency sum. Best practice is to scrutinize all available data (covered in our previous post, Collecting Data That Matters) and attempt to quantify known risk more accurately. By performing a stochastic analysis, teams can better appreciate the full range of likely outcomes and agree to fund contingency at a level that aligns with their risk appetite.
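To illustrate the difference between a single-point estimate and a stochastic view, here is a minimal Monte Carlo sketch in plain Python. The line items, cost ranges, and the P80 funding level are purely hypothetical, chosen for illustration; this is not output from Safran Risk or any real project.

```python
import random

random.seed(42)

# Hypothetical line items as (low, most likely, high) cost estimates in $k.
line_items = [
    (90, 100, 130),   # civil works
    (45, 50, 70),     # equipment
    (20, 25, 40),     # commissioning
]

# The conventional single-point estimate simply sums the "most likely" values.
base_estimate = sum(ml for _, ml, _ in line_items)

# Stochastic view: sample each item from a triangular distribution and sum.
N = 10_000
totals = sorted(
    sum(random.triangular(lo, hi, ml) for lo, ml, hi in line_items)
    for _ in range(N)
)

# Fund contingency at a chosen confidence level (here the 80th percentile).
p80 = totals[int(0.80 * N)]
contingency = p80 - base_estimate
print(f"Base estimate: {base_estimate}, P80: {p80:.0f}, contingency: {contingency:.0f}")
```

Because the cost ranges are skewed toward overruns, the simulated P80 sits above the single-point base estimate, which is exactly the gap a fixed contingency sum tends to miss.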

Bias and the overconfidence cycle

In his book Think Again, Adam Grant highlights how people often fall into an overconfidence cycle: bias tends to validate a belief, inflating the ego so that pride reinforces convictions; this in turn enables further bias, and so the overconfidence cycle continues.

In his book Superforecasting: The Art and Science of Prediction, Philip Tetlock describes something similar. In a landmark 2005 study, drawing on more than two decades of research and the results of a massive, government-funded forecasting tournament, he found that people with poorer forecasting skills organized their thinking around big ideas. They were confident in their beliefs, likely to declare things impossible or certain, and reluctant to change their minds even when presented with evidence that their predictions had failed. By contrast, more pragmatic experts with better forecasting skills gathered as much information as they could from as many sources as possible. This group was comfortable talking about possibilities and probabilities: they thought in uncertain terms, acknowledged when they were wrong, and were willing to change their minds. Tetlock referred to this commitment to self-improvement and belief updating as perpetual beta (referenced in our first post in this series, Establishing The Risk Culture). Interestingly, Grant similarly describes how employing a rethinking cycle positions people to break out of the overconfidence cycle.

In the project management world, teams who are open to novel ways of working, and able to collect and review diverse but relevant risk data, are better positioned to embrace doubt and overcome the influence of bias. In a rethinking cycle, doubt promotes curiosity to discover new insight and invites intellectual humility. Read more about Data That Matters in part five of this blog series.

[Graphic: Data literacy and uncertain thinking mini-series]

Choosing courage over comfort

Emerging, relevant data should prompt courage over comfort, nudging teams to champion uncertain thinking and revise their plans in the face of dynamic, unforeseen conditions. In this spirit, there is growing awareness within the project controls community that the standard practice of quoting pre-determined accuracy ranges such as +/- 10% has been misleading, implying levels of confidence that are not warranted and erroneously anchoring team risk assessments. A more meaningful, data-literate approach is for teams to use Monte Carlo simulation tools such as Safran Risk to express estimate accuracy as a percentage range at the 90% confidence interval.
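To make that last point concrete, here is a minimal sketch in plain Python of expressing a 90% confidence interval (P5 to P95) as a percentage range around a single-point estimate. The lognormal cost outcomes are purely illustrative, not output from Safran Risk or any real analysis; in practice the outcomes would come from the team's own Monte Carlo model.

```python
import random

random.seed(7)

# Illustrative simulated total-cost outcomes around a base estimate of 100.
# A right-skewed lognormal shape stands in for a real simulation's results.
base = 100.0
outcomes = sorted(random.lognormvariate(0, 0.15) * base for _ in range(10_000))

# Take the 5th and 95th percentiles, bounding the central 90% of outcomes.
n = len(outcomes)
p5, p95 = outcomes[int(0.05 * n)], outcomes[int(0.95 * n)]

# Express the 90% confidence interval as a percentage range about the base.
low_pct = (p5 - base) / base * 100
high_pct = (p95 - base) / base * 100
print(f"90% CI: {low_pct:+.1f}% / {high_pct:+.1f}% of base")
```

Note that the resulting range is asymmetric, with more room on the upside: a data-derived statement like "-22% / +28% at 90% confidence" is more honest than a symmetric, pre-determined "+/- 10%".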

Finally, project teams can bring more realism to planning and project delivery if they choose not to lean entirely on an internal view or the judgement of their own team, built from the bottom up. Emerging best practice, adopted by several government procurement agencies around the world, is to run quantitative risk analyses (QRA) in parallel with secondary models that provide an independent, outside view. This approach typically draws on top-down empirical or historical data; however, advances in machine learning now make it possible to generate an independent outside view artificially from the bottom up. This opens the door to the exciting prospect of more frequent, automated risk analyses that can improve project prediction accuracy for the good of all.

Doubt and continual improvement are your friends in disruptive times like these, and your team needs best-in-class software tools like Safran Risk to help capture and review data from a range of sources. Feel free to contact Safran if you would like to speak to a trusted advisor about your assumptions and internal organizational challenges.

[Graphic: Data literacy and uncertain thinking consultation]