
Bonini’s Paradox and the art of developing reasonable estimates

September 27, 2022 | 5-minute read
James Arrow

Beyond traditional pricing and plans

In the summer of 1990, I began work as a project surveyor. Based at the site office, I represented an HVAC subcontractor working to build a community hospital in Central England. Among my tasks at that time, I helped price change orders. Technology has changed a lot since then! In those days, the work was very much analog. Engineers back in the home office would drop revised drawings, printed to scale, in the mail. A day or two later, we would review them by physically comparing the previous revision with the latest version. Armed with an impressive array of colored highlighters, a handheld Casio calculator, a notepad and a pen, we would highlight the changes, use a scale rule to take off quantities, and then apply a schedule of rates (typically a dot-matrix printout several hundred pages long) to develop a final price for agreement with the main contractor.

Back in those days, pricing did allow for some uncertainty, but only in the simplest sense: pre-agreed, fixed percentages for unknowns, as referenced in the previous blog post Avoiding The Risk Of Using The Wrong Risk Tool. There was no discussion of, or allowance for, the probability, likelihood, or chance of success. All pricing and plans were communicated as single-point values and, invariably, treated as certain.

The risk of overstating certainty

Price estimates such as those are, in effect, a simple model of a complex system. Technically, “a [mathematical] model may help to explain a system and to study the effects of different components, and to make predictions”. On a broader, project-wide scale, the convention is to apply similar estimating processes and develop a bill of quantities (or bill of materials) so that the client has a good sense of the anticipated final cost before agreeing to proceed with the work. Traditionally, estimating this way generates a single-point value with no indication of the chance of success. Consequently, single-point estimates inadvertently convey a level of certainty, or confidence, that is not warranted. “Statistical evidence shows that … complexity and unplanned events are often unaccounted for, leaving budget and time contingencies inadequate”.
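
Below is a minimal sketch, using purely hypothetical line-item figures and assuming Python with numpy, of why a single-point total conveys false certainty: summing the most-likely values says nothing about the chance of actually delivering at or below that number.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical line items: (low, most likely, high) costs in $k
line_items = [(90, 100, 130), (45, 50, 70), (180, 200, 260)]

# Traditional single-point estimate: simply sum the most-likely values
point_estimate = sum(likely for _, likely, _ in line_items)

# Simple Monte Carlo: sample each item from a triangular distribution and sum
n = 100_000
totals = sum(rng.triangular(low, likely, high, n) for low, likely, high in line_items)

print(f"Single-point estimate: {point_estimate} k")
print(f"Chance of finishing at or under it: {np.mean(totals <= point_estimate):.0%}")
print(f"Cost carrying 80% confidence (P80): {np.percentile(totals, 80):.0f} k")
```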

Safran - Bonini’s Paradox Mini Series Graphic

Probability theory and the Monte Carlo method

For most of us, it is simply human nature to be fatalistic in our outlook: it is what it is … but it doesn’t have to be. Louis Pasteur is credited with the maxim that chance favors the prepared mind. To enable meaningful foresight and robust reasoning, the most powerful tool available is the scientific method and, more specifically, probability theory. Arguably, probability theory, and our ability to discern scientifically how likely one outcome is relative to another, has enabled some of the greatest advances in science. Gerolamo Cardano (b. 1501, d. 1576) was “one of the key figures in the foundation of probability theory and among the most influential mathematicians of the Renaissance”. Cardano is credited with an early statement of the law of large numbers: the more a game is played, the more closely mathematical probability predicts the overall outcome.
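
As a small illustration of Cardano’s insight, the sketch below (plain Python, illustrative only) rolls a die ever more times and watches the observed frequency of a six converge on the theoretical one-in-six.

```python
import random

random.seed(1)
target = 6              # betting on rolling a six
true_probability = 1 / 6

# The more rolls, the closer the observed frequency gets to the mathematical probability
for trials in (10, 100, 10_000, 1_000_000):
    hits = sum(1 for _ in range(trials) if random.randint(1, 6) == target)
    print(f"{trials:>9} rolls: observed {hits / trials:.4f} vs theoretical {true_probability:.4f}")
```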

In the late 1940s, while working on nuclear weapons projects at Los Alamos, Stanislaw Ulam conceived the modern Monte Carlo method. While convalescing and playing solitaire, he challenged himself to determine the chance of success: “After spending a lot of time trying to estimate them by pure combinatorial calculations, I wondered whether a more practical method than ‘abstract thinking’ might not be to lay it out say one hundred times and simply observe and count the number of successful plays. This was already possible to envisage with the beginning of the new era of fast computers”.
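
Ulam’s trick is easy to reproduce today. Canfield solitaire is too involved for a short example, so the hedged sketch below (plain Python) asks a simpler question of the same kind: how often does at least one ace land in the top five cards of a shuffled deck? The exact answer, 1 − C(48,5)/C(52,5) ≈ 0.341, lets us check that simply “laying it out” many times gets close.

```python
import random

random.seed(7)
deck = ["A"] * 4 + ["x"] * 48   # four aces, forty-eight other cards

plays = 100_000
# Each play deals five cards (sampling without replacement) and checks for an ace
successes = sum(1 for _ in range(plays) if "A" in random.sample(deck, 5))

print(f"Estimated probability: {successes / plays:.3f}  (exact answer is approximately 0.341)")
```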

Over the last twenty years, personal desktop computing and spreadsheet software have made dynamic modelling the norm for project professionals: the cost or time impact of a change can be calculated in near real time. At a basic level, spreadsheet software can even be considered a form of AI (artificial intelligence) or, more technically, an artificial agent mimicking cognitive function. In recent years, Sam Savage also helped highlight the fallacy of point estimates with his humorously succinct aphorism, The Flaw of Averages. This evolving data literacy, combined with the widespread availability of spreadsheet and Monte Carlo simulation (MCS) software, has made probabilistic modelling in the project management profession far more commonplace.
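
Savage’s point is easy to demonstrate. The sketch below (hypothetical durations, numpy assumed) uses one classic illustration: two parallel activities each averaging ten days do not yield a ten-day milestone, because the milestone waits for whichever activity finishes last.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Two parallel activities, each uncertain but averaging 10 days (triangular 7 / 9.5 / 13.5)
a = rng.triangular(7, 9.5, 13.5, n)
b = rng.triangular(7, 9.5, 13.5, n)

# The milestone is reached only when the slower of the two paths finishes
finish = np.maximum(a, b)

print(f"Plan built on averages:   {max(a.mean(), b.mean()):.1f} days")
print(f"Average simulated finish: {finish.mean():.1f} days")
print(f"Chance of making day 10:  {np.mean(finish <= 10):.0%}")
```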

Employing both data and theory when using stochastic simulation tools 

The democratization of data analytics in project delivery will only accelerate in the coming years. This, in itself, is something for the project management profession to celebrate; however, great harm is possible if a new generation of tools is not employed with care and responsibility. Today, it is recognized that although stochastic simulation tools are widely available, they are “often misused”. To exercise that care, responsible project professionals need to be aware of, and account for, all the factors influencing cost and schedule prediction accuracy. Risk data alone is not enough: meaningful analysis requires a good balance of both data and theory.
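
As one hedged illustration of balancing the two, the sketch below (entirely made-up figures) summarizes a small set of historical cost-growth outcomes, the data, with a lognormal distribution, the theory, and then samples that distribution in a simple Monte Carlo rather than relying on the raw history alone.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical outturn/budget ratios from ten completed projects (the data)
history = np.array([1.04, 1.12, 0.98, 1.25, 1.07, 1.41, 1.02, 1.18, 0.95, 1.10])

# The theory: assume cost growth is lognormally distributed and fit its
# parameters to the historical observations
mu, sigma = np.log(history).mean(), np.log(history).std(ddof=1)

# Simulate cost growth for a new, hypothetical $50m project
budget = 50.0
growth = rng.lognormal(mu, sigma, 100_000)

print(f"P50 forecast: ${np.percentile(budget * growth, 50):.1f}m")
print(f"P80 forecast: ${np.percentile(budget * growth, 80):.1f}m")
```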

Oftentimes, stochastic analyses are misused because each one is treated as if it were being performed for the first time, with nothing learned from the last. To quote decision-science thought leader Paul Schoemaker (as cited in the introduction to this series, Improving Project Management Performance Capabilities): “Experience is inevitable; learning is not. Overconfidence persists in spite of experience, because we often fail to learn from experience. In order to learn, we need feedback about the accuracy of our opinions and doubts. We also need the motivation to translate this information into better metaknowledge.” In their book Superforecasting: The Art and Science of Prediction, Philip Tetlock and Dan Gardner write, “The strongest predictor of rising into the ranks of superforecasters is perpetual beta, the degree to which one is committed to belief updating and self-improvement”.

Such metaknowledge and self-improvement (see the Stakeholder and Change Management blog post) can only be acquired by teams that have the processes and Risk Culture in place to look back at completed projects and objectively assess the accuracy of their subjective assessments. A Growth Mindset and structured data capture, curation and management are key enablers of perpetual beta. Ultimately, for any organization that relies on the repeated delivery of capital assets, moving away from a bespoke strategy in favor of a repeatable program strategy will ensure projects are delivered faster, more cheaply and with less risk.
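
One simple form of that feedback is a look-back at forecast calibration. The sketch below (plain Python, hypothetical records) checks whether P80 cost forecasts really were exceeded only about 20 percent of the time; a much lower hit rate is evidence of the overconfidence Schoemaker describes.

```python
# Each pair is (forecast P80, actual outturn) in $m; illustrative figures only
completed = [
    (52, 49), (120, 131), (8.5, 8.1), (300, 342), (75, 74),
    (41, 45), (210, 205), (18, 21), (95, 90), (66, 71),
]

# If the P80s were well calibrated, roughly 80% of projects should land at or below them
hits = sum(1 for p80, actual in completed if actual <= p80)
print(f"P80 hit rate: {hits}/{len(completed)} = {hits / len(completed):.0%} (target is roughly 80%)")
```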

The future of complex data modeling

An anticipated evolutionary path for modelling within the project controls profession is for advanced analytics and AI to enable Bayesian inference or, more simply, frequent re-analysis that incorporates emerging data to produce increasingly accurate project performance predictions. Fortuitously, the technology needed to overcome known project forecasting problems is available to us today. The question is: at what point do project models become so complex that they cease to provide value? French philosopher-poet Paul Valéry once said, “If it’s simple, it’s always false. If it’s not, it’s unusable.” This is also known as Bonini’s Paradox: “As a model of a complex system becomes more complete, it becomes less understandable. Alternatively, as a model grows more realistic, it also becomes just as difficult to understand as the real-world processes it represents.”
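
As a minimal sketch of the Bayesian idea, assuming scipy is available and using made-up milestone counts, the example below starts from a prior belief about the on-time completion rate and updates it as project data emerges, rather than fixing the assumption at sanction and never revisiting it.

```python
from scipy import stats

# Prior belief: roughly a 70% on-time rate, held with modest confidence (Beta(7, 3))
prior_alpha, prior_beta = 7, 3

# Emerging data: of the first 12 milestones, only 6 were hit on time (hypothetical)
on_time, late = 6, 6

# Bayesian update: the beta prior plus binomial evidence gives a beta posterior
posterior = stats.beta(prior_alpha + on_time, prior_beta + late)
low, high = posterior.interval(0.90)

print(f"Prior mean on-time rate:  {prior_alpha / (prior_alpha + prior_beta):.0%}")
print(f"Posterior (updated) mean: {posterior.mean():.0%}")
print(f"90% credible interval:    {low:.0%} to {high:.0%}")
```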

George Box, the world-renowned statistician, coined the aphorism “all models are wrong, but some are useful”. Essentially, we should be aware of the limitations of our models in order to use them effectively. Machine learning today is notoriously difficult to explain. One aspirational form of next-generation AI is XAI, or explainable artificial intelligence. When experimenting with AI, a topic I touch on in a previous blog post, The Optimal Balance of People, Process and Technology, a conventional MCS solution such as Safran Risk can serve as a glass-box (rather than black-box) model that remains explainable to the project team. Any hybrid project modelling or simulation approach that experiments with unexplainable AI will benefit from using a tool like Safran Risk as a primary model, or experimental control, helping a human-in-the-loop interpret and trust novel solutions that are unfamiliar to project stakeholders.

Does your organization have the Risk Culture and Data Literacy skills to ensure that it is future-ready? Are your people equipped with best-in-class software tools? Safran Risk’s simple and intuitive layout makes it the tool of choice for risk practitioners who work closely with project teams. Contact us if you need a trusted advisor to help your organization easily quantify and manage the uncertainty that matters.

Safran - Bonini’s Paradox Consultation Graphic