Quantifying the uncertainty associated with our models is the only way we can express how much we know about any phenomenon. Incomplete consideration of model-based uncertainties can lead to overstated conclusions with real-world impacts in diverse spheres, including conservation, epidemiology, climate science, and policy. Despite these potentially damaging consequences, we still know little about how different fields quantify and report uncertainty. We introduce the “sources of uncertainty” framework, using it to conduct a systematic audit of model-related uncertainty quantification across seven scientific fields, spanning the biological, physical, and political sciences. Our interdisciplinary audit shows that no field fully considers all possible sources of uncertainty, but each has its own best practices alongside shared outstanding challenges. We make ten easy-to-implement recommendations to improve the consistency, completeness, and clarity of reporting on model-related uncertainty. These recommendations serve as a guide to best practices across scientific fields and expand our toolbox for high-quality research.
Emily Grace Simmonds; Kwaku Peprah Adjei; Christoffer Wold Andersen; Janne Cathrin Aspheim; Claudia Battistin; Nicola Bulso; Hannah M. Christensen; Benjamin Cretois; Ryan Cubero; Iván A. Davidovich; Lisa Dickel; Benjamin Dunn; Etienne Dunn-Sigouin; Karin Dyrstad; Sigurd Einum; Donata Giglio; Haakon Gjerløw; Amélie Godefroidt; Ricardo González-Gil; Soledad Gonzalo Cogno; Fabian Große; Mari F. Jensen; John James Kennedy; Peter Egge Langsæther; Jack H. Laverick; Debora Ledergerber; Camille Li; Elizabeth G. Mandeville; Caitlin Mandeville; Espen Moe; Tobias Navarro Schröder; David Nunan; Jorge Sicacha-Parada; Melanie Rae Simpson; Emma Sofie Skarstein; Clemens Spensberger; Richard Stevens; Aneesh C. Subramanian; Lea Svendsen; Ole Magnus Theisen; Connor Watret & Robert B. O'Hara (2022) Insights into the quantification and reporting of model-related uncertainty across different disciplines. iScience 25(12).