Featured Translation

The Comparative Uncertainty Criterion: A New Approach to Assessing Model Accuracy

Boris Menin

The scientific community is currently grappling with a reproducibility crisis, marked by an alarming increase in the number of published articles that have been retracted due to errors or fraudulent practices. This crisis can be attributed to various factors, including the intense pressure to publish, insufficient training in research methods, and flaws in the peer review process.

Equally interesting and important amid the flood of falsified papers, however, are seemingly unimportant papers that simply point out curious observations or lay out a few important logical steps, and thereby contribute little to the dense undergrowth of science. Every researcher knows similar gems from their own field.

One of the main challenges in addressing the reproducibility crisis is the lack of a universally applicable criterion for assessing model-phenomenon mismatch. Currently, there is no single way to measure the accuracy of a model, and different disciplines use different methods. This makes it difficult to compare results across disciplines and to identify potential problems with a particular model.

In this article, I propose the comparative uncertainty criterion as a new approach to quantifying model uncertainty. The comparative uncertainty criterion is grounded in the informational approach and adheres to the International System of Units (SI). The criterion can be used to quantitatively assess the model’s proximity to the object of study and to detect subtle deviations from widely accepted principles.

The comparative uncertainty criterion is based on the following two principles:
1. The principle of information transmission: A model is an information channel that transforms information about the object of study into a set of predictions.
2. The principle of uncertainty: The uncertainty of a model is determined by the amount of information that is lost during this transformation.

The information quantity of a model is a measure of its uncertainty. A model whose information content is close to the optimum is considered more certain, while a model whose information content deviates strongly from the optimum is more uncertain.
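To make this concrete, here is a minimal sketch in Python of how a model's comparative uncertainty can be computed, assuming the definition used in the informational approach: ε = Δ/S, where Δ is the absolute uncertainty of the model's target quantity and S is the interval over which that quantity is observed. The function name and the numbers are illustrative, not taken from the article.

```python
def comparative_uncertainty(delta: float, interval: float) -> float:
    """Comparative uncertainty eps = Delta / S (assumed definition):
    Delta is the absolute (total) uncertainty of the model's target
    quantity; S is the interval over which that quantity varies."""
    if interval <= 0:
        raise ValueError("observation interval S must be positive")
    return delta / interval

# Illustrative numbers only: a target quantity observed over a range
# S = 50.0 (in some unit) with absolute uncertainty Delta = 0.6.
eps_mod = comparative_uncertainty(delta=0.6, interval=50.0)
print(f"eps_mod = {eps_mod:.4f}")  # 0.0120
```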

The comparative uncertainty criterion can be used to assess the accuracy of a model in a number of ways. For example, the criterion can be used to compare the accuracy of different models of the same phenomenon. The criterion can also be used to track the accuracy of a model over time.

The comparative uncertainty criterion is a valuable tool for advancing scientific rigor. The criterion can help to ensure that scientific investigations are more reproducible and reliable, and it can help to identify potential problems with existing models.

A comparative uncertainty analysis was performed on scientific and technical works related to the investigation of “underwater electric discharge,” comparing the obtained comparative uncertainty of the model (εmod) with the theoretically justified uncertainty (εopt). When the two uncertainties are close (εmod/εopt → 1), this confirms the reliability and utility of the model for describing the studied process. Conversely, a substantial disparity between them (εmod/εopt ≪ 1) signals a significant risk in applying that model. A total of 800 articles were reviewed in this analysis, selected on the basis of four simultaneous criteria to ensure a comprehensive evaluation. These criteria were applied to assess the legitimacy and practical feasibility of the ideas presented on “underwater electrical discharge” and to facilitate comparisons among models, so that the most suitable one for describing the process could be identified. With the exception of a single article (!), none of the examined studies thoroughly explains the calculation of the relative uncertainty of its experiments, which could be renormalized to a comparative uncertainty.
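As a hedged sketch of how such a screening could be automated, the snippet below flags studies whose reported comparative uncertainty falls far below the theoretical optimum (εmod/εopt ≪ 1). The records, the labels, and the 0.1 cutoff for “much less than one” are placeholders of mine; the article’s four simultaneous selection criteria are not reproduced here.

```python
# Screening sketch: a ratio eps_mod/eps_opt near 1 supports a model's
# reliability; a ratio far below 1 signals risk (an implausibly small
# reported uncertainty). Records and cutoff are illustrative only.
RATIO_CUTOFF = 0.1  # hypothetical threshold for "much less than one"

studies = [
    # (label, eps_mod, eps_opt) -- invented numbers, not from the review
    ("study A", 0.0090, 0.0100),
    ("study B", 0.0004, 0.0100),
]

for label, eps_mod, eps_opt in studies:
    ratio = eps_mod / eps_opt
    verdict = "plausible" if ratio > RATIO_CUTOFF else "risky"
    print(f"{label}: eps_mod/eps_opt = {ratio:.2f} -> {verdict}")
```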


While recognizing the progress achieved in several studies, the informational approach emphasizes the importance of using a number of variables in a model that closely matches the recommended optimum. Specifically, for underwater electrical discharges, only one (!) model stands out by employing a number of variables close to the optimal value.

However, there are challenges and considerations that need to be addressed when applying the comparative uncertainty criterion. One challenge is that the criterion is not widely used or known in the scientific community. Researchers need to be made aware of this criterion and understand how to calculate it effectively, including accounting for the number of variables used in a model.

Another challenge is that the comparative uncertainty criterion requires a good understanding of the object of study. The assumptions and structure of a model are formulated by the researcher based on their knowledge, experience, and intuition. The informational approach can only help the researcher, before any experiment, to choose the most suitable model of a physical phenomenon by verifying the comparative uncertainty inherent in the group of phenomena (GoP), which is defined based on the base quantities in the model and the number of variables chosen by the researcher.
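For readers who want to experiment with the GoP idea, here is a minimal sketch under explicit assumptions: it uses the form ε = (z′ − β′)/μ + (z − β)/(z′ − β′) reported in the informational-approach literature, where μ is the total number of dimensionless criteria in the system of units, z′ and β′ count the quantities and base quantities of the chosen GoP, and z and β those of the model. The formula, the function names, and the example value of μ are assumptions to verify against the original paper, not the article’s stated procedure.

```python
import math

def gop_uncertainty(mu: float, zp_bp: float, z_b: float) -> float:
    """Assumed form of the comparative uncertainty for a model within a
    group of phenomena (GoP): eps = (z'-b')/mu + (z-b)/(z'-b')."""
    return zp_bp / mu + z_b / zp_bp

def optimal(mu: float, z_b: float) -> tuple[float, float]:
    """Minimizing eps over (z'-b') gives (z'-b')* = sqrt(mu * (z-b))
    and eps_opt = 2 * sqrt((z-b) / mu)."""
    return math.sqrt(mu * z_b), 2.0 * math.sqrt(z_b / mu)

# Placeholder inputs: a value of mu for SI as reported in the
# informational-approach literature (verify against the source), and a
# model using (z-b) = 2 dimensionless criteria.
zp_bp_star, eps_opt = optimal(mu=38_265, z_b=2)
print(f"optimal (z'-b') ~ {zp_bp_star:.0f}, eps_opt ~ {eps_opt:.4f}")
```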

Despite these challenges, the comparative uncertainty criterion is a promising new approach to quantifying model uncertainty. It has the potential to significantly improve the quality of scientific research by making it easier to assess the accuracy of models and to identify potential problems with existing ones.

Applying the comparative uncertainty criterion to diverse experimental data poses challenges and requires careful consideration. Researchers must consider the system of units, the selection of variables, and the distortions introduced when transforming information about the object of study into a model. Taking the various sources of uncertainty and their limitations into account can enhance the accuracy and reliability of predictions.

In conclusion, the development and adoption of a universally applicable criterion, such as the comparative uncertainty criterion, is crucial for advancing scientific rigor. By evaluating the deviation between models and observed phenomena, researchers can enhance the reproducibility and reliability of scientific investigations. Rigorous assessment practices and objective evaluations based on this criterion will ensure the robustness and credibility of scientific research in the face of escalating costs and concerns about the accuracy of reported results.

It is important to acknowledge that the comparative uncertainty criterion is not a definitive solution to all challenges. It is not a “silver bullet.” However, as an important addition to scientists’ repertoire, it has the potential to significantly improve the quality of scientific research endeavors.

Translation of the article: Advancing Scientific Rigor: Towards a Universal Informational Criterion for Assessing Model-Phenomenon Mismatch, in Journal of Applied Mathematics and Physics, 11(7), 1817-1836, 2023. https://www.scirp.org/pdf/jamp_2023071113462472.pdf.

Cite this article in APA as: Menin, B. (2023, July 20). The comparative uncertainty criterion: A new approach to assessing model accuracy. Information Matters, 3(7). https://informationmatters.org/2023/07/the-comparative-uncertainty-criterion-a-new-approach-to-assessing-model-accuracy/

Author

  • Boris Menin

BORIS M. MENIN (Member, IEEE) received an MSc degree in 1973 from the Electro-Technical Communication Institute, Department of Multichannel Electrical Communications, and a PhD in Mass and Heat Transfer from the Technological Institute of Refrigeration Industry, St. Petersburg, Russia, in 1981. Dr. Menin was Director of the Laboratory of Ice Generators and Plate Freezers in St. Petersburg from 1977 to 1989, after which he emigrated from the Soviet Union to Israel. There he was Chief Scientist at Crytec Ltd. (1999–2008), where he managed the development, production, and marketing of pumpable ice generators and cold energy storage systems, while also modeling and manufacturing high-accuracy instrumentation for heat and mass processes. He is now an independent mechanical and refrigeration consulting expert. In addition, he has managed Task 3.1 of a European FP6 project in the field of the food cold chain, as well as several Israeli projects (EUREKA, an integrated project of the EU and the Chief Scientist Office of Israel’s Ministry of Industry) in the field of cold energy storage systems based on pumpable ice technology. He is the author of five books and 67 journal articles, and is a member of ASHRAE (USA) and SEEEI (Israel).
