Is the Accuracy of Modeling Related to the Consciousness of the Observer?

Boris Menin

Translation of the article "Construction of a model as an information channel between the physical phenomenon and observer," JASIST, 72(9)

One of the most interesting questions in science is how accurately (truthfully) researchers reproduce (simulate) the physical phenomena they observe. This article proposes a controversial and ambitious answer. The idea is to represent the model of the investigated object as a communication channel between the phenomenon and a conscious observer. Using the mathematical apparatus of information theory in combination with the concept of the complexity of the International System of Units (SI), one can calculate the amount of information contained in the model and its minimum achievable uncertainty. This approach is theoretically substantiated and mathematically proven, which makes it possible to apply the information method to calculate the accuracy of models in various fields of science and technology, including refrigeration equipment, climate, and the measurement of physical constants.

All existing methods of checking the plausibility of a model of a studied physical phenomenon or technical object (experts call them validation and verification), as well as the statistical methods proposed so far for processing experimental and computational results, are intended for analyzing the data of an already constructed model. In this setting, the generally accepted accuracy limit for any model is determined by the Heisenberg inequality, although it is insignificant for the macrocosm because of the very small value of the Planck constant. Unfortunately, the exact boundary between the macro- and microcosm has not yet been established, although several studies aim to establish one [Tebbenjohanns, F., Mattana, M. L., Rossi, M., et al. Quantum control of a nanoparticle optically levitated in cryogenic free space. Nature, 595, 378–382 (2021)]. In addition, researchers have left out some important aspects. In particular, from the point of view of information theory, the modeling process involves two interacting objects: the International System of Units (SI), which "supplies" the model with information about the phenomenon under study through dimensional and dimensionless variables, and a researcher who, using intuition, knowledge, and experience, selects certain variables from the SI, thereby determining the accuracy of the model.

The characteristics of the process, the amount and quality of available data, prior knowledge, and the type of model together determine an optimal model complexity (i.e., the number of variables considered). Each additional parameter makes the model more flexible on the one hand, but on the other hand makes it more difficult to estimate the optimal values of the parameters accurately [Nelles, O. (2001). Model Complexity Optimization. In: Identification of Nonlinear Systems. Springer, Berlin, Heidelberg]. Therefore, intuitively or simply on the basis of practical wisdom, it can be assumed that there is an optimal number of variables for achieving the highest model accuracy.
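The tradeoff described above is the familiar bias–variance dilemma, and it is easy to see numerically. The sketch below is a generic illustration in Python, not Menin's information method; the data, noise level, and degree range are all invented for the demonstration. It fits polynomials of increasing degree to noisy observations of a smooth "phenomenon" and scores each fit on an independent sample: too few parameters underfit, too many chase the noise, and the validation error is smallest at an intermediate complexity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "phenomenon": a smooth law observed with noise.
# Two independent noisy samples of the same underlying curve.
x = np.linspace(0.0, 1.0, 40)
y_true = np.sin(2 * np.pi * x)
y_train = y_true + rng.normal(0.0, 0.25, x.size)
y_valid = y_true + rng.normal(0.0, 0.25, x.size)

def validation_error(degree):
    """Fit a polynomial of the given degree to the training sample
    and measure its RMS error on the independent validation sample."""
    coeffs = np.polyfit(x, y_train, degree)
    pred = np.polyval(coeffs, x)
    return float(np.sqrt(np.mean((pred - y_valid) ** 2)))

# Validation error as a function of model complexity (degree).
errors = {d: validation_error(d) for d in range(1, 16)}
best = min(errors, key=errors.get)
print("best degree:", best)
```

A straight line (degree 1) cannot follow the sine at all, while very high degrees fit the training noise; the minimum of the validation error falls at a moderate degree, which is exactly the "optimal number of variables" intuition in the paragraph above.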

By combining this information-oriented, theoretically proven method with the current realization of the SI, it is possible to calculate the optimal number of variables and to formulate the accuracy limit of any physical law or formula describing the observed phenomenon. This has never been described in the literature.

However, until now the scientific community has completely ignored the existence of an initial and unavoidable uncertainty caused by the qualitative and quantitative choice of variables in the model (a systematic effect [Pavese, F. (2018). The new SI and the CODATA recommended values of the fundamental constants 2017 compared with 2014, with a comment to Possolo et al. Metrologia, 55, 1–11]).

The formalism of the informational approach rests on three axioms. First, the specific variables are chosen from an existing system of units (for example, the SI). Second, the individuality of each model is determined by the set of base quantities selected from the system of units used (in the SI, the base quantities are length, time, mass, thermodynamic temperature, electric current, luminous intensity, and amount of substance). The third (and perhaps the most controversial) axiom is that any variable in the model is selected on an equiprobable basis. In practice, different observers select differing groups of variables to describe (model) the same real-world phenomenon, which is explained by the particular philosophical outlook (knowledge, experience, intuition) of each research team. Human expectations are biased and differ from person to person, with measurable consequences for behavior and cognition [Lynn, C. W., Papadopoulos, L., Kahn, A. E., & Bassett, D. S. (2020). Human information processing in complex networks. Nature Physics, 16, 965–973]. The admissibility of this axiom can be illustrated as follows: suppose one observer describes an electron as a particle and another as a wave. Although the two observers use different mathematical descriptions of the same physical object, neither description can be considered false. Both are right, even if they differ in their conclusions about the electron's behavior.

Any system of units, including the SI, contains a finite number of variables (for the SI, this limit equals 38,265), each carrying a "portion of information about the observed object," so the available information is bounded from above. Considering that the number of variables in a model is always limited, it follows that the amounts of information contained in the SI and in the model are both finite.
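The figure of 38,265 can be reproduced by counting admissible combinations of integer exponents of the seven SI base quantities. The sketch below is a reconstruction under assumptions: the exponent ranges in the comments are taken from Menin's published derivation as I understand it (they are not stated in this article), and the assignment of ranges to particular base quantities is chosen so that the count matches the quoted total; treat the details as illustrative, not authoritative.

```python
from math import prod

# Assumed integer exponent ranges for the seven SI base quantities
# (a reconstruction of Menin's derivation, not stated in this article).
# Each entry is the number of admissible integer exponents, e.g. an
# exponent running from -3 to +3 gives 7 options:
range_sizes = [
    7,  # length, exponent in [-3, 3]
    5,  # mass, exponent in [-1, 3]
    9,  # time, exponent in [-4, 4]
    9,  # thermodynamic temperature, exponent in [-4, 4]
    3,  # electric current, exponent in [-1, 1]
    3,  # luminous intensity, exponent in [-1, 1]
    3,  # amount of substance, exponent in [-1, 1]
]

# Total number of dimension combinations, including the dimensionless one.
psi = prod(range_sizes)

# A quantity and its reciprocal describe the same relation, so combinations
# are counted in pairs; the dimensionless combination and the 7 base
# quantities themselves are then excluded:
mu_si = (psi - 1) // 2 - 7
print("psi =", psi, "mu_si =", mu_si)
```

With these ranges, `psi` comes out to 76,545 and `mu_si` to 38,265, matching the limit quoted above.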

It is surprising that such statements, which are not obvious at first glance, make it possible to calculate the smallest achievable uncertainty for any model of the phenomenon under study.

This study shows that the amount of information transmitted to and contained within a model, which in turn is constructed by the will of the observer, can play an unexpected and important role in assessing the attainable accuracy when representing the modeled phenomenon.

The information-based method is not affected by external interference arising from experimental and measurement errors. Its result is determined largely by the physical-philosophical approach the researcher uses to explain the phenomenon being studied.

This new knowledge of the modeling process provides scientists with the basis for experimental verification of the veracity of models in various fields of science and technology, including heat and mass transfer, refrigeration technology, and measurements of fundamental physical constants.

It can be seen that our experience and decisions in the macroscopic world are based on objective facts. At the same time, given the results of the information method, the most accurate scientific theories (relativity and quantum mechanics) may rest, at the most fundamental level, on subjective facts (the philosophical view of the researcher), which raises deep epistemological questions about the fundamental nature of reality. It is in tiny deviations from the generally accepted principles of modeling physical phenomena that the first hints of new physics may appear.

Cite this article in APA as: Menin, B. (2021, September 28). Is the accuracy of modeling related to the consciousness of the observer? Information Matters, 1(9).

Boris Menin

BORIS M. MENIN (Member, IEEE) received an MSc degree in 1973 from the Electro-Technical Communication Institute, Department of Multichannel Electrical Communications, and a PhD in mass and heat transfer from the Technological Institute of the Refrigeration Industry, St. Petersburg, Russia, in 1981. Dr. Menin was Director of the Laboratory of Ice Generators and Plate Freezers in St. Petersburg from 1977 to 1989, after which he emigrated from the Soviet Union to Israel. There he was Chief Scientist at Crytec Ltd. (1999–2008), where he managed the development, production, and marketing of pumpable ice generators and cold energy storage systems, while also modeling and manufacturing high-accuracy instrumentation for heat and mass processes. He is now an independent mechanical and refrigeration consulting expert. In addition, he has managed Task 3.1 of a European FP6 project in the field of the food cold chain, as well as several Israeli projects (EUREKA, an integrated EU project, and projects of the Chief Scientist Office of Israel's Ministry of Industry) in the field of cold energy storage systems based on pumpable ice technology. He is the author of five books and 67 journal articles, and is a member of ASHRAE (USA) and SEEEI (Israel).