Definitions/origins of the terms ontic/ontological and epistemic/epistemological from Susanne Bobzien:
ontic: is concerned with being (Greek ‘to on’, that which is)
ontological: is concerned with the theory of being (Greek ‘logos’, among other things: theory)
epistemic: is concerned with knowledge or with cognition (Greek ‘episteme’, knowledge)
epistemological: is concerned with the theory of knowledge or of cognition (Greek ‘logos’: see above)
I use the term Reality (“R” is capitalized) instead of “Being.” Our knowledge of Reality is limited and will always be limited. The term “epistemic uncertainty” in the title of this blogpost does NOT refer to this limitation. Rather, “epistemic uncertainty” refers mostly to the variability in the “representation of knowledge.”
Epistemic uncertainty (type 1): variability due to time, place and person
Reality is One and unified, but there is an infinite number of perspectives on it. The “knowledge of Reality” refers to the totality of these perspectives. Perspectives change according to time, place and person. This is the first kind of epistemic uncertainty.
Epistemic uncertainty (type 2): variability in the representation of knowledge
There are other kinds of variability in our “knowledge of Reality.” Consider the motion of an electron in space-time. The motion can be represented by the Dirac equation or by the Klein-Gordon equation. The Dirac equation takes the electron’s spin into account, whereas the Klein-Gordon equation ignores it. Both the Dirac and the Klein-Gordon equations are relativistic and both are quantum mechanical. There is another equation for the motion of an electron, the Schrödinger equation, which is quantum mechanical but not valid at relativistic speeds; it ignores electron spin too. Finally, classical electrodynamics (Maxwell’s equations together with the Lorentz force law) also describes the motion of an electron, but that description is neither quantum mechanical nor relativistic, and it ignores spin as well.
This is variability in the representation of knowledge. We know how an electron moves because we observe what it does in different physical settings. We have developed multiple equations to describe its motion. Our knowledge includes the observations (behavior) as well as the models (equations) that explain the behavior. But there is multiplicity in the models. I am not saying all models are equal in predictive and explanatory power; I am just saying there is multiplicity. This variability (multiplicity) in the representation of (experimental + theoretical) knowledge is the second type of epistemic uncertainty.
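The multiplicity of models can be made concrete with a toy example (my own illustration, not from the physics discussed above): two representations of the same quantity, an electron's kinetic energy, one Newtonian and one relativistic. They agree at low speeds and diverge near the speed of light, yet both belong to our "knowledge of Reality."

```python
import math

# Two models of the same quantity: the kinetic energy of an electron.
# A toy illustration of model multiplicity, not a substitute for the
# wave equations discussed in the text.
M_E = 9.109e-31   # electron rest mass, kg
C = 2.998e8       # speed of light, m/s

def ke_classical(v):
    """Newtonian kinetic energy: (1/2) m v^2."""
    return 0.5 * M_E * v**2

def ke_relativistic(v):
    """Relativistic kinetic energy: (gamma - 1) m c^2."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * M_E * C**2

# At low speed the two representations agree; near c they diverge.
for v in (1e6, 1e8, 2.9e8):
    kc, kr = ke_classical(v), ke_relativistic(v)
    print(f"v = {v:.1e} m/s: classical {kc:.3e} J, relativistic {kr:.3e} J")
```

Neither function is "wrong" within its declared limitations; the point is simply that more than one representation of the same behavior coexists.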
Epistemic uncertainty (type 3): lack of information regarding inputs
I am going to use the Dirac equation as an example for the third kind of epistemic uncertainty. The Dirac equation is the most accurate and most predictive equation for the motion of an electron. But there may be uncertainties regarding the inputs to this equation. We may not be able to quantify the input (the functional form of the external field the electron experiences) with precision. This will lead to uncertainties in the predictions of the Dirac equation.
Do not confuse this type of epistemic uncertainty with quantum mechanical uncertainty. I will discuss quantum mechanical uncertainty in a future post about Ontic uncertainty.
This third type of epistemic uncertainty is reduced with more information about the inputs. The more precise the input to our model, the more precise the output of the model will be.
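This reduction can be sketched numerically. The snippet below is a hypothetical stand-in (the model function and input distribution are invented for illustration, not the Dirac equation): it propagates an uncertain input through a model by Monte Carlo sampling and shows that a more precise input yields a more precise output.

```python
import random
import statistics

random.seed(0)

def model(field_strength):
    """Hypothetical model mapping an input field strength to a prediction."""
    return 2.0 * field_strength + 0.5 * field_strength ** 2

def propagate(mean_input, input_sd, n=100_000):
    """Sample the uncertain input; return mean and SD of the model output."""
    outputs = [model(random.gauss(mean_input, input_sd)) for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Shrinking the input uncertainty shrinks the output uncertainty.
for sd in (0.5, 0.1, 0.01):
    m, s = propagate(1.0, sd)
    print(f"input SD {sd}: output mean {m:.3f}, output SD {s:.3f}")
```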
Epistemic uncertainty (type 4): statistical uncertainty
Statistical uncertainty of a quantity is given by the standard deviation (SD) of the statistical distribution of that quantity. Strictly speaking, the familiar interpretation of the SD (for example, about 68% of values falling within one SD of the mean) holds for Gaussian distributions only, but the SD is still a useful summary of spread for non-Gaussian distributions as well.
Most of the time we are interested in the average (mean) of a distribution. The true mean is not knowable exactly, but we can take many samples and compute the mean and the SD of each sample. If you record the mean of each sample and compute the standard deviation of that set of means, you get the “standard error of the mean” (related to what is colloquially called the “margin of error”). The “standard error of the mean” is a measure of the variability of the mean from sample to sample; for a sample of size n it can be estimated as SD/√n. The epistemic uncertainty of the fourth kind is the “standard error of the mean.”
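The sample-to-sample variability of the mean can be demonstrated directly. In this sketch (the population parameters are arbitrary choices for illustration), many samples are drawn from a Gaussian population, the mean of each sample is recorded, and the spread of those means is compared with the SD/√n estimate.

```python
import random
import statistics

random.seed(42)
TRUE_MEAN, TRUE_SD, N = 10.0, 2.0, 100  # arbitrary illustrative population

# Draw many samples of size N and record each sample's mean.
sample_means = []
for _ in range(2000):
    sample = [random.gauss(TRUE_MEAN, TRUE_SD) for _ in range(N)]
    sample_means.append(statistics.mean(sample))

# The SD of the sample means is the standard error of the mean,
# and it agrees with the SD / sqrt(n) estimate.
observed_sem = statistics.stdev(sample_means)
predicted_sem = TRUE_SD / N ** 0.5  # 2.0 / 10 = 0.2
print(f"observed SEM {observed_sem:.3f}, predicted SEM {predicted_sem:.3f}")
```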
Loss of information
Loss of information regarding inputs will clearly result in higher epistemic uncertainty.
You may have heard the phrase “loss of information” in debates involving Quantum Mechanics and the physics of Black Holes. In those debates, “loss of information” is part of an ontological discussion, so that specific debate is beyond the scope of this blogpost; it is not really part of the epistemological discussion.
What happens to epistemic uncertainty when a model is ruled out
New findings from the Large Hadron Collider ruled out the majority of the particle models involving Supersymmetry. More convoluted versions of supersymmetric particle models may still survive, but that is not the question. The question is this: does the epistemic uncertainty increase or decrease as a result of the elimination of some models?
Logically speaking, if the multiplicity of models is reduced, the second kind of epistemic uncertainty is reduced as well. What about the impact on the “knowledge of Reality”?
If there are multiple verified models, each describing some aspect of Reality within its declared limitations, then each addition of such a verified model will increase epistemic uncertainty, but it will also increase the existing knowledge of Reality (example set: the Dirac, Klein-Gordon, Schrödinger, and Maxwell equations).
The addition of theorized yet unverified models will NOT increase the existing knowledge of Reality, but it will certainly increase the epistemic uncertainty (example set: particle models involving Supersymmetry).
Therefore, it seems to me that epistemic uncertainty will continually increase as we come up with more and more models of Reality.
Epistemic uncertainty is similar to entropy
Remember that entropy is based on the concept of variability. Entropy always increases in isolated systems; likewise, epistemic uncertainty increases as we discover more perspectives of Reality.
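The analogy can be given a rough quantitative form (my own illustration, under the simplifying assumption that all competing models are treated as equally plausible): the Shannon entropy of a uniform distribution over N models is log2(N), which grows as models are added and shrinks when models are ruled out.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Entropy over equally plausible models grows with the number of models.
for n_models in (1, 2, 4, 8):
    uniform = [1.0 / n_models] * n_models
    print(f"{n_models} models: entropy = {shannon_entropy(uniform):.1f} bits")
```

Ruling some models out (shrinking N) lowers this entropy, mirroring the reduction of the second kind of epistemic uncertainty discussed above.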