# Glossary

Please cite the OceanUQ glossary as follows: OceanUQ online glossary. Retrieved from https://oceanuq.org/learn/glossary/ (https://doi.org/10.5065/sqvt-p303).

**Accuracy** – Closeness of a quantity value to a true or reference value. Not to be confused with Precision or Resolution. See also Error, Trueness, True Value.

**Aleatoric Uncertainty** – see Uncertainty.

**Calibration** (procedure) – process of mapping instrument indications or model output to sufficiently accurate quantity values. One approach, sometimes known as black-box calibration, employs a mapping relation obtained by previous comparisons with highly accurate quantity values. Another approach, sometimes known as white-box calibration, attempts to correct for component sources of systematic error, often using an Uncertainty Budget.

**Calibration** (probability forecasting) – statistical consistency between forecasted probabilities of events and frequencies of occurrence of those events. A probabilistic forecast system is calibrated if, when sufficiently many forecast attempts are considered, events forecasted to have N% probability occur (close enough to) N% of the time.
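Calibration in this sense can be checked with a reliability table: bin the issued forecast probabilities and compare each bin's mean forecast probability with the observed event frequency. A minimal sketch in Python, using synthetic data that is calibrated by construction (the function name and all numbers are illustrative):

```python
import numpy as np

def reliability_table(forecast_probs, outcomes, n_bins=10):
    """Group forecasts into probability bins and compare the mean forecast
    probability in each bin with the observed event frequency.
    For a calibrated system the two columns should roughly agree."""
    forecast_probs = np.asarray(forecast_probs, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    rows = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (forecast_probs >= lo) & (forecast_probs < hi)
        if mask.any():
            rows.append((forecast_probs[mask].mean(),
                         outcomes[mask].mean(),
                         int(mask.sum())))
    return rows

# Synthetic example: each event occurs with exactly its forecast
# probability, so the system is calibrated by construction.
rng = np.random.default_rng(0)
p = rng.uniform(size=100_000)
y = rng.uniform(size=100_000) < p
for mean_p, freq, n in reliability_table(p, y):
    print(f"forecast {mean_p:.2f}  observed {freq:.2f}  (n={n})")
```

With real forecast archives the two columns rarely agree this closely; systematic departures (e.g., observed frequencies consistently below forecast probabilities) indicate miscalibration.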

**Definitional Uncertainty** – uncertainty stemming from vagueness or ambiguity in the definition of the quantity to be measured or modeled. See also Uncertainty, Types.

**Epistemic Uncertainty** – uncertainty stemming from a lack of information or from disagreement among sources of information, which is in principle reducible. See also Uncertainty, Types.

**Error** – Difference between a quantity value and the true or reference value of the quantity. Also known as Total Error. See also Error, Types.

**Error, Types**:

__Systematic Error__ – component of total error that in replicate measurements remains constant or varies in a predictable manner [adapted from the JCGM VIM 2012]. Also known as Bias.

__Random Error__ – component of total error that in replicate measurements varies in an unpredictable manner [adapted from the JCGM VIM 2012]. Also known as Noise.

__Representation Error__ – error stemming from a mismatch between the scale at which a quantity value is estimated and the scale at which information is sought. Commonly present if satellite data or numerical modelling results, representing average conditions over large volumes, are used when information is sought about conditions at particular points. Can be systematic or random. Also known as Representativeness Error.

__Total Error__ – see Error.

**Estimate** – an approximate quantity value, which may be rough and/or affected by sources of error that are not corrected for.

**Estimation** – the process of arriving at an estimated value for a quantity. In statistics, the process of calculating a value for a population-level parameter from a sample drawn from the population.

**Initial Condition Uncertainty** – uncertainty about the initial values that should be assigned to variables in a dynamical model, in order to accurately represent a starting state of the target system and/or produce sufficiently accurate results. See also Uncertainty, Sources in Modeling.

**Measurand** – Quantity intended to be measured. [From the JCGM VIM 2012]

**Measurement Model** – Mathematical relation among all quantities thought to be involved in a measurement process. A general form of a measurement model is the equation h(Y, X1, …, Xn) = 0, where Y, the output quantity in the measurement model, is the measurand, the quantity value of which is to be inferred from information about input quantities in the measurement model X1, …, Xn. [From the JCGM VIM 2012]

**Measurement Result** – A set of possible quantity values being attributed to a measurand, together with any other available relevant information about the set, such as that some values may be more representative of the measurand than others. A measurement result is generally expressed as a single measured quantity value and a measurement uncertainty. [adapted from the JCGM VIM 2012]

**Model Inadequacy** – inability of a model to deliver results that track observations of the represented system to within their specified uncertainty, due to the model’s failing to represent the system’s processes or features with sufficient fidelity. Post-processing of model output is sometimes undertaken to try to mitigate or correct for model inadequacy, e.g., via bias correction, application of a statistical model of the remaining discrepancy, etc.

**Ontic Uncertainty** – uncertainty stemming from randomness or indeterminism in the world. This uncertainty is irreducible in principle. Also known as Aleatoric Uncertainty. See also Uncertainty, Types.

**Parametric Uncertainty** – uncertainty about which (if any) parameter value assignments in a model structure will accurately represent target features of the modeled system and/or produce a model that gives sufficiently accurate results. Also known as parameter uncertainty. See also Uncertainty, Sources in Modeling.

**Precision** – Closeness of agreement among quantity values obtained from repeated measurements. Often expressed numerically in terms of standard deviation or variance. Not to be confused with Resolution or Accuracy.
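For example, precision can be summarized as the sample standard deviation of replicate readings. A minimal sketch (the readings below are made-up values):

```python
import statistics

# Ten replicate temperature readings of the same quantity
# (hypothetical values, in degrees Celsius).
readings = [20.31, 20.28, 20.35, 20.30, 20.29,
            20.33, 20.27, 20.32, 20.30, 20.31]

mean_value = statistics.mean(readings)
precision = statistics.stdev(readings)   # sample standard deviation
print(f"mean = {mean_value:.3f} °C, precision (1 s.d.) = {precision:.3f} °C")
```

Note that a small standard deviation says nothing about accuracy: these readings could all share a large systematic offset from the true value.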

**Quantity** – a property of a phenomenon, body, or substance, where the property has a magnitude that can be expressed as a number and a unit [adapted from JCGM VIM 2012]. See also Quantity Value.

**Quantity Value** – a number and unit together representing the magnitude of a quantity, e.g., a length of 7.31 centimeters or a temperature of 20.3 degrees Celsius [adapted from JCGM VIM 2012]. See also Quantity.

**Random Error** – component of total error that in replicate measurements varies in an unpredictable manner [adapted from the JCGM VIM 2012]. Also known as Noise. See also Error, Types.

**Representation Error** – error stemming from a mismatch between the scale at which a quantity value is estimated and the scale at which information is sought. Commonly present if satellite data or numerical modelling results, representing average conditions over large volumes, are used when information is sought about conditions at particular points. Also known as Representativeness Error. See also Error, Types.

**Resolution** (of a measuring process) – the smallest value of a quantity that can be distinguished by the measuring process (e.g., measuring to 0.01 °C vs. only 0.1 °C).

**Structural Uncertainty** – uncertainty about the form that modeling equations should take, in order to accurately represent a target system and/or obtain a sufficiently accurate result. See also Uncertainty, Sources in Modeling.

**Systematic Error** – component of total error that in replicate measurements remains constant or varies in a predictable manner [adapted from the JCGM VIM 2012]. Also known as Bias. See also Error, Types.
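The split between the systematic and random components of error can be illustrated with simulated replicate measurements, where the true value, bias, and noise level are all known by construction (every number below is illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
true_value = 15.0   # known "true" value of the simulated quantity
bias = 0.5          # systematic error: a constant offset
noise_sd = 0.2      # random error: spread of the measurement noise

# Simulate replicate measurements = truth + bias + noise.
measurements = true_value + bias + rng.normal(0.0, noise_sd, size=10_000)

# Averaging suppresses the random component but not the systematic one,
# so the mean discrepancy recovers the bias and the spread the noise.
estimated_bias = measurements.mean() - true_value
estimated_noise = measurements.std(ddof=1)
print(f"estimated bias  ≈ {estimated_bias:.3f}")
print(f"estimated noise ≈ {estimated_noise:.3f}")
```

In practice the true value is unknown, which is why estimating bias requires comparison against an independent, more trustworthy reference.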

**Total Error** – see Error.

**True Value** – quantity value consistent with the definition of a quantity. Or, the value obtained after an infinite series of measurements performed under the same conditions with an instrument not affected by systematic errors. [https://www.roma1.infn.it/~dagos/cern/node7.html; see also JCGM VIM 2012]

**Trueness** – Closeness of agreement between the arithmetic mean of a large number of estimates of a quantity and the true or accepted reference value. [adapted from ISO 5725-1:1994(en)]

**Uncertainty** – A state of limited knowledge about the value of a quantity or, more generally, about some matter of interest. Typical sources of uncertainty include imprecision and lack of resolution in data, disagreement among measurements or model projections, limited understanding of sources of model error, inherent randomness in processes, and ambiguous terminology. Uncertainty about the value of a quantity is often expressed by attributing a set of possible values to the quantity, rather than a single value, or by a parameter (such as a standard deviation) that characterizes the dispersion of the values that could reasonably be attributed to the measurand [see JCGM GUM 2008]. See also Uncertainty, Types.

**Uncertainty, Sources in Data Assimilation** – Uncertainty associated with the results of data assimilation can stem from uncertainty about: the quality of input data (observations), the accuracy of model-based forecasts used in the assimilation process, limitations of the assimilation algorithm, and the reliability of the observation operator used to map modeling results to point values.

**Uncertainty, Sources in Modeling**:

__Structural Uncertainty__ – Uncertainty about the form that modeling equations should take, in order to accurately represent a target system and/or obtain a sufficiently accurate result.

__Parametric Uncertainty__ – Uncertainty about which (if any) parameter value assignments in a model structure will accurately represent target features of the modeled system and/or produce a model that gives sufficiently accurate results.

__Initial Condition Uncertainty__ – Uncertainty about the initial values that should be assigned to variables in a dynamical model, in order to accurately represent a starting state of the target system and/or produce sufficiently accurate results.

__Scenario Uncertainty__ – Uncertainty about the boundary conditions or other situational conditions that will occur in a time period.

**Uncertainty, Types**:

__Epistemic Uncertainty__ – Uncertainty stemming from a lack of information or from disagreement among sources of information, which is in principle reducible.

__Ontic Uncertainty__ – Uncertainty stemming from randomness or indeterminism in the world. This uncertainty is irreducible in principle. Also known as Aleatoric Uncertainty.

__Definitional Uncertainty__ – Uncertainty stemming from vagueness or ambiguity in the definition of the quantity to be measured or modeled.

**Uncertainty Assessment** – process of investigating and characterizing uncertainty associated with a measurement or modeling result. Ideally, uncertainty assessment considers all sources of uncertainty that could impact the result in a significant way. See also Uncertainty Characterization, Uncertainty Quantification.

**Uncertainty Budget** – a quantitative accounting of the component sources of error and uncertainty in a measurement or modeling procedure, which includes estimates of the contributions from the component sources and combines those contributions in some way in order to arrive at a more accurate quantity value and an estimate of its associated uncertainty. A measurement model, when available, guides the construction of an uncertainty budget. The process of constructing and using an uncertainty budget to arrive at a measurement result is sometimes referred to as white-box calibration.
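As a sketch of the combination step: when the component sources are independent, their standard uncertainties are often combined in quadrature (root sum of squares), in the spirit of the JCGM GUM 2008. The component names and values below are hypothetical:

```python
import math

# Hypothetical component standard uncertainties for a temperature
# measurement (all in degrees Celsius), as rows of an uncertainty budget.
budget = {
    "sensor calibration": 0.05,
    "electronic noise":   0.02,
    "self-heating":       0.01,
}

# For independent components, the combined standard uncertainty is the
# root sum of squares of the component contributions.
combined = math.sqrt(sum(u**2 for u in budget.values()))
print(f"combined standard uncertainty ≈ {combined:.3f} °C")
```

Correlated components require cross-terms in the combination, and a real budget would also document how each component estimate was obtained.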

**Uncertainty Estimate** – a set of possible values for a measured or modeled quantity, indicating the limitations of current knowledge about the true value of the quantity. Ideally, the estimate takes account of all significant sources of uncertainty. When important sources of uncertainty have not been considered, this should be indicated along with the set of possible values that is reported.

**Uncertainty Quantification** – process of quantifying the extent to which there is limited knowledge of the true value of a quantity, usually in terms of a set of possible values for the measured or modeled quantity. See also Uncertainty, Uncertainty Assessment, Uncertainty Budget, Uncertainty Characterization, Uncertainty Propagation.

**Uncertainty Characterization** – process of describing the extent to which knowledge of a measured or modeled quantity is limited. Can be quantitative (e.g., reporting an interval of possible values) or qualitative (e.g., reporting that particular marks of quality were present/absent from the measurement process). See also Uncertainty, Uncertainty Assessment, Uncertainty Budget, Uncertainty Propagation.

**Uncertainty Propagation** – process of transforming uncertainty about the values of input quantities (to a measurement model, mathematical model, or simulation model) into uncertainty about the values of output quantities. Often done with the help of Monte Carlo or other sampling techniques. Uncertainty propagation does not take account of structural uncertainty, i.e., uncertainty about the *form* of the model through which input uncertainties are propagated.
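A minimal Monte Carlo sketch of uncertainty propagation, using a toy measurement model (speed = distance / time) with illustrative input distributions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy measurement model: speed = distance / time.
# The input distributions are illustrative, not from a real instrument.
n = 100_000
distance = rng.normal(100.0, 1.0, size=n)   # metres, standard uncertainty 1.0 m
time = rng.normal(9.8, 0.1, size=n)         # seconds, standard uncertainty 0.1 s

# Push each joint sample of the inputs through the model; the spread of
# the resulting outputs characterizes the propagated uncertainty.
speed = distance / time

print(f"speed ≈ {speed.mean():.2f} ± {speed.std(ddof=1):.2f} m/s")
```

The same samples can also yield percentiles or a full output histogram, which is useful when the propagated distribution is skewed and a single standard deviation would be misleading.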

**Validation** – process of comparing instrument or model output to trusted observational data, in order to learn about the performance characteristics of the instrument or model, such as the accuracy and trueness of its results, the circumstances in which it tends to perform better or worse, etc. Good fit between modeling results and observational data does not guarantee that the model’s equations accurately represent the processes at work in the real system; model performance might vary for different boundary conditions.
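Validation commonly summarizes the model-observation comparison with a few statistics, such as bias, root-mean-square error, and correlation. A minimal sketch (the data values are hypothetical):

```python
import numpy as np

def validation_stats(model, obs):
    """Basic validation statistics comparing model output to trusted
    observations: mean error (bias), RMSE, and linear correlation."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    diff = model - obs
    return {
        "bias": float(diff.mean()),
        "rmse": float(np.sqrt((diff**2).mean())),
        "corr": float(np.corrcoef(model, obs)[0, 1]),
    }

# Hypothetical collocated sea-surface temperature values (°C).
obs = [18.2, 18.5, 19.1, 19.4, 19.0, 18.7]
model = [18.4, 18.6, 19.3, 19.8, 19.2, 18.9]
print(validation_stats(model, obs))
```

Aggregate statistics like these can hide regime-dependent behavior, so validation ideally also stratifies the comparison by conditions (season, region, forcing) to reveal where the model performs better or worse.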