Sensitivity analysis, sensitivity auditing and impact assessment

Dealing with uncertainties and multiple scenarios in policy-relevant data is an important challenge for decision-making. As attention to the robustness of scientific advice increases, the JRC applies sensitivity analysis and sensitivity auditing to ascertain how model results used in impact assessments and elsewhere depend on the information fed into them, their structure and their underlying assumptions.

Impact assessment (IA) is a key tool for ensuring that Commission initiatives and EU legislation are prepared on the basis of transparent, comprehensive and balanced evidence. IA work is fundamental to the development of Commission proposals, and the College of Commissioners takes the IA report into account when taking its decisions. Impact assessments increasingly rely on scientific and statistical evidence to understand the impacts of policy options, so it is very important to understand and, where possible, quantify any uncertainties present in this information.

Sensitivity analysis

Mathematical computer models are used in a very wide range of disciplines, from engineering and physical sciences, to economics and social sciences, to help to understand the behaviour of complex systems such as engineering structures, the global climate or stock markets. The results of these models can be used to make important commercial or policy decisions, but as models become more sophisticated, more information is needed to specify them. Very often this information is uncertain, due to a lack of data or an incomplete understanding of the system. It is therefore very important to know how the results of the model are affected by these uncertainties.
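As a minimal illustration of how input uncertainty carries through to model results, the Monte Carlo approach draws input values from assumed distributions and records the spread of the outputs. The toy model, input ranges and sample size below are purely illustrative assumptions, not any model actually used by the JRC:

```python
import random
import statistics

def model(x1, x2):
    # Hypothetical toy model standing in for a complex simulator
    return 3.0 * x1 + x2 ** 2

random.seed(0)
N = 10_000

# Uncertain inputs: x1 and x2 assumed Uniform(0, 1) for illustration
samples = [model(random.random(), random.random()) for _ in range(N)]

mean = statistics.fmean(samples)
sd = statistics.stdev(samples)
print(f"output mean ~= {mean:.3f}, output std dev ~= {sd:.3f}")
```

The resulting distribution of outputs, summarised here by its mean and standard deviation, expresses how uncertain the model result is given the assumed uncertainty in the inputs.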

Uncertainty analysis is the practice of quantifying the uncertainty in a model's outputs due to the uncertainty in its inputs. Sensitivity analysis goes further and quantifies the contribution of each model input to the output uncertainty; it implicitly includes uncertainty analysis as part of its procedure.

The JRC has a strong pedigree in sensitivity analysis, having published numerous academic articles over the past 25 years and written several core textbooks on the subject. Examples of our applications include transport emission modelling, fish population dynamics, composite indicators, hydrocarbon exploration models, macroeconomic modelling and radioactive waste management.
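The apportioning of output uncertainty to individual inputs can be sketched with variance-based (Sobol) first-order indices, estimated here with a Saltelli-style pick-freeze scheme. The toy model and uniform input ranges are illustrative assumptions only; a real study would call an actual simulator and use its own input distributions:

```python
import random

def model(x1, x2):
    # Hypothetical toy model; a real application would call a simulator
    return 3.0 * x1 + x2 ** 2

random.seed(42)
N = 20_000

# Two independent input sample matrices A and B (inputs assumed Uniform(0, 1))
A = [(random.random(), random.random()) for _ in range(N)]
B = [(random.random(), random.random()) for _ in range(N)]

yA = [model(*row) for row in A]
yB = [model(*row) for row in B]

mean = sum(yA + yB) / (2 * N)
var = sum((y - mean) ** 2 for y in yA + yB) / (2 * N)

# Saltelli-style estimator of the first-order index S_i:
# AB_i takes column i from B and the remaining columns from A
S = []
for i in range(2):
    yABi = [model(*(b[j] if j == i else a[j] for j in range(2)))
            for a, b in zip(A, B)]
    Si = sum(yb * (yab - ya) for ya, yb, yab in zip(yA, yB, yABi)) / N / var
    S.append(Si)

print(f"S1 ~= {S[0]:.2f}, S2 ~= {S[1]:.2f}")
```

Each index estimates the fraction of output variance attributable to one input alone; for this additive toy model the two indices sum to roughly one, while interaction effects in more complex models would leave a gap.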

Sensitivity auditing

When preparing evidence for policy-making, it is extremely important to account for uncertainties in models and calculations. While sensitivity analysis is sufficient to explore the effects of quantifiable uncertainties, i.e. the effect of varying model parameters within known ranges, a model will typically also rest on a number of non-quantifiable assumptions which should be accounted for as well. A concept known as sensitivity auditing provides a set of general guidelines which should be followed to ensure the quality of scientific evidence and the transparency of the modelling and analysis used for IA.

Sensitivity auditing embodies a number of concepts which aim to ensure quality, objective evidence in the possible presence of high stakes and irreducible uncertainty. At its heart is the idea of critically examining all assumptions used in the model and in the analysis of the data, both quantifiable and non-quantifiable, and ensuring that the uncertainty in the conclusions drawn from the model is presented clearly and objectively so that it can be accounted for in policy decisions. Furthermore, it recommends checks against artificial deflation or inflation of model uncertainty, which may be used to encourage or discourage certain policy options, or to lend credibility to an otherwise unusable model. Overall, the aim is complete transparency in the production of evidence: for example, making models and calculations available for examination by third parties, and acknowledging frankly when modelling and quantification of costs and benefits should not be used because of a lack of data or very high uncertainty. The JRC collaborates with the Secretariat-General (SEC GEN), and although this work is not visible in policy documents (all input to the SEC GEN is strictly confidential), it is crucial in evaluating the correct use of the models and inferences on which policy proposals are based.