The JRC delved into the potential flaws of rankings and ratings, and its new method can help developers revise their composite indicators. © Sergio Roberto Bichara (Stock.xchng), 2008
JRC method to assess statistical coherence of multi-dimensional indices
Are ratings and rankings beyond debate? Does the weight given to a variable actually reflect its real importance? In a recently published article in the Journal of the Royal Statistical Society A, the JRC describes its statistical method for assessing the quality of multi-dimensional indices (composite indicators). The analysis shows that several of these indices nominally attribute equal importance to their underlying variables, while in practice only a few variables drive the results and the others are merely cosmetic. The JRC method can help developers revise their composite indicators in order to obtain statistical coherence.
Composite indicators are popular tools in public discourse; for the general public and the media, they are perhaps the best-known face of statistics when it comes to assessing countries’ performance on human development, perceived corruption, innovation, competitiveness or other complex phenomena. Multi-dimensional indices combine a set of variables using a mathematical formula, which is often a weighted arithmetic average. The weights are meant to reflect the variables’ importance in the index. The JRC delved into the potential flaws of rankings and ratings built as weighted arithmetic averages.
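To make the construction concrete, the following is a minimal sketch of a composite indicator built as a weighted arithmetic average. All data, country counts and weights here are illustrative, not taken from any real index:

```python
# Toy composite indicator: a weighted arithmetic average of normalised
# variables. Data and weights are illustrative only.
import numpy as np

def composite_score(X, w):
    """X: (units, variables) matrix of normalised scores in [0, 1];
    w: nominal weights, one per variable (rescaled to sum to 1)."""
    w = np.asarray(w, dtype=float)
    w = w / w.sum()
    return np.asarray(X, dtype=float) @ w

# Three hypothetical countries scored on three variables,
# with equal nominal weights.
X = [[0.9, 0.4, 0.7],
     [0.5, 0.8, 0.6],
     [0.2, 0.9, 0.9]]
scores = composite_score(X, [1, 1, 1])
```

Note that equal nominal weights, as used here, do not guarantee that each variable contributes equally to the resulting ranking; that is precisely the gap the JRC analysis examines.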
The JRC method measures the importance (or “main effect”) of a given variable within existing composite indicators through a statistical measure known as the Pearson correlation ratio. The article demonstrates to what extent knowledge of the main effects can be used to enhance the statistical coherence of the indices and render them more transparent and defensible. The analysis was applied to six well-known composite indicators, including the United Nations’ Human Development Index and two popular league tables of university performance, the Academic Ranking of World Universities (ARWU) by Shanghai Jiao Tong University and the Times Higher Education Supplement (TIMES) university ranking.
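The Pearson correlation ratio of a variable with respect to the index is the variance of the conditional mean of the index given that variable, divided by the total variance of the index. A common way to estimate it is by binning, as in the sketch below; the simulated data, variable names and bin count are illustrative assumptions, not the article's actual estimator or data:

```python
# Sketch: estimating the Pearson correlation ratio (eta squared),
# Var(E[index | variable]) / Var(index), via quantile binning.
# Simulated data for illustration only.
import numpy as np

def correlation_ratio(x, y, n_bins=10):
    """Main effect of x on y: variance of the conditional mean of y
    across quantile bins of x, divided by the total variance of y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    bins = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    labels = np.unique(bins)
    cond_means = np.array([y[bins == b].mean() for b in labels])
    counts = np.array([(bins == b).sum() for b in labels])
    between = np.average((cond_means - y.mean()) ** 2, weights=counts)
    return between / y.var()

rng = np.random.default_rng(0)
x1 = rng.normal(size=2000)              # high-variance variable
x2 = rng.normal(scale=0.1, size=2000)   # low-variance, near-cosmetic variable
index = 0.5 * x1 + 0.5 * x2             # equal nominal weights
eta2_x1 = correlation_ratio(x1, index)
eta2_x2 = correlation_ratio(x2, index)
```

Despite the equal nominal weights, the main effect of x1 dwarfs that of x2, illustrating how a variable can be present in the formula yet merely cosmetic in the results.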
The JRC methodology has already been applied internationally: by Yale and Columbia Universities in the Environmental Performance Index 2012, and by Cornell University, the leading business school INSEAD and the World Intellectual Property Organization in the development of the Global Innovation Index 2013. It has also been used in the development of the European Commission’s new innovation output indicator, prepared at the request of the European Council.
To develop its statistical method, the JRC distilled the experience gained over the last 10 years in helping more than 60 international organisations, including Transparency International, Harvard, M.I.T., INSEAD, the World Intellectual Property Organization, the United Nations, the World Economic Forum and the European Central Bank, to fine-tune their multi-dimensional measures.