
Methodological bases
Evaluation process (how?)
Methodological design

Design table per question

 



What does this mean?

The design table explains how an evaluation question will be answered, including the chain of reasoning which connects data, findings and conclusions.
An example of a design table is provided on this site.
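
For readers who find it helpful, the short sketch below shows how the main elements of a design table could be captured as a simple data structure (here in Python). It is an illustration only, not part of the methodology: the field names and example values are assumptions that reuse the placeholders used on this page (question, judgement criteria, indicators, sub-questions, analysis strategy, sources, "effect X").

from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignTable:
    question: str                      # the evaluation question
    judgement_criteria: List[str]      # criteria against which the answer will be judged
    indicators: List[str]              # indicators selected for the criteria
    sub_questions: List[str]           # chain of reasoning, as a series of sub-questions
    analysis_strategy: str             # change / meta / attribution / contribution analysis
    information_sources: List[str] = field(default_factory=list)

table = DesignTable(
    question="To what extent has EC support contributed to achieving effect X?",
    judgement_criteria=["Effect X achieved for targeted groups / areas"],
    indicators=["Quantitative indicator X", "Qualitative indicator Y"],
    sub_questions=["What is the current value of indicator X at country level?"],
    analysis_strategy="contribution analysis",
    information_sources=["international databases", "monitoring databases"],
)
print(table.question)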


When is it constructed?

A design table is developed for each question and progressively refined in successive versions:

  • Preliminary version appended to the inception report.
  • Draft version(s) prepared during the first phase of the evaluation as the methodological design is progressively optimised.
  • Final version attached to the first-phase (desk) report.


Opening the table

The first lines of the table summarise the steps which have already been taken (see from question to indicator), i.e. the question itself, the judgement criterion (or criteria), the indicators and, where applicable, the targets.



Sub-questions

The next line of the table describes the chain of reasoning through which the evaluation team plans to answer the question. The reasoning is structured through a series of sub-questions.

Sub-questions pertaining to indicators

These sub-questions may pertain to:
(1) the current level / status of the indicators, possibly with a breakdown by country, area, social group, etc., for instance:

  • What is the current value of quantitative indicator X at country level, and for targeted areas/groups?

(2) changes in the indicators, for instance:

  • Do stakeholders perceive a positive change in qualitative indicator Y over the evaluated time period?

As seen in the examples above, the sub-questions may be quantitative or qualitative.
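
As a purely illustrative aid, the sketch below (Python) shows how answers to sub-questions of types (1) and (2) could be tabulated for a quantitative indicator, at country level and for the targeted group. All figures are invented for the example.

# Sub-question (1): current value of quantitative indicator X; sub-question (2):
# change over the evaluated period. Figures are invented.
baseline = {"country level": 62.0, "targeted group": 41.0}
current = {"country level": 71.0, "targeted group": 55.0}

for level in baseline:
    change = current[level] - baseline[level]
    print(f"{level}: current value {current[level]:.1f}, change over the period {change:+.1f}")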


Sub-questions pertaining to analysis

These sub-questions are written with a view to:
(3) confirming assumptions about the success of the intervention and substantiating a positive answer to the question, for instance:

  • Has the design of EC support included a commitment to monitor performance indicators related to effect X?
  • Was such monitoring actually undertaken?
  • Was the monitoring data subject to periodic discussion among development partners?
  • Have partners taken action as a follow-up to such discussions?
  • Were such actions designed with a view to achieving effect X?
  • Etc.

(4) challenging assumptions about the success of the intervention and substantiating a negative answer to the question, for instance:

  • Have other development partners pushed for monitoring performance indicators related to effect X?
  • Have non-state actors contributed to putting the issue of achieving effect X onto the political agenda?
  • How far did other partners contribute towards shaping the actions taken in favour of disadvantaged groups?
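
Taken together, sub-questions of types (3) and (4) organise the evidence for and against the cause-and-effect assumption. The sketch below (Python) shows one possible way of recording such answers side by side; it is illustrative only, and the statements and yes / no values are invented paraphrases of the examples above.

# Answers to confirming (3) and challenging (4) sub-questions about the assumption
# "EC support contributed to effect X". True / False values are invented.
confirming = {
    "performance indicators for effect X were monitored": True,
    "monitoring data were periodically discussed among partners": True,
    "follow-up actions were designed to achieve effect X": False,
}
challenging = {
    "other partners pushed for monitoring the same indicators": True,
    "non-state actors put effect X onto the political agenda": False,
}

print(f"confirming evidence: {sum(confirming.values())} of {len(confirming)}")
print(f"challenging evidence: {sum(challenging.values())} of {len(challenging)}")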


Sub-questions pertaining to judgement

These sub-questions are meant to assist in the formulation of conclusions involving explicit value judgements. They are written with a view to:
(5) applying and possibly refining the judgement criteria in the specific context of the intervention, for instance:

  • Do stakeholders spontaneously focus on the same judgement criteria as those selected for the evaluation? If not, why not?

(6) applying or developing the targets in the specific context of the intervention, for instance:

  • Which are the areas / groups in the country with the best performance as regards the selected judgement criterion? Among them, which ones can legitimately be compared with targeted areas / groups?
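
The sketch below (Python) illustrates sub-question (6): the best-performing areas that can legitimately be compared with the targeted areas provide a benchmark against which the targeted areas are judged. It is an illustration only; all names and figures are invented.

# Performance on the selected judgement criterion, by area (invented figures).
performance = {"area A": 78, "area B": 74, "area C": 51, "targeted area": 49}
# Areas that can legitimately be compared with the targeted area (assumption).
comparable = {"area B", "area C"}

benchmark = max(performance[a] for a in comparable)
gap = performance["targeted area"] - benchmark
print(f"benchmark (best comparable area): {benchmark}")
print(f"targeted area relative to benchmark: {gap:+d}")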


Analysis strategy

Four strategies can be considered:

  • Change analysis, which compares measured / qualified indicators over time, and against targets
  • Meta-analysis, which extrapolates upon findings of other evaluations and studies, after having carefully checked their validity and transferability
  • Attribution analysis, which compares the observed changes with a "without intervention" scenario, also called the counterfactual
  • Contribution analysis, which confirms or disconfirms cause-and-effect assumptions on the basis of a chain of reasoning.

The last three strategies are appropriate for questions which require a cause-and-effect analysis; the first is appropriate in other instances.
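
For illustration only, the sketch below (Python) restates the four strategies and the selection rule above as a small helper. It is a reading aid, not an official tool.

from enum import Enum

class AnalysisStrategy(Enum):
    CHANGE = "change analysis"
    META = "meta-analysis"
    ATTRIBUTION = "attribution analysis"
    CONTRIBUTION = "contribution analysis"

def candidate_strategies(requires_cause_and_effect_analysis: bool):
    """Return the strategies appropriate for a given question."""
    if requires_cause_and_effect_analysis:
        return [AnalysisStrategy.META, AnalysisStrategy.ATTRIBUTION, AnalysisStrategy.CONTRIBUTION]
    return [AnalysisStrategy.CHANGE]

print([s.value for s in candidate_strategies(True)])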


Investigation areas

The evaluation team may consider collecting and analysing data at the level of the intervention as a whole, or investigating some areas more specifically, for instance:

  • All sub-questions will be addressed through an investigation of a selection of countries, including country notes and a visit to each country.
  • In addition to using national statistics, the evaluation team will investigate a selection of districts respectively typical of (a) the targeted groups / areas, and (b) the best-performing groups / areas.
  • The evaluation team will carefully select ten projects for in-depth investigation in order to address some of the sub-questions (a simple selection sketch follows this list).
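
As announced in the last example above, the sketch below (Python) shows one simple way of listing the selected investigation units: districts typical of the targeted areas, a best-performing comparison district, and a fixed number of projects for in-depth study. It is illustrative only; all names and figures are invented.

districts = {
    "district 1": {"targeted": True, "performance": 45},
    "district 2": {"targeted": True, "performance": 52},
    "district 3": {"targeted": False, "performance": 81},
    "district 4": {"targeted": False, "performance": 63},
}
projects = [f"project {n}" for n in range(1, 26)]

typical_targeted = [d for d, info in districts.items() if info["targeted"]]
best_performing = max(
    (d for d, info in districts.items() if not info["targeted"]),
    key=lambda d: districts[d]["performance"],
)
in_depth_projects = projects[:10]   # placeholder for a careful, criteria-based selection

print("districts typical of targeted areas:", typical_targeted)
print("best-performing comparison district:", best_performing)
print("projects selected for in-depth investigation:", len(in_depth_projects))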


Information sources and tools

The design table identifies the sources of information to be used, such as:

  • Statistics including context indicators available through international databases
  • Management or monitoring databases
  • Reports, reviews, audits, evaluations, articles, etc.
  • Stakeholders' statements
  • Experts' statements

Each source is meant to provide information on several sub-questions (sometimes just one). As far as possible, each sub-question should be covered by several independent sources in order to allow for cross-checking. In most instances a source of information is associated with an evaluation tool: for example, a given category of stakeholders will be reached through interviews or focus groups, and the opinion of experts will be obtained through email interaction or through a panel.
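
This cross-checking requirement can be made visible in the design table itself, for instance as a simple coverage check. The sketch below (Python) is illustrative only; the sub-questions and sources reuse the examples given on this page.

# Sources planned for each sub-question (illustrative mapping only).
coverage = {
    "current value of indicator X": ["international databases", "monitoring databases"],
    "perceived change in indicator Y": ["stakeholders' statements"],
    "follow-up actions taken by partners": ["reports and reviews", "stakeholders' statements", "experts' statements"],
}

for sub_question, sources in coverage.items():
    note = "" if len(sources) >= 2 else "  <- single source: cross-checking not possible"
    print(f"{sub_question}: {len(sources)} source(s){note}")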
