Last updated: 13/01/2006

Methodological bases
Evaluation process (How?)
Evaluation questions


Preparing an evaluation question


Why is it important?

The questions serve to concentrate the work on a limited number of points in order to ensure that the conclusions are useful and of a high quality. They therefore have to be carefully prepared and worded with precision.

Ensure that the answer to the question will be useful

As far as possible, each evaluation question is proposed together with comments on the following points:

  • Which users will be interested in the answer to the question?
  • How will they use it?
  • Considering the time needed to finalise the evaluation, will the answer to the question arrive in time to meet users' expectations?

If there is uncertainty on the usefulness of the question, it is better to exclude it and to concentrate the evaluation on other more useful questions.


Specify the nature of the expected use

A question may be intended to further knowledge and understanding, for example:

  • To what extent has the intervention helped to generate effect X and in what way?

A question may also aim to help formulate or clarify a judgement, for example:

  • To what extent has the intervention contributed to generating effect X at a reasonable cost?

Finally, a question may be asked with a view to suggesting concrete recommendations for making a decision:

  • Has the use of a particular implementation modality contributed to generating effect X more sustainably?

The three types of question are not exclusive. On the contrary, there is a progression in the nature of the questions:

  • in order to judge, one first has to know;
  • in order to decide, one first has to know and then to judge.


If all the questions of the same evaluation have no purpose other than furthering knowledge and understanding, the exercise is more a study or a piece of research than an evaluation.

Get further

Questions for different uses (examples)
Ways of using evaluation

The European Commission's evaluation standards and good practices 


Ensure that the question concerns evaluation

Before drafting a question, ensure that it does not concern audit or monitoring:

  • If the question concerns only compliance with rules and procedures, it is an audit rather than an evaluation question.
  • If the question covers only the progress of outputs, it is a monitoring question.

If a question concerns audit or monitoring, there are two options:


Specify the scope of the question

The scope of the question can be the entire intervention or a particular dimension of its design or implementation, for instance:

  • Has the intervention helped to generate effect X as expected? (entire intervention)
  • Has the actors' participation in the design of the intervention helped to generate effect X more successfully? (design modality)
  • Have the measures taken to ensure coordination with the other donors helped to generate effect X more successfully? (implementation modality)
  • Has the choice of sector budget support helped to generate effect X more successfully? (funding modality)

A question concerning both the entire intervention and all its effects would be very difficult to answer, except in the case of a simple intervention. At least one of the two elements therefore has to be specified.

Get further

How to define the scope of the question
Questions on design and implementation (examples)

Link the question to the intervention logic

The intervention logic identifies the main expected effects (outputs, results and impacts). Which effect does the question concern? For example:

  • Has the intervention helped to generate effect X as expected?
  • Has the intervention helped to generate effect X for the poorest groups?
  • Has the intervention increased the poorest groups' chances of obtaining effect X?

If the question concerns a short-term result or a specific impact, it will be easier to evaluate than a question on a global impact. Conversely, a question on a global impact will be more useful for decision-makers at a strategic level.

It is possible to ask a question that is not connected to the intervention logic, for example:

  • Has the intervention generated unexpected effects for the people affected?


Questions concerning the whole range of expected effects are likely to be too vast for the evaluation team to be able to give a satisfactory answer.

Get further

How can a question be specified in relation to the intervention logic?
How to reconstruct the intervention logic?

Link the question to an evaluation criterion

The question is drafted with an evaluation criterion in mind (e.g. relevance, effectiveness, efficiency, sustainability, impact, Community value added, and coherence/complementarity), unless it is designed only to further knowledge and understanding.

Questions on efficiency and sustainability are asked more rarely, partly because they are difficult to answer. Yet they are often the most useful.

Get further

Specify the family of evaluation criteria
Questions belonging to the different families of evaluation criteria (examples)
Families of evaluation criteria


Finalise the writing of the question

Choose either an open wording that calls for a qualified answer or a closed wording that requires a "yes or no" answer.

Draft the question simply and concisely.

For information, add comments on all or some of the following points:

  • Details on the scope of the question.
  • Definition of the main terms used.
  • Details on the family of evaluation criteria.
  • Sub-questions to consider in order to answer the question.
  • Nature of the intended use.
  • Existing studies and evaluations not to be duplicated.