EuropeAid
Last updated: 13/01/2006

Methodological bases
Evaluation process (How?)
Methodological design

Developing a tool

 



No evaluation method without tools

Evaluation tools are needed to collect primary and secondary data, to analyse them, and to formulate value judgements (or reasoned assessments). The external evaluation team may carry out some of these tasks without evaluation tools, but several tools are always needed to answer each evaluation question.
Tools range from simple and common ones, such as database extracts, documentary analysis, interviews or field visits, to more technical ones, such as focus groups, modelling or cost-benefit analysis. This site describes a series of frequently used tools.
By using an appropriate mix of tools, the evaluation team strengthens the evidence base of the evaluation, the reliability of its data, the validity of its reasoning and the credibility of its conclusions.


How are tools chosen?

Tools are chosen during the iterative process of designing the overall evaluation method, so that they:

  • Contribute to answering all questions and sub-questions, and to formulating an overall assessment
  • Support data collection, analysis and the formulation of value judgements (or reasoned assessments)
  • Facilitate cross-checking and triangulation (a sketch follows this list)
  • Reinforce one another through appropriate combinations
  • Match contextual constraints such as the availability of expertise and data, allocated resources and the time schedule.
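
The coverage and triangulation aims lend themselves to a simple check. Below is a minimal sketch in Python (question wordings and tool names are purely illustrative, not taken from this site) of a design table that maps each evaluation question to the tools expected to contribute to it, warning when a question is covered by a single tool only, since cross-checking would then be impossible:

    # Hypothetical design table: each evaluation question is mapped to the
    # tools expected to contribute to answering it.
    design_table = {
        "Q1: Has the intervention improved access to basic services?": [
            "management database extract",
            "interviews with key stakeholders",
            "field visits",
        ],
        "Q2: Are the observed effects likely to be sustainable?": [
            "documentary analysis",
        ],
    }

    # Triangulation check: every question should be covered by at least
    # two tools so that findings can be cross-checked.
    for question, tools in design_table.items():
        if len(tools) < 2:
            print(f"WARNING: only one tool covers {question!r}")
        else:
            print(f"OK: {question!r} is covered by {len(tools)} tools")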

When are tools developed?

All evaluation tools are developed progressively and finalised before the beginning of the field phase, although some must be implemented, and therefore developed, earlier in the process: for example, interviews with key stakeholders at the inception stage, or analysis of management databases at the desk phase.


Developing a tool

Whilst the set of evaluation tools is selected as part of the overall evaluation design, each tool is developed separately. An example of a developed tool is provided on this site.
Developing a tool may amount to no more than a section in the inception or desk report. However, the task may need to be further formalised, including in the form of fully fledged terms of reference, when several members of the evaluation team work separately, for instance if the work extends to several countries or if the tool is sub-contracted.
Tool development proceeds through the following seven steps:


Questions and sub-questions

The evaluation team lists the questions and sub-questions that the tool has to address, referring to the design tables.


Technical specifications

The evaluation team develops the technical specifications of the tool through a preparatory stage. Technical specifications depend on the type of tool. They cover issues such as:

  • Sampling and the selection of interviewees, case studies, documents …
  • Questions, themes of a documentary analysis, content of a monograph, fields of a database …
  • Mode of implementation
  • Duration of an interview, focus group, visit …

Caution! - When developing a questionnaire or an interview guide, the evaluation team should not simply copy and paste the evaluation sub-questions. If questions and sub-questions are naïvely put to informants as worded, there is a considerable risk of bias.
Technical specifications need to be further formalised when several members of the evaluation team work separately, especially if the work extends to several countries.
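
As an illustration of this formalisation, here is a minimal sketch in Python (all names and values are hypothetical) that records the specifications of one tool, a series of interviews, in a single structure which team members working in different countries could share:

    from dataclasses import dataclass

    # Hypothetical specification of an interview series, written down once
    # so that team members working separately apply it in the same way.
    @dataclass
    class InterviewSpec:
        sampling: str            # how informants are selected
        themes: list             # interview guide themes, not raw sub-questions
        mode: str                # mode of implementation
        duration_minutes: int    # planned duration of each interview

    spec = InterviewSpec(
        sampling="20 informants drawn from partner ministries and delegations",
        themes=[
            "perceived changes in service delivery",
            "factors helping or hindering implementation",
        ],
        mode="semi-structured, face-to-face",
        duration_minutes=45,
    )
    print(spec)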


Risk management

The evaluation team assesses the main risks associated with data collection and analysis, as well as potential biases. Where relevant, it prepares second-best solutions in case the tool cannot be applied satisfactorily. The following examples illustrate risks associated with evaluation tools and corresponding second-best solutions:

  • For database X: if more than 20% of the data are still missing after 4 weeks, analyse the available data and ask expert Y to comment on the validity of the findings.
  • For the series of interviews X: if more than 30% of informants cannot be reached after 3 weeks, collect alternative "second-hand" information from expert Y.
  • For questionnaire X: if the number of respondents falls below 200, convene a focus group and cross-check its results with those of the questionnaire survey.

This list is not exhaustive.
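
The examples above share a common trigger-and-fallback structure. The following minimal sketch in Python (thresholds taken from the first example; the function name is hypothetical) shows how such a rule can be stated unambiguously before the field phase begins:

    # Second-best rule for database X: if too much data is still missing
    # once the deadline is reached, fall back on the degraded solution.
    def plan_for_database_x(missing_rate, weeks_elapsed,
                            max_missing=0.20, deadline_weeks=4):
        if weeks_elapsed >= deadline_weeks and missing_rate > max_missing:
            return ("analyse the available data and ask expert Y to "
                    "comment on the validity of the findings")
        return "proceed with the full analysis as planned"

    print(plan_for_database_x(missing_rate=0.35, weeks_elapsed=4))
    print(plan_for_database_x(missing_rate=0.10, weeks_elapsed=4))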


Mode of reporting

The outputs vary from one tool to another. They may take the form of tables, lists of quotations, verbatim transcripts, monographs, etc. The evaluation team decides how the outputs will be reported, for instance:

  • Full-length data delivered to the evaluation team leader
  • Full-length data included on a CD-ROM attached to the final report (in which case some documents may need to be anonymised)
  • Tables or boxes to be inserted in the final report

The evaluation team also decides how to report on the implementation of the tool and on any associated limitations, e.g.:

  • Note to be inserted in the methodological annex appended to the final report
  • Short methodological comment to be inserted in the final report itself.


Responsibilities

Tasks and roles are allocated among the evaluation team members, e.g.:

  • Who is to implement the tool?
  • Who will ensure harmonisation in case several team members implement the tool in various countries / areas?
  • Who is to decide upon second best alternatives in case of difficulties?
  • Who is responsible for delivering data?
  • Who will write the methodological comments?
  • Who is to assure quality?


Quality

Quality criteria are precisely defined. Depending on the tool, they may cover issues such as:

  • Coverage of themes, questions, issues
  • Accuracy
  • Identification of potential biases
  • Respect for anonymity and other ethical rules
  • Formal quality: language, spelling and layout

This site proposes a series of quality checklists for frequently used tools.
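
As a minimal sketch (the criteria are paraphrased from the list above; the structure itself is purely illustrative), such a checklist can be kept as a simple record of which criteria have been met before the outputs are handed over:

    # Hypothetical quality checklist for a completed tool.
    quality_checks = {
        "all agreed themes and questions covered": True,
        "data and quotations accurately reported": True,
        "potential biases identified and documented": False,
        "anonymity and other ethical rules respected": True,
        "language, spelling and layout checked": True,
    }

    failed = [item for item, passed in quality_checks.items() if not passed]
    if failed:
        print("Outstanding quality issues:", "; ".join(failed))
    else:
        print("All quality criteria met")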


Time schedule and resources

In some instances, it may be necessary to specify practicalities such as:

  • Start and end dates
  • Human resources allocated
  • Travel arrangements
  • Technical expenditure
