How do monitoring and evaluation influence future interventions?
DG INTPA is a learning-based organisation. As such, it aims to systematically translate accumulated knowledge and expertise into operational reality. This is a continuous exercise in applied knowledge, with the ultimate objective of reinforcing the link between DG INTPA processes, the delivery of specific products and the achievement of intended results.
Learning (along with accountability, management and communication) is one of the purposes of monitoring, auditing and evaluation. For optimal results, learning from evaluations should be combined with learning from monitoring (internal, external/ROM, results collection) and from audits (intervention-level audits as well as the more strategic audits by the Internal Audit Service and the Court of Auditors). Monitoring, evaluation and auditing are meant to provide an in-depth understanding of the performance of an intervention, or of strategic issues, against initial expectations, thus forming a solid basis for deriving lessons for future interventions or programming decisions.
The EU is committed to the ‘evaluation first’ principle, which ensures that any policy decision takes account of lessons learned from past EU actions. DG INTPA staff rely to a great extent on lessons learned and findings from past experience to prepare new interventions and inform future decision-making: operational managers and decision-makers need clear, operational recommendations credibly grounded in evidence. From this perspective, monitoring and evaluation (and, to a certain extent, auditing) are the institutionally established mechanisms for learning from experience, better understanding what works and what does not, and accumulating knowledge for application in future actions.
- Monitoring, ex ante and mid-term evaluations, as well as audits, have an immediate impact on a given intervention, while their lessons learned also contribute to and shape future interventions, provided that knowledge is translated into operational reality. In turn, the conclusions and recommendations of final and ex post evaluations feed directly into new actions in all phases of the intervention cycle, while the findings of strategic evaluations inform aid programming processes.
- Evaluation goes beyond assessing what has happened; it is a key learning tool enabling the EC to understand not only what works and what does not, but why and under what circumstances. Evaluations are one of the key components of the overall organisational learning effort in DG INTPA, as represented by the monitoring and evaluation pyramid. Data and evidence produced by continuous internal monitoring support and guide regular external monitoring exercises, which in turn provide data and findings that inform further analysis through ad hoc evaluations at the intervention or strategic level.
In 2024, DG INTPA released a comprehensive “Evaluation Handbook” to guide the preparation, launch and management of evaluations. By providing hands-on practical guidance for planning, launching and managing evaluations, as well as guidance for specific evaluation aspects and contexts, it aims to increase the regularity and quality of intervention-level evaluations and to strengthen the feedback loop from evaluation results into evidence-based policymaking, intervention design and implementation. Management and staff in EU delegations and at headquarters are continually encouraged to make extensive use of evaluation findings to better support the efforts of partner countries to eradicate poverty, improve governance and attain sustainable growth.
Where to find lessons learned?
Institutional memory is vital, and newly posted operational managers should examine lessons learned identified in past evaluation reports and records of meetings with key informants.
ROM experts also produce insightful thematic and country/regional fiches as part of their annual consolidated analyses. If external experts are contracted to support the identification and formulation process, operational managers should ensure that they review lessons learned before drafting a new programme. The former ROM and EVAL modules, now integrated in OPSYS, are among the practical tools that can facilitate this work, along with thematic groups at Capacity4dev.eu.
DG INTPA is currently developing a project for internal staff to a) provide on-demand knowledge support drawing on monitoring and evaluation and b) collect monitoring and evaluation knowledge to inform the design and programming phases. The intent is to provide a set of tools (library, search tool, aggregated knowledge products) that make monitoring and evaluation findings available to end users in different formats.
Evidence Support
- DG INTPA has put in place an evaluation helpdesk for internal staff, which can be reached via the following functional mailbox: INTPA-EVALUATION-SUPPORT@ec.europa.eu.
- For questions on knowledge extracted from monitoring and evaluations, DG INTPA staff can contact INTPA D4: aurelie.poinsot@ec.europa.eu.
Other steps:
Methodological fiche(s):
INTPA intranet page on Evaluation