
LEADER Monitoring and Evaluation

3. LDS Monitoring and Evaluation Objectives

Why is it so important that LAGs set out clear objectives for their monitoring and evaluation activities?

Monitoring and evaluation are now obligatory tasks for LAGs1. The need for improvement, if LAGs are to measure and understand the effects of their Local Development Strategy (LDS), has been clearly highlighted, and a plan for these activities should be included in the LDS.

As with any other part of the LDS, there should be a clear logic linking the needs, proposed activities and resources to clear objectives, providing focus and direction for what is done. This is essential in steering any evaluation activity, whether self-evaluation or externally contracted. In other words, LAGs need to understand and clearly state what they want to monitor and evaluate, and what they want to achieve through monitoring and evaluation, in order to plan, resource and undertake these activities efficiently and effectively. This is vital if monitoring and evaluation design and performance are to improve and the benefits of LAGs' work are to be clearly demonstrated.

What are the key considerations for LAGs in developing their monitoring and evaluation objectives?

Monitoring and evaluation is not an end in itself; it is done for a purpose, and in setting evaluation objectives the targeted uses and users of monitoring and evaluation outputs should be identified and considered. There are four main purposes to consider in formulating your objectives:

  • Capitalising on learning: when and how the LAG and others learn from their experiences and make use of this learning.
  • Improving implementation: putting the delivery lessons into practice.
  • Informing future programming and policy: establishing and feeding back on effective approaches to meeting needs.
  • Public accountability: demonstrating value for money at different levels, in what is achieved and in the added value of doing things the LEADER way.

Many aspects of LEADER monitoring and evaluation are specific to LEADER, the Local Development Strategy and the LEADER method. Clearly, LAGs' monitoring and evaluation objectives should reflect this specificity. Four main considerations arise in setting monitoring and evaluation objectives:

  • Firstly, the objectives should address the monitoring and evaluation of the delivery of i) the Local Development Strategy in terms of its own specific intervention logic and ii) the specific objectives for LEADER set out in the RDP. The failure to do so adequately in the past was a key weakness identified by the European Court of Auditors.
  • Secondly, LAGs' monitoring and evaluation objectives should be adapted to take account of the effects of the LEADER approach on the delivery of the LDS, e.g. regarding project target setting, data specification, collection and reporting, or the involvement of beneficiaries.
  • Thirdly, the evaluation of the effectiveness and efficiency of the LAG's delivery mechanism should be reflected in the objectives.
  • Finally, the LAG's monitoring and evaluation objectives should address the implementation of the LEADER method itself and establishing the added value it brings.

A part of the whole: fitting within a common evaluation framework

LAGs' involvement in monitoring and evaluation extends beyond the LDS, and objectives must also be set for how LAG monitoring and evaluation activities will fit within and contribute to the common framework2 of the RDP, in what is known as the ascendant evaluation approach. In considering their monitoring and evaluation objectives, LAGs should therefore also take account of the contribution they should make to the RDP and its monitoring and evaluation approach. Managing Authorities should give LAGs guidance well in advance on their monitoring and evaluation objectives in their Evaluation Plan, and specifically on how they will help LAGs to contribute to these3.

Taken together, these four elements clearly imply a final and overarching consideration: the importance of establishing a systematic approach which links the LDS and its delivery with monitoring and evaluation as a single coherent system. This in turn must link to the RDP framework as part of a coordinated overall system. It is worth considering the development of consistent database tools to ensure that the data which will be required are collected and can be shared from LDS to national level.

What about monitoring project implementation and performance?

This is a different form of monitoring activity: monitoring the implementation of the activities supported4 under the LDS. One of the important aspects of LEADER is the on-going ‘life cycle’ support which LAGs provide for projects; the aim is to have successful projects which deliver against the LDS. Monitoring project implementation and performance is an important management consideration for LAGs in helping to ensure this aim is achieved, identifying any support needs or the need to adjust the LDS or the way in which projects are supported. This form of monitoring activity is therefore important for LAGs in their on-going work and in checking what is actually going on at project level. Monitoring activities may involve visits to project sites, meetings, surveys and other activities allowing LAG members and staff to obtain feedback and an overview of project implementation.

1 Regulation (EU) No 1303/2013 Art 34.3 (g)
2 For the 2007–2013 RDPs this is the Common Monitoring and Evaluation Framework (CMEF); for 2014–2020 it is the Common Monitoring and Evaluation System (CMES): the structure of indicators, evaluation questions, judgement criteria and evaluation activities.
3 In developing their approach it may well be worthwhile for MAs and LAGs to consider which elements of this may also be applied to the CMEF for the LEADER elements of the ex post evaluations of the 2007–2013 RDPs.
4 See Regulation (EU) No 1303/2013 Article 34 (g)

The text on LEADER evaluation in the toolkit is a practical guide aimed at helping all actors involved in LEADER to make LEADER evaluation more effective. It does not serve as guidance for formal RDP evaluation related to LEADER. For the latter, please visit the Evaluation Helpdesk resource page.

  • Working out your evaluation objectives in advance will help you understand and manage the process.
  • Clear objectives will help you strengthen the fit within the overall RDP evaluation system.
  • Evaluation is part and parcel of a strategic approach; clear objectives are part of the strategy, linking your strategy to your achievements.
Last update: 19/06/2014