Rolling Five-Year Development Evaluation Work Plan, 2013-2014 to 2017-2018
April 2013
Table of Contents
- List of Abbreviations
- Executive Summary
- 1.0 Background
- 2.0 CIDA’s Evaluation Function
- 3.0 Developing the Work Plan
- 4.0 Planned Evaluations for FY 2013-2014
- 5.0 Additional Evaluation Division Activities for FY 2013-2014
- 5.1 Support to Branch-led Evaluations
- 5.2 Review of Country Program Evaluation Methodology
- 5.3 Commitments in Treasury Board Submissions
- 5.4 Optimizing the Learning / Knowledge Benefits from Evaluations
- 5.5 Measuring the use of evaluation within the Agency
- 5.6 Corporate Initiatives
- 5.7 Strategic Alliances
- 6.0 Program Branches’ Evaluation Plans for FY 2013-2014
- 7.0 Budget
- CIDA Evaluation Initiatives (FY 2013-2014 to FY 2017-2018)
- Evaluation Commitments to TBS
- Annex 1: Establishing the Evaluation Universe
- Annex 2: Evaluation Risk Assessment
List of Abbreviations
- CIDA
- Canadian International Development Agency
- CDPF
- Country Development Programming Framework
- DAC/EVALNET
- Network on Development Evaluation of the Development Assistance Committee (of the OECD)
- ED
- Evaluation Division
- FAA
- Financial Administration Act
- GPB
- Geographic Programs Branch
- MGPB
- Multilateral and Global Programs Branch
- MOPAN
- Multilateral Organizations Performance Assessment Network
- OECD/DAC
- Development Assistance Committee of the Organisation for Economic Co-operation and Development
- PWCB
- Partnerships with Canadians Branch
- TB
- Treasury Board
- TBS
- Treasury Board Secretariat
Executive Summary
The present Rolling Five-Year Evaluation Work Plan (hereafter, the Plan) identifies corporate evaluations scheduled for FY 2013-2014 to 2017-2018, with a focus on FY 2013-2014. Execution of these evaluations moves CIDA towards achieving evaluation coverage of 100% of its direct program spending as required by the 2009 Treasury Board (TB) Policy on Evaluation. The Plan also includes a list of Branch-led evaluations planned for FY 2013-2014.
In 2012, the Evaluation Division (ED) conducted an in-depth review of CIDA’s evaluation universe, with the intention of ensuring full evaluation coverage of the Agency’s direct program spending over 5 years. Approximately 12 corporate evaluations per year should be undertaken to meet the 100% coverage requirement by 2017-2018.
In 2013-2014, the Evaluation Division proposes two pilot country-cluster evaluations in countries of focus: Mozambique/Tanzania and Ethiopia/Ghana, where there are sectoral and programmatic similarities. This approach will allow thematic review and comparison of lessons within clusters, while maximizing efficiency of resource use. In the future, the Evaluation Division proposes to conduct cluster evaluations based on thematic priorities for modest presence countries, and for regional programming, provided that the experience with these pilot cluster evaluations proves positive.
Planned Evaluation Work – FY 2013-2014
In 2013-2014, the Evaluation Division will initiate and/or complete the following evaluations:
- Countries of concentration and regional programs: Country Program Evaluations of Mozambique and Tanzania, Ethiopia and Ghana, Indonesia, and Pakistan.
- Fragile States: Country Program Evaluations of West Bank and Gaza, Afghanistan and Haiti.
- Multilateral Organizations: Review of the development effectiveness of the Inter-American Development Bank, UNICEF, and the International Fund for Agricultural Development.
- Partnerships with Canadians Branch (PWCB): An evaluation of PWCB’s Economic Growth and Environmental Sustainability programming.
- Corporate evaluations: A meta-evaluation of Branch-led evaluations.
- Horizontal Evaluations: A review of Canada’s Corporate Social Responsibility (CSR) Strategy for the Canadian International Extractive Sector, led by the Department of Foreign Affairs and International Trade Canada, with the participation of Natural Resources Canada and CIDA.
Equally importantly, the Evaluation Division will:
- Produce a second annual report on the state of performance measurement;
- Produce a second annual lessons learned report from CIDA’s evaluations entitled CIDA Learns;
- Undertake a review of the current methodology for country program evaluations;
- Complete an analysis of the use of evaluation within the Agency;
- Update evaluation tools and guides for inclusion in the new Agency Programming Process;
- Provide technical advice and quality assurance to Branch-led evaluations on a responsive basis, taking into account risk and materiality;
- Continue to implement the dissemination strategy for evaluation knowledge; and,
- Serve as Secretariat for CIDA’s Evaluation Committee.
Resources
While an increased level of evaluation activity is planned, it will remain within existing resource ceilings. This work plan will require an operations and maintenance (O&M) budget of $2.2 million for FY 2013-2014 and a salary budget of $1.7 million for 18 full-time equivalents (FTEs).
1.0 Background
1.1 Introduction
The present iteration of CIDA’s Rolling Five-Year Evaluation Work Plan takes into account a range of considerations, including:
- The evolving international, Canadian and CIDA contexts;
- The achievements, experiences and lessons learned from previous years;
- Treasury Board of Canada requirements;
- Accountability;
- Value for money; and,
- A strategic assessment of evaluation risk (Footnote 1).
The scope and content of the Plan is consistent with the requirements of the 2006 Financial Administration Act (FAA) and the 2009 Treasury Board (TB) Policy on Evaluation. The Plan proposes an approach that will position CIDA to achieve 100% evaluation coverage of direct program spending as required by the FAA and the 2009 TB Policy on Evaluation, while contributing to the Agency’s learning and decision making needs and processes.
1.2 Context
In 2005, the Paris Declaration on Aid Effectiveness articulated a consensus on reforming donor and developing country approaches to delivering and using aid for better development results. The 2008 Accra Agenda for Action further built on the Paris Declaration by promoting policies and actions that improve transparency and accountability in strengthening aid effectiveness.
The Paris Declaration evaluation highlighted that most principles and commitments have proven relevant to improving the quality of aid. However, evidence gathered on the implementation of Paris commitments shows that while progress has been made, it has not been to the extent and pace envisioned.
The 2011 Fourth High Level Forum on Aid Effectiveness in Busan reaffirmed the Paris and Accra principles while highlighting ownership, results, inclusiveness, transparency and accountability. Furthermore, it responded to a shift from a focus on aid effectiveness to effective development cooperation, and it included a more diverse set of actors, such as emerging economies. The event culminated in the endorsement of the Busan Partnership for Effective Development Cooperation.
As part of the Government’s plan to return to balanced budgets through $4 billion in ongoing annual savings by 2014-2015, the Economic Action Plan 2012 identified planned savings of 9.7% from the International Assistance Envelope over 2012-2015. For CIDA, this will result in reductions of $152.7 million in 2012-2013, $191.6 million in 2013-2014, and $319.2 million in 2014–2015, and ongoing.
The Government of Canada continues to encourage departments to use evaluation to strengthen programs. In line with the Financial Administration Act (FAA), the 2009 TB Policy on Evaluation requires that 100% of direct program spending be evaluated over a five-year cycle. The TB Policy on Evaluation also suggests the use of flexible evaluation approaches guided by risk, scale, and scope. It requires departments to develop a strategically focused evaluation plan that is founded on an assessment of evaluation risk, departmental priorities, and government priorities. The expected results of this updated policy are a more robust evaluation function focused on value for money; accountability; and credible, timely and neutral information on the ongoing relevance and effectiveness of all direct program spending.
1.3 CIDA Context
In May 2009, Canada introduced five thematic priorities to frame its international assistance efforts: increasing food security, securing a future for children and youth, stimulating sustainable economic growth, advancing democracy, and ensuring security and stability. CIDA concentrates its programming on the first three, with environmental sustainability, gender equality, and governance as crosscutting themes.
CIDA’s Deficit Reduction Action Plan saving proposals were guided by aid effectiveness principles and developed following a rigorous process. The Agency considered programs and operations based on their effectiveness, efficiency, affordability, and alignment to government priorities.
In addition to implementing the decisions of the Economic Action Plan 2012 and adapting to new financial realities, CIDA is taking steps to put its operations and services on a sustainable financial basis. Changes at CIDA affect all areas of the Agency, and they will fall under two broad categories:
First, optimizing resources by:
- ending bilateral programming in eight countries of modest presence (Cambodia, China, Malawi, Nepal, Niger, Rwanda, Zambia, and Zimbabwe) for, among other reasons, operational cost considerations;
- consolidating regional programming in Africa into one—the Pan-Africa regional program—and reducing the Southeast Asia regional program;
- reducing program budgets for Bolivia, Pakistan, Mozambique, Ethiopia, Tanzania, Afghanistan, and South Africa; and,
- reducing and consolidating contributions to a number of multilateral and global programs.
Second, restructuring and streamlining corporate services and program operations to reduce operational costs by further consolidating and simplifying financial and human resources, and information technology and information management services. As well, the communications function will be centralized to ensure a more coordinated approach across the country. There will also be a consolidation of complementary activities such as Cabinet and Parliamentary Affairs, and evaluation and performance management functions, to maximize efficiency and effectiveness. In addition, program branches will be taking steps to rationalize their corporate and support functions.
CIDA is reducing its full-time equivalents to meet its projected budget levels. The Agency is making use of voluntary attrition and prudent vacancy management to help manage the reduction, consistent with workforce adjustment directives. The Agency is implementing efforts to ensure that the knowledge base and results achieved by programs and personnel are identified and integrated into improving ongoing programming.
The Organisation for Economic Cooperation and Development's (OECD) peer review of CIDA, published in June 2012, confirmed that the Agency has made progress since 2007 in numerous areas, including:
- improving geographic and thematic focus;
- making solid and strategic contributions to the multilateral system;
- strengthening inter-departmental cooperation in conflict-affected states;
- supporting greater gender equality and women’s empowerment; and,
- improving transparency and accountability.
Areas identified by the OECD peer review as important for the future of Canadian foreign assistance include anchoring Canada’s aid within the foreign-policy context, adjusting CIDA’s action plans to increase aid effectiveness, and completing the decentralization of program activities to the field. In addition, the review suggested continuing to make use of the Multilateral Organization Performance Assessment Network (MOPAN) to measure multilateral agencies’ performance, and contributing to joint evaluations within this network and through the Evaluation Network of the OECD’s Development Assistance Committee.
2.0 CIDA’s Evaluation Function
Evaluation at CIDA provides Canadians, Parliamentarians, Ministers, central agencies, partners, beneficiaries and CIDA’s management with credible, neutral, and evidence-based assessments of the relevance and performance of the Agency’s policies, programs, and projects, including results achieved and lessons learned in developing countries.
CIDA’s organizational improvements in recent years include bolstering its evaluation function by strengthening the governance and independence of CIDA’s Evaluation Committee. The committee now comprises six members from outside government and five CIDA senior executives (including the President, who chairs the committee). The Evaluation Committee acts as an advisory body to the President on evaluation activities.
CIDA’s evaluation function is governed by the Agency’s Evaluation Policy, which is being updated to reflect the new requirements outlined in the 2009 TB Policy on Evaluation. The policy and the five-year rolling work plan guide the Agency’s Corporate-level and Branch-led evaluation functions. The Evaluation Division leads the corporate evaluation function, focusing on program-level assessments of relevance and performance. The evaluations led by the Evaluation Division respond to the requirements of the 2009 TB Policy on Evaluation.
In 2012, the Evaluation Directorate went through a process of renewal to maximize the relevance and credibility of strategic information available for corporate decision making and to foster improved performance management in the Agency. The Directorate was integrated with the new Strategic Planning, Performance and Evaluation Directorate. The new Directorate comprises three divisions: Performance Management, Strategic Planning, and Evaluation.
Program branches lead and manage, alone or jointly with other donors, their Branch-led evaluations. The Evaluation Division provides advice and quality assurance throughout this process on a responsive basis, taking into account the evaluation risk and materiality. These evaluations cover investments below the sub-sub program level of the Program Alignment Architecture. Branch-led evaluations support decision-making, program and project improvement, mutual accountability and learning. While the results of these evaluations improve CIDA’s aid effectiveness, the primary beneficiaries are developing country partners. These Branch-led evaluations also serve as building blocks for corporate evaluations.
Corporate and Branch-led evaluations are integral to CIDA’s oversight function, while also contributing to learning and the adoption of better practices. The present Plan outlines both corporate and branch-led evaluations, including some multidonor initiatives.
The key responsibilities of both the Evaluation Division and program branches are outlined below.
2.1 Responsibilities of the Evaluation Division
The Evaluation Division is responsible for (Footnote 2):
- developing, in coordination with Branches, the Rolling Five-Year Evaluation Work Plan, which includes corporate and branch-led evaluations;
- ensuring that corporate evaluations are conducted in a neutral, rigorous and cost-effective manner;
- developing and implementing a dissemination approach to ensure that evaluative knowledge is broadly disseminated across the Agency to promote organizational learning;
- instituting a systematic approach to track the implementation of commitments in management responses and action plans;
- providing technical advice and quality assurance, including training and tools, in support of branch-led evaluation, as resources permit, and prioritized according to risk and materiality;
- managing a standing offer of consultants qualified to undertake evaluations, and helping Program/Service Branches identify those best suited to individual evaluations;
- reviewing and providing advice on the accountability and performance provisions to be included in Cabinet documents (memoranda to Cabinet, Treasury Board Submissions);
- forging and maintaining beneficial strategic alliances with key stakeholders, inside and outside the Agency; and,
- supporting the Evaluation Committee to ensure effective governance and oversight.
2.2 Responsibilities of Program Branches
Program branches are responsible for (Footnote 3):
- ensuring that sufficient performance information is available to effectively support corporate evaluations;
- preparing on an annual basis their multiyear evaluation work plans for inclusion in CIDA’s Rolling Five-Year Evaluation Work Plan;
- conducting the evaluations as set out in their multiyear evaluation work plans and implementing resulting recommendations;
- disseminating the results for learning and follow-up actions, and to ensure transparency;
- providing comments on draft reports and preparing management responses and/or action plans; and,
- implementing approved management responses and/or action plans.
3.0 Developing the Work Plan
3.1 Challenges and Key Considerations
Several challenges and considerations need to be taken into account in developing CIDA’s evaluation work plan for FY 2013-2014:
- the requirement defined in the 2009 TB Policy on Evaluation to achieve 100% evaluation coverage of CIDA’s direct program spending over 5 years;
- the period of implementation of particular evaluations sometimes extends beyond the original timeline due to a variety of factors, including in-country circumstances and availability of resources;
- the importance of maximizing the value for money of the corporate evaluation effort, and managing within the currently allocated budget envelope going forward;
- the impacts of the Deficit Reduction Action Plan on program and corporate resources;
- the emphasis that must be put on using evaluation findings for program improvement and decision making, in line with recent Treasury Board guidance, and the need therefore to tailor evaluation design to user needs;
- the need to ensure that the approaches and methodologies used in conducting CIDA evaluations take into account the evolving thinking on best practices in development evaluation, including the recent Treasury Board guidance on theory-based evaluation;
- the imperative of assessing and managing levels of risk for specific evaluations, including in fragile states. Both natural disasters and ongoing security issues make certain field-based evaluations problematic and expensive; and,
- the limited number of skilled evaluators (both consultants and staff). In the past year, several experienced evaluators have retired from CIDA.
The present Plan has been developed assuming a level of Agency resources for evaluation in 2013-2014 equivalent to those in 2012-2013.
3.2 Addressing challenges
In order to address the challenges identified above, the Plan incorporates the following strategies:
- In order to fulfill the Treasury Board Secretariat’s evaluation coverage requirement, the Evaluation Division needs to increase the annual number of corporate evaluations implemented from 2013-2014 onwards, compared with the implementation rate of recent years;
- Evaluation risk is being addressed in various ways, depending on the specific situation. Joint evaluations are one way to leverage the resources of multiple donors and mitigate security risks (e.g., Multi-Donor Evaluation of Conflict Prevention and Peacebuilding Activities in Southern Sudan). In other situations, the Evaluation Division has engaged consultants with significant experience in fragile states (e.g. Afghanistan);
- To address recent staff departures, a succession plan has been developed, and four junior evaluation officers have been hired. They will be provided with evaluation-specific training and will work in hybrid teams with external evaluators to acquire skills on the job. The Agency’s standing offer arrangement for external consultants will also be renewed in the coming year; and,
- To address the challenges related to delays from original implementation plans, the Evaluation Division will over-plan by 20% to ensure an appropriate level of deliverables per year.
3.3 Coverage and Sequencing
The Evaluation Division conducted an in-depth review of CIDA’s evaluation universe, with the intention of ensuring full evaluation coverage of the Agency’s direct program spending over 5 years. Approximately 12 corporate evaluations per year should be undertaken to meet the 100% coverage requirement by 2017-2018. (A detailed description of the Agency’s evaluation universe is found under Annex 1: “Establishing the Evaluation Universe”.) CIDA also consulted with TBS to determine its evaluation obligations with regard to closing and sunsetting programs. It was determined that the Agency does not have an obligation to evaluate these programs, but could do so for learning or other objectives.
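As a rough illustration of the coverage arithmetic, the sketch below combines the Plan’s stated pace of roughly 12 corporate evaluations per year with the 20% over-planning margin described in Section 3.2. The implied total of about 60 evaluation units is an inference from those figures, not a number published in the Plan.

```python
# Back-of-envelope sketch using figures stated in the Plan; the ~60-unit total
# is inferred from those figures, not published.
YEARS_IN_CYCLE = 5            # FY 2013-2014 to FY 2017-2018
EVALUATIONS_PER_YEAR = 12     # stated pace needed to reach 100% coverage
OVER_PLANNING_RATE = 0.20     # over-planning margin noted in Section 3.2

implied_units = YEARS_IN_CYCLE * EVALUATIONS_PER_YEAR               # ~60 evaluation units
planned_starts_per_year = EVALUATIONS_PER_YEAR * (1 + OVER_PLANNING_RATE)

print(f"Implied evaluation units over the five-year cycle: ~{implied_units}")
print(f"Evaluations planned per year, including over-planning: ~{planned_starts_per_year:.0f}")
# Output: ~60 units; ~14 planned per year
```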
An examination of sequencing of evaluations was undertaken so that their usefulness and value would be optimized. As recommended by the Treasury Board Secretariat, sequencing was accomplished by considering a combination of needs and evaluation risks (Footnote 4). The needs component includes factors such as the contribution of the evaluation to decision making processes (e.g. renewal of the programming strategy or replenishment of funding to partner organizations and institutions). The evaluation risk component includes factors such as complexity and materiality (importance in financial terms) of the program/activity to be evaluated. The proposed sequencing of evaluations over a five-year cycle is outlined in Table 1.
3.4 Evaluations carried over from 2012-2013
Some evaluations are implemented over multiple fiscal years, usually because of their complexity or unexpected delays. In addition to the planned evaluations described in Section 4 of this document, in 2013-2014 the Evaluation Division will complete several evaluations carried over from 2012-2013. These include:
Evaluation | Expected Completion Date | Rationale |
---|---|---|
Haiti | February 2014 | Complex evaluation that has required extensive consultation with branches and other government departments and has faced some contracting issues. |
Afghanistan | October 2014 | Implementation was delayed due to contracting issues. |
Indonesia | October 2013 | Implementation was delayed due to contracting issues. |
Pakistan | October 2013 | Implementation was delayed due to contracting issues. |
4.0 Planned Evaluations for FY 2013-2014
Given the variety of CIDA’s programming channels, the Evaluation Division has adopted a multi-pronged corporate evaluation strategy.
4.1 Countries of Focus and Modest Presence
The main emphasis will be on CIDA’s 20 countries of focus, where at least 80% of bilateral resources are spent. In 2013-2014, the Evaluation Division proposes two pilot country-cluster evaluations: Mozambique/Tanzania and Ethiopia/Ghana, where there are sectoral and programmatic similarities. This approach will allow thematic review and comparison of lessons within clusters. The evaluations of Pakistan and Indonesia, initiated in 2012-2013, will be completed in 2013-2014. It is expected that one of the evaluation clusters, and one of the single-country evaluations, will be initiated later in 2013-2014 and will be completed the following year.
In the future, for modest presence countries, the Evaluation Division proposes to conduct pilot cluster evaluations based on thematic priorities, beginning with Sustainable Economic Growth (i.e. Cuba, Guatemala, Nicaragua, Philippines, Sri Lanka, Egypt, Jordan, Morocco, South Africa). A cluster evaluation of regional programming (Inter-American, Pan-African, and South East Asia Regional Programming) is also proposed, given similarities in programming themes, regional architectures and executing agencies, and the potential for lessons across regions. The objective of cluster evaluations is not only to make better use of financial resources, but also to improve the effectiveness of evaluations, in line with the principle of aid effectiveness.
4.2 Fragile States
The evaluations of Afghanistan and Haiti, both initiated in 2012-2013, will be completed in 2013-2014. An evaluation of West Bank and Gaza will be initiated in 2013-2014. These evaluations will include consideration of the extent to which fragile state programming aligns with the OECD/DAC Principles for Good International Engagement in Fragile States and Situations. These principles direct international interventions in fragile states towards governance and state building for long-term stability.
4.3 Multilateral and Global Programs (MGPB)
Multilateral and Global Programs Branch provides support to a large number of international institutions and funds. The Evaluation Division has used a methodology aimed at assessing the development effectiveness of these organizations, using their own performance reporting and evaluation data. The approach, intended to be joint among donors, allows greater coverage of more institutions over a shorter period, at less cost to each shareholder, and with greater leverage for effecting needed changes. Uptake by other donors is slowly occurring, with the review of the United Nations Children’s Fund (UNICEF) being completed in 2013 in partnership with the Government of the Netherlands.
Since the approach is complementary to the work done by the Multilateral Organization Performance Assessment Network, involving multiple donors, the Evaluation Division and Multilateral and Global Programs Branch intend to use that forum to promote greater buy-in for joint evaluations of multilateral development effectiveness in 2013.
In 2013-2014, the Evaluation Division will conduct reviews of the Inter-American Development Bank, the United Nations Children’s Fund, and the International Fund for Agricultural Development.
4.4 Partnerships with Canadians Branch
Partnerships with Canadians Branch’s programming is organized around five thematic areas, corresponding both to its logic model and organizational structure. The following evaluation units align with this programming structure:
- Governance;
- Economic Growth and Environmental Sustainability;
- Human Development;
- Engaging Canadians; and,
- Knowledge Creation and Sharing.
In 2013-2014, the Evaluation Division will evaluate programming in Economic Growth and Environmental Sustainability. Country program evaluations also cover investments by Partnerships with Canadians Branch.
4.5 Corporate Policy and Process Evaluations
In addition to performance information at the program level, the Agency also requires information on the performance of cross-agency initiatives such as policies, thematic issues and delivery mechanisms. In 2013-2014, a meta-evaluation will be conducted to assess the quality of Branch-led evaluations. As well, the report on the State of Performance Measurement, mandated by the TB Policy on Evaluation, will be prepared. This report is expected to be produced annually.
4.6 Horizontal Evaluations
In FY 2013-2014, CIDA has been asked to participate in the planning and implementation of a horizontal evaluation of the Government of Canada’s Corporate Social Responsibility (CSR) Strategy for the Canadian International Extractive Sector. This was foreseen at the time of approval of the CSR Strategy in March 2009. The departments responsible for implementing the CSR Strategy and conducting a review are the Department of Foreign Affairs and International Trade Canada (DFAIT), Natural Resources Canada (NRCan), and the Canadian International Development Agency (CIDA).
4.7 Evaluation Risk Assessment
Treasury Board Secretariat (TBS) requires that government departments and agencies determine the proposed five-year evaluation schedule based on identified evaluation needs and risks. An evaluation risk framework was developed incorporating four evaluation risk criteria: political sensitivity, materiality, complexity, and context. The four criteria were weighted according to the rank of the associated corporate risk(s) in the Corporate Risk Profile.
In developing the evaluation risk framework and conducting the assessment, the Evaluation Division consulted both Treasury Board Secretariat (TBS) guidelines and CIDA’s Corporate Risk Profile. The evaluation risk assessment has been used to:
- inform the allocation of financial and human resources for implementing each evaluation;
- inform the evaluation approach/methodology for each evaluation; and,
- inform the sequencing of evaluations to ensure appropriate distribution of evaluation risks across the five year plan, where appropriate.
The evaluation risk framework, assessment methodology, and final evaluation risk ratings are reported in Annex 2.
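For illustration only, the sketch below shows one way a weighted framework of this kind could combine criterion ratings into a single evaluation risk rating. The four criteria and the low/medium/substantial/high rating bands are taken from the Plan; the weights, the 1-4 rating scale, and the banding thresholds are hypothetical assumptions, not the framework reported in Annex 2.

```python
# Hypothetical illustration of a weighted evaluation risk score. The weights,
# the 1-4 rating scale, and the thresholds below are placeholders, not CIDA's
# actual framework (see Annex 2 for the framework and final ratings).
WEIGHTS = {
    "political_sensitivity": 0.30,
    "materiality": 0.30,
    "complexity": 0.25,
    "context": 0.15,
}
# Weights sum to 1.0; in the Plan, weights follow the rank of the associated
# corporate risks in the Corporate Risk Profile.

def evaluation_risk_rating(ratings: dict) -> str:
    """Combine 1-4 criterion ratings into a weighted score and band it."""
    score = sum(WEIGHTS[criterion] * ratings[criterion] for criterion in WEIGHTS)
    if score < 1.75:
        return "low"
    if score < 2.5:
        return "med"
    if score < 3.25:
        return "sub"   # substantial
    return "high"

# Example: a program rated high on complexity and context, moderate elsewhere.
print(evaluation_risk_rating({
    "political_sensitivity": 2,
    "materiality": 2,
    "complexity": 4,
    "context": 4,
}))  # -> "sub"
```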
5.0 Additional Evaluation Division Activities for FY 2013-2014
5.1 Support to Branch-led Evaluations
As mentioned above, Branch-led evaluations are “building blocks” for Evaluation Division-led Corporate-level evaluations. In addition, they complement Corporate-level evaluations by providing a more detailed picture of the Agency’s performance.
The Evaluation Division will be:
- including the evaluation plans of the program branches in the Rolling Five-year Evaluation Work Plan;
- providing technical advice and quality assurance on a responsive basis, taking into account evaluation risk and materiality; and
- ensuring that the new Agency Programming Process includes up-to-date tools and guides for evaluation.
The Evaluation Division is updating its tools in order to better assist program staff. The first step was to update the generic Terms of Reference, which are now available in both official languages. As a second step, the Evaluation Division will prepare a guide on how to design evaluations. A central database to house all branch-led evaluations that use the Evaluation Standing Offer Arrangements is in development.
In order to assist Branches and maximize limited resources, the Evaluation Division will be providing advice to programs on Branch-led evaluations through agreed-upon channels. Furthermore, the Evaluation Division will provide expertise and advice to support the development of strategic annual plans and the International Aid Transparency Initiative (IATI) Implementation Plan for Activity Information.
A synthesis of lessons learned from closing programs will be developed by the Branches to capture CIDA’s experience and inform future programming. The Evaluation Division will provide advice and support for this exercise upon request from the Branches. Finally, the Evaluation Division will initiate work on a new supply arrangement for external evaluation consultants and will consult with program Branches throughout this process.
5.2 Review of Country Program Evaluation Methodology
In 2013-2014, the Evaluation Division will undertake a review of the current methodology for country program evaluations. The review will ensure the methodology is in line with best practices and standards in the field of international program evaluation.
5.3 Commitments in Treasury Board Submissions
Over the past few years, several evaluation commitments have been made in Treasury Board Submissions to obtain approval for major initiatives. Many of these major initiatives are sub-elements of programs. Four of the 11 evaluation commitments to the Treasury Board Secretariat (made between 2006-2007 and 2010-2011) are the responsibility of the Evaluation Division; the remaining seven are the responsibility of the program branches. Table 2 presents a list of these evaluation commitments.
These evaluation commitments may be fulfilled through planned program evaluations, rather than conducting separate evaluations. The level of risk of such evaluations depends on the country, program and institution under consideration. The number of these evaluations is expected to decline in the future.
5.4 Optimizing the Learning / Knowledge Benefits from Evaluations
Evaluations at CIDA create a source of credible and neutral information, which feeds into strategic reviews, renewals of terms and conditions, Treasury Board submissions, the Report on Plans and Priorities, the Departmental Performance Reports, and memoranda to Cabinet. They also provide input that informs decision making on investments.
Though CIDA has traditionally benefited from an evaluative culture, there remains a need to better optimize the learning and knowledge benefits from evaluations.
Since 2011-2012, the ED has been implementing an approach and work plan for the dissemination of evaluation knowledge, which seeks to increase the visibility, accessibility and use of evaluation findings and lessons, both within and outside the Agency. A key element of the dissemination work plan is an annual report on lessons learned from evaluations. The report communicates key lessons derived from corporate evaluations and presents one-page summaries of the evaluations discussed. In December 2012, the ED launched its first annual lessons learned report, entitled CIDA Learns: Lessons from Evaluations 2011-2012.
In 2013-2014, the Evaluation Division will:
- produce the second CIDA Learns: Lessons from Evaluation report;
- facilitate exchanges on evaluation knowledge and lessons learned via Evaluation Learning Cafés;
- deliver interactive presentations on main findings from evaluations;
- provide guidance to Branches on the preparation of evaluation summaries; and,
- increase usage of CIDA’s knowledge networks to share findings and lessons from evaluations.
5.5 Measuring the use of evaluation within the Agency
Building upon a pilot-test by Aboriginal Affairs and Northern Development Canada (AANDC), the Evaluation Division developed and pilot-tested CIDA-specific tools to measure the use of evaluation amongst both program personnel and policymakers in the Agency. In 2013-2014, the Evaluation Division will revise the tools based on the results of the pilot test. The revised tools will be applied to generate a baseline assessment of evaluation usage. The exercise will be both retrospective (to measure past usage) and forward-looking (to improve future usage). Lessons gleaned from this exercise will allow the Evaluation Division to better target its evaluations and the dissemination of results.
5.6 Corporate Initiatives
Support and input will be provided to corporate-level initiatives, such as the Departmental Performance Report, the Report on Plans and Priorities, and the Management Accountability Framework. Corporate committees and working groups represent a significant portion of the Evaluation Division’s workload. They offer mechanisms for knowledge dissemination, accountability, and influence on policy and program decision making.
5.7 Strategic Alliances
In Canada, the Evaluation Division will strive to maintain its good working relationship with the Treasury Board Secretariat Centre of Excellence in Evaluation and develop a stronger relationship with the Canadian Evaluation Society. Liaising with these communities of practice is important to maintain and strengthen Canadian leadership in evaluation, and provide the Evaluation Division with sources of best practices and lessons learned.
Internationally, the Evaluation Division will focus its engagement with the Network on Development Evaluation of the Development Assistance Committee (of the OECD) in order to further promote uptake of its methodology to assess the development effectiveness of multilateral organizations.
6.0 Program Branches’ Evaluation Plans for FY 2013-2014
Program branches conduct evaluations to improve the effectiveness of projects and programs that benefit developing countries.
The following section describes the evaluation activities planned by the branches during the FY 2013-2014 period. Branch-led activities complement the Evaluation Division’s evaluation work by providing performance assessments, monitoring data, and other relevant information.
6.1 Geographic Programs Branch (GPB) – Branch-led Evaluations Planned for FY 2013-2014
Monitoring, evaluation and reporting are ongoing in the Geographic Programs Branch. The list below provides a provisional plan for FY 2013-2014.
Country | Initiative |
---|---|
Americas | |
Inter-American | Deployment for Democratic Development |
Inter-American | Strengthening the Role of Parliaments |
Bolivia | Strategic Governance Mechanism |
Cuba | Development of the Forestry Sector |
Colombia | Planning the Future |
Haiti | Formation initiale et perfectionnement des cadres de la Police Nationale d'Haiti (FIPCA) |
Haiti | Système de financement et d'assurances agricoles en Haiti (SYFAAH) |
Haiti | Crédit écolage |
Peru | Multidonor Basket Fund, Defensoría del Pueblo |
Africa—Southern and Eastern | |
Mozambique | Community-Based Health Training Practice |
Mozambique | Support for Education Materials in Mozambique Phase II |
South Africa | Regional Public Sector Training |
Africa—West and Central | |
Regional Program in West and Central Africa (PRACO) | Appui à la lutte contre la violence faite aux filles et jeunes femmes dans la région des Grands Lacs |
République Démocratique du Congo (RDC) | Renforcement capacités Banque centrale du Congo |
RDC | PROSAKIN - Réhabilitation des services de santé - Province de Kinshasa |
Senegal | Centre de traitement informatisé en microfinance |
Senegal | Programme d'appui au développement en Casamance |
Sénégal | Programme de développement des marchés agricoles au Sénégal |
Burkina Faso | Société d'accompagnement au renforcement des capacités (SARC) |
Ghana | District Wide Assistance Transition Project (DWAP) |
Ghana | Water and sanitation project in Northern Ghana (NORST) |
Mali | Formation des professionnels de la santé |
Mali | Appui au Plan décennal de développement sanitaire et social (PDDSS) (Nord-Mali) |
Mali | Projet appui au secteur micro finance au Mali (PASMIF) |
Mali | Projet d'appui à l'irrigation de proximité (PAIP) |
Mali | Appui aux filières agricoles au Mali (PAFA) |
Europe, Middle-East & Maghreb, Afghanistan and Pakistan (EMMAP) | |
Morocco | Projet d'appui à la gestion des établissements scolaires au Maroc (PAGESM) |
Ukraine | Municipal Local Economic Development |
Ukraine | Evidence-based Economic Development |
West Bank and Gaza | Support to Public Prosecution Services |
Asia | |
Philippines | GREAT Women Project |
Sri Lanka | National Languages Project |
6.2 Partnerships with Canadians Branch (PWCB) – Branch-led Evaluations Planned for FY 2013-2014
Partnerships with Canadians Branch has developed a Rolling Four-Year Evaluation Plan covering fiscal years 2013-14 to 2016-17. The Plan focuses on the implementation of thematic evaluations in programming areas of priority for the Agency. For FY 2013-2014, Partnerships with Canadians Branch plans to execute the following evaluations:
- PWCB’s Water and Sanitation programming;
- Provincial and Regional Councils for International Cooperation supported by PWCB; and,
- Global Multilateral Electoral Observation Program Evaluation.
Partnerships with Canadians Branch will also execute the following project performance evaluations to meet specific evaluation commitments to Treasury Board:
- Summative evaluation of the Jules and Paul-Émile Léger Foundation Program;
- Summative evaluation of the Partnerships with Canadians Branch’s Volunteer Cooperation Program; and,
- Formative evaluation of the Aga Khan Foundation Canada’s Partnership for Advancing Human Development in Africa and Asia Initiative.
During FY 2013-2014, pending the availability of resources, Partnerships with Canadians Branch will implement evaluations of the following elements of its programming:
- Children and Youth Protection;
- Early Childhood and Basic Education;
- Strengthening Professional Health Associations;
- Extractive Industries Pilot Corporate Social Responsibility Projects;
- Skills for Employment; and,
- Rule of Law/Access to Justice.
6.3 Multilateral and Global Programs Branch (MGPB) – Branch-led Evaluations Planned for FY 2013-2014
Multilateral and Global Programs Branch engages in three types of programming evaluated according to distinct criteria.
Long-term Institutional Support
Canada, both as a member of the governing board and through direct consultation with the institution, participates actively alongside other donors in evaluation processes, including reviewing evaluations and providing direction on recommendations. Examples of evaluations being completed by our partners in 2013-2014 include:
- Programmatic Evaluation: Violence Against Children, the United Nations Children's Fund (UNICEF)
- 2013 Annual Evaluation Review, Asian Development Bank
- Evaluation of the implementation of the Plan de Gestion de Modernisation, Organisation internationale de la Francophonie
Overarching assessments of key partners, such as through the Multilateral Organisation Performance Assessment Network (MOPAN), provide further opportunities for dialogue and comparison.
Global Initiatives
Each global initiative has a strong monitoring and reporting component. These evaluations generally take the form of a mid-term and a final evaluation. In most cases, the evaluation is conducted by the partner. CIDA and other donors frequently play a role in the process, but the relationship is supportive rather than prescriptive. In a few instances, CIDA conducts the evaluation. For example, as part of CIDA’s recent funding for the Micronutrient Initiative (MI), MI agreed to an external evaluation, which was completed in March 2012. Examples of MGPB-supported evaluations of global initiatives taking place in 2013-2014 include:
- Mid-term evaluation of H4+ Accelerating Progress in Maternal and Newborn Health, managed by the United Nations Population Fund (UNFPA)
- Endline evaluation of Community Based Treatment of Malaria and Pneumonia, by Save the Children
- Final project evaluation for EcoHealth Emerging Infectious Diseases, Global Health Research Initiative, Evaluation managed by International Development Research Centre (IDRC)
International Humanitarian Assistance
CIDA supports evaluations conducted by its humanitarian partners on their programming, and it also supports system-wide learning to better determine how the humanitarian response system as a whole is meeting identified needs. The Office for the Coordination of Humanitarian Affairs (OCHA) conducts evaluations to promote transparency, accountability and learning through systematic and objective judgments about the relevance, efficiency, effectiveness and impact of humanitarian interventions.
CIDA also supports specific learning and evaluation mechanisms, such as the Active Learning Network for Accountability and Performance in Humanitarian Action (ALNAP). In almost all instances, CIDA’s NGO partners include monitoring and evaluation as part of the project; this pertains to both emergency response and complex humanitarian initiatives. In some cases, CIDA conducts an evaluation. For example, an evaluation focusing on results of Canadian Foodgrains Bank programming was completed in 2010. In addition to real-time evaluations following a crisis, examples of evaluations being undertaken in 2013 include:
- Evaluation of Response Preparedness, Office for the Coordination of Humanitarian Affairs
- The State of the Humanitarian System 2013, Active Learning Network for Accountability and Performance in Humanitarian Action
- Final Evaluation for First Responder Initiative, Canadian Red Cross
The list below provides a provisional plan for FY 2013-2014 evaluations in which the Multilateral and Global Programs Branch is directly involved.
Evaluation | Evaluation Type | Start Date |
---|---|---|
Gender Equality Institutional Assessments | Gender Equality Institutional Assessments | 2013-2014 |
Legal Empowerment of Women Initiative | Program Evaluation - end of program | 2013-2014 |
Diagnostic Community Based Management of Malaria Control, Population Services International | Final Evaluation | 2013 |
Advanced Market Commitments (AMC) vaccines | Impact evaluation | 2014 |
Support to GAVI Alliance | Mid-term evaluation | 2013-2014 |
Scaling up Nutrition Interventions (REACH) | Mid-term evaluation | 2013 |
Catalytic Initiative / Integrated Health Systems Strengthening, UNICEF | External Evaluation | 2013 |
EcoHealth Emerging Infectious Diseases IDRC/GHRI | Final Evaluation | 2013-2014 |
H4+ Accelerating Progress in Maternal and Newborn Health, UNFPA | Mid-term and final evaluation | 2013-2014 |
7.0 Budget
As indicated above, the Evaluation Division’s program of work will increase to approximately 12 evaluations per year while maintaining existing responsibilities to the Agency and the branches. New efficiencies, such as cluster and thematic evaluations, a greater focus on evaluation design, and greater use of in-house expertise, will allow the Division to respect the existing budget ceiling. The work plan also over-programs by 20% to account for potential slippage in execution.
Based on past experience, some evaluations require implementation over multiple fiscal years due to complexity, contracting issues, or unforeseen difficulties arising from execution in developing and fragile states.
7.1 Non-Salary O&M
As shown in Table 1, a budget of $2.2 million is required for 2013-2014 in order to implement the proposed plan. This covers the costs of engaging external professional expertise, as well as other non-salary costs, including staff travel and translation. The Evaluation Division will strive to improve coordination between Corporate-level and Branch-led evaluations in an effort to maximize resource efficiency.
7.2 Salary Budget
The Evaluation Division has a salary budget of approximately $1.7 million and a staff complement of 18 full-time equivalents (FTEs): the Director, 16 evaluators, and 1 administrative support staff.
CIDA Evaluation Initiatives (FY 2013-2014 to FY 2017-2018)
Proposed Evaluation Initiatives | Eval Risk Level (5) | Total Direct Program Spending FY 12-13 (6) | Program Operational Cost FY 12-13 (7) | G&Cs Amount FY 12-13 (8) | Internal Evaluation Resources FY 13-14 (9)(10) | External Evaluation Resources FY 13-14 (11)(12) | Start Date for Planned Initiatives FY 13-14 (13) | Approval Date for Planned Initiatives FY 13-14 (14) | FY 2012-2013 Activities (15) | 2013-2014 Five-Year Evaluation Plan | 2014-2015 Five-Year Evaluation Plan | 2015-2016 Five-Year Evaluation Plan | 2016-2017 Five-Year Evaluation Plan | 2017-2018 Five-Year Evaluation Plan | Evaluation Approach, Design, Methodology | Performance Measurement Information | Considerations |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Codes: E = Led by Evaluation Division; D = Dissemination / knowledge sharing; C = Support to the Corporate level. (5) “low” = Low; “med” = Medium; “sub” = Substantial; “high” = High; see Annex 2. (6) Figures are in millions of Canadian dollars. (7) Figures are in millions of Canadian dollars. (8) Figures are in millions of Canadian dollars. (9) The amounts listed in this table cover only disbursements for FY 13/14, even for evaluations whose contracts cover more than one fiscal year. (10) Only evaluation initiatives for FY 2013-2014 have cost estimates. Amounts include all non-salary O&M (such as professional services, printing, travel, etc.). (11) The amounts listed in this table cover only disbursements for FY 13/14, even for evaluations whose contracts cover more than one fiscal year. (12) Only evaluation initiatives for FY 2013-2014 have cost estimates. Amounts include all non-salary O&M (such as professional services, printing, travel, etc.). (13) The start date refers to the initial planning stages. The end date refers to when the evaluation is published. (14) The start date refers to the initial planning stages. The end date refers to when the evaluation is published. (15) FY 2012-2013 is retained in the table in order to provide a more comprehensive picture of the evaluation universe. However, it is understood that the 5-Year Rolling Plan includes FY 2013-2014 to FY 2017-2018. (16) Shows multilateral institutions to be reviewed in 2013-2014. The sequencing of institutions to be reviewed in future years is tentative and will be firmly identified in future updates of the Plan, to reflect agreements reached with other donors on the sequencing of upcoming reviews under MOPAN and the DAC Evaluation Network Approach. | |||||||||||||||||
Asia Country Program Evaluations | |||||||||||||||||
Bangladesh Program | med | 67.56 | 1.71 | 65.85 | E | D | Last evaluated in FY 2008-2009. | ||||||||||
Indonesia Program | low | 20.69 | 1.83 | 18.86 | 50K | 200K | JAN 2013 | OCT 2013 | E | E/D | Country Program Evaluation Methodology | CDPF expired 2009. | Evaluation activities for the upcoming FY. Evaluation was delayed because the program was being audited. | ||||
Vietnam Program | med | 17.10 | 1.39 | 15.71 | E | D | Last evaluation completed in FY 2009/10. | ||||||||||
Africa Country Program Evaluations | |||||||||||||||||
Mali Program | med | 81.28 | 1.51 | 79.78 | E | D | E | Joint-Evaluation of Budget support interventions undertaken in FY2009-2010. | |||||||||
Senegal Program | med | 49.41 | 1.37 | 48.04 | E | D | CDPF expired in 2011. | Last evaluation completed in FY 2010-2011. | |||||||||
Ghana and Ethiopia Program | med | 159.26 | 2.83 | 156.42 | 100K | 300K | APR 2013 | FEB 2014 | E | D | The evaluation methodology and approach will be based on themes that are common among this evaluation cluster, possibly food security and PBAs. | Ghana corporate evaluation in the Risk and Results-Based Management and Accountability Framework for FY 11/12. | Evaluation activities for the upcoming FY. Ghana was last evaluated in FY 2007-2008. Ethiopia is a TBS commitment. Last evaluated in FY 2009-2010. FY 2012-2013 Disbursements are as follows: Ethiopia: 35.10 Ghana: 7.07 | ||||
Tanzania and Mozambique Program | sub | 111.11 | 1.51 | 109.60 | 100K | 300K | APR 2013 | FEB 2014 | E | D | The evaluation methodology and approach will be based on themes that are common among this evaluation cluster, possibly agriculture and transition to SEG as well as PBAs. | Evaluation activities for the upcoming FY. Mozambique was last evaluated in FY 2009-2010. Tanzania was last evaluated in FY 2005/06. Joint evaluation led by the European Commission for the budget support interventions in 2011-2012 and 2012-2013. FY 2012-2013 Disbursements are as follows: Tanzania: 13.44 Mozambique: 45.31 | |||||
Darfur/Sudan Program | sub | 50.45 | 1.91 | 48.54 | E | D | Multidonor evaluation of Southern Sudan (led by the Dutch) completed in FY 2010-2011. Case study in the evaluation of IHA Programming. | ||||||||||
Canada Investment Fund for Africa | med | D | E | D | TB Condition: Mid-term evaluation completed in FY 2011-2012. A summative evaluation is planned for FY 2014-2015. | ||||||||||||
Americas Country Program Evaluations | |||||||||||||||||
Bolivia Program | low | 14.43 | 0.95 | 13.49 | E | D | E | CDPF expired in 2009. | Last evaluated in FY 2012-2013. | ||||||||
Caribbean Regional Program | med | 35.46 | 1.77 | 33.69 | E/D | E | RDPF priority for 2007. | Incl. Guyana, Jamaica. Last evaluated in FY 2012-2013. | |||||||||
Haiti Program | high | 59.07 | 3.47 | 55.60 | 50K | 250K | APR 2012 | FEB 2014 | E | E/D | Fragile states evaluation methodology and approach, based on the OECD/DAC guidance on evaluating fragile states. | Last evaluated in FY 2002-2003. Deferred due to earthquake. Corp. eval. in the Risk and Results-Based Management and Accountability Framework for FY 2010-2011, in addition to Branch-level eval. in 2009. |
Honduras Program | med | 22.89 | 0.96 | 21.93 | D | E | D | Last evaluated in 2011. | |||||||||
Colombia Program | low | 15.64 | 0.91 | 14.73 | E | D | E | Last evaluated in 2012. | |||||||||
Peru Program | med | 19.91 | 0.57 | 19.34 | D | E | D | Last evaluated in 2011. | |||||||||
Europe, Middle-East & Maghreb, Afghanistan and Pakistan (EMMAP) Program Evaluations | |||||||||||||||||
West Bank / Gaza Program | high | 24.52 | 1.51 | 23.02 | 60K | 250K | APR 2013 | MAY 2013 | E | D | Fragile states evaluation methodology and approach, based on the OECD/DAC guidance on evaluating fragile states. | Evaluation activities for the upcoming FY. Based on discussions with the program, evaluation deferred to FY 2013-2014. | |||||
Ukraine Program | low | 26.57 | 1.09 | 25.48 | E | E | D | The Ukraine CDPF covers the period from 2009 to 2014. | Last evaluated in FY 2010-2011. | ||||||||
Pakistan Program | med | - | - | - | 50K | 200K | JAN 2013 | OCT 2013 | E | E/D | Country Program Evaluation Methodology | CDPF 2007-2015. | Evaluation activities for the upcoming FY. Last evaluated in FY 2005-2006. | ||||
Afghanistan Program | high | 90.81 | 3.81 | 87.00 | 60K | 300K | OCT 2011 | OCT 2014 | E | E | E/D | Fragile states evaluation methodology and approach, based on the OECD/DAC guidance on evaluating fragile states. | Evaluation activities for the upcoming FY. Last evaluated in FY 2007-2008. Evaluability assessment conducted in FY2011-2012.The total budget for this evaluation is $600k. Since the evaluation spans two fiscal years, $300k will be spent in FY 2012-2013 and the remaining $300k will be spent in FY 2013-2014. Program evaluation in the Risk and Results-Based Management and Accountability Framework for FY 2009-2010. | ||||
Modest Presence Country Program Evaluations | |||||||||||||||||
Benin, Burkina Faso, DR Congo, Kenya, Nigeria, Mongolia | med | 65.77 | 3.75 | 62.02 | E | D | Evaluation methodology to be determined based on common themes among this cluster, possibly Children and Youth and Food security. | FY 2012-2013 Disbursements are as follows: Benin: 1.70 Burkina Faso: 1.86 DR Congo: 2.41 Kenya: 3.66 Nigeria: 4.64 Mongolia: 0.13 | |||||||||
Cuba, Guatemala, Nicaragua, Philippines, Sri Lanka, Egypt, Jordan, Morocco, South Africa | low | 58.53 | 3.12 | 55.41 | E | D | Evaluation methodology to be determined based on common themes among this cluster, possibly Sustainable Economic Growth. | FY 2012-2013 Disbursements are as follows: Cuba: 1.19 Guatemala: 3.55 Nicaragua: 3.39 Philippines: 1.93 Sri Lanka: 3.00 Egypt: 1.91 Jordan: 3.04 Morocco: 2.53 South Africa: 2.99 Note: Guatemala contributes to SEG through food security and rural development programming. | |||||||||
Regional Programming | |||||||||||||||||
Inter-American Regional Program Pan-African Regional Program Southeast Asia Regional Program | sub | 59.78 | 1.55 | 58.23 | E | D | Evaluation methodology to be determined based on common themes among this cluster. | Reg. Inter-American Program was last evaluated in 11/12. The ED may consider making IAP a smaller component of the regional evaluation. FY 2012-2013 Disbursements are as follows: Inter-American Regional Program: 2.72 Pan-African Regional Program: 9.63 Southeast Asia Regional Program: 1.96 | |||||||||
Partnerships with Canadians Programs | |||||||||||||||||
Governance | sub | 31.41 | 1.06 | 30.34 | E | D | E | ||||||||||
Economic Growth and Environmental Sustainability | sub | 69.93 | 1.73 | 68.21 | 30K | 100K | APR 2013 | MAR 2014 | E | D | Meta-synthesis approach. | Evaluation activities for the upcoming FY. The cost estimate is based upon ED’s experience doing other hybrid evaluations. | |||||
Human Development | sub | 87.95 | 1.92 | 86.04 | E | D | Evaluation activities for the upcoming FY. | ||||||||||
Engaging Canadians | sub | 11.76 | 0.60 | 11.16 | E | D | |||||||||||
Knowledge Creation and Sharing | sub | 62.42 | 1.14 | 61.28 | E | D | |||||||||||
Multilateral Programs (16): Financial Institutions | |||||||||||||||||
Asian Development Bank | med | 81.80 | 81.80 | E/D | E | Note that the current replenishment goes until 2016/17, but given the recent evaluation, it would be too soon to evaluate in 2015-2016 to feed into the replenishment process. | |||||||||||
African Development Bank | med | 137.14 | 137.14 | E | D | E | D | Evaluation activities for the upcoming FY. In order to feed into replenishment process the evaluation would need to take place in 2016-17. | |||||||||
Caribbean Development Bank | med | 21.00 | 21.00 | E | D | Evaluation activities for the upcoming FY. This evaluation should take place in 2015-2016 in order to feed into the next replenishment. | |||||||||||
Inter-American Development Bank | med | 63.35 | 63.35 | 30K | 100K | APR 2013 | FEB 2014 | E | D | To be reviewed using the approach developed under the guidance of and endorsed by the DAC/EVALNET. | |||||||
World Bank | med | 137.59 | 137.59 | E | D | ||||||||||||
International Fund for Agricultural Development | med | 12.50 | 12.50 | 30K | 100K | APR 2013 | NOV 2013 | E | D | To be reviewed using the approach developed under the guidance of and endorsed by the DAC/EVALNET. | Evaluation activities for the upcoming FY. MOPAN will be reviewing IFAD in 2013. |
Multilateral Programs: UN Development and Humanitarian Organizations/Programs, Commonwealth and Francophonie | |||||||||||||||||
United Nations Development Programme (UNDP) | low | 1.13 | 1.13 | D | E | D | Last evaluation completed in FY 2011-2012. | ||||||||||
Global Environmental Facility (GEF) | low | 57.29 | 57.29 | E | D | Evaluation activities for the upcoming FY. Replenishment discussions will begin in the summer of 2016. The evaluation would be more helpful just in advance of replenishment processes. | |||||||||||
World Health Organization (WHO) | med | 23.46 | 23.46 | E/D | E | Evaluation activities for the upcoming FY. Report completed based on results of the Multilateral Initiative pilot test, which used a methodology developed under the guidance of and endorsed by the DAC/EVALNET. | |||||||||||
World Food Program (WFP) | med | 88.77 | 88.77 | D | E | D | Last evaluation completed in FY 2011-2012. Moved a year ahead based on MGPB suggestion. | ||||||||||
Office of the United Nations High Commissioner for Refugees (UNHCR) | med | 16.20 | 16.20 | E | D | ||||||||||||
Global Fund for AIDS, Tuberculosis and Malaria (GFATM) | med | 160.00 | 160.00 | E | D | Evaluation activities for the upcoming FY. The Global Fund is transitioning to a new funding model that significantly changes the way allocations, and therefore programming, are done. It would be beneficial to wait until evaluations are available under the new model, as no evaluation material would be available in the proposed period. |||||||||||
Consultative Group on International Agricultural Research (CGIAR) | low | E | Evaluation activities for the upcoming FY. | ||||||||||||||
United Nations Population Fund (UNFPA) | low | 10.00 | 10.00 | E | D | Evaluation activities for the upcoming FY. | |||||||||||
UNICEF | med | 11.60 | 11.60 | 5k | JAN 2013 | JUN 2013 | E | E/D | To be reviewed using the approach developed under the guidance of and endorsed by the DAC/EVALNET. | Evaluation activities for the upcoming FY. This is a joint evaluation with the Netherlands. | |||||||
UNAIDS | low | 0.03 | 0.03 | E | The current strategic plan ends in 2015. The new plan will be developed in 2014. | ||||||||||||
UNWOMEN | low | 10.00 | 10.00 | E | D | ||||||||||||
Commonwealth Institutions | low | 2.60 | 2.60 | E | D | Possibility of conducting the review jointly with other donors. | |||||||||||
International Organization of la Francophonie | low | 5.27 | 5.27 | E | D | Possibility of conducting the review jointly with other donors. | |||||||||||
Operational costs for multilateral organizations are kept separate because CIDA’s structure is not aligned with organizational spending | 13.31 | 13.31 |||||||||||||||
Horizontal Evaluations | |||||||||||||||||
Corporate Social Responsibility Strategy | med | TBD | TBD | TBD | TBD | E | D | The specific timing, responsibilities, and resources for this evaluation are to be determined. DFAIT will lead this horizontal evaluation. | |||||||||
Corporate/Agency Wide | |||||||||||||||||
Corporate Evaluation of CIDA’s Humanitarian Assistance | high | D | E | D | Last evaluated in FY 2011-2012. | ||||||||||||
CIDA Learns | - | 20k | APR 2013 | NOV 2013 | C | C | C | C | C | C | |||||||
State of Performance Measurement Report | - | 20k | APR 2013 | NOV 2013 | C | C | C | C | C | C | |||||||
Muskoka Initiative Thematic Evaluation | sub | E | D | A commitment in the TB submission to carry out a thematic evaluation. An Evaluation Strategy was developed in FY 2011-2012 to lay the foundation for the evaluation to be carried out in 2015-2016. |||||||||||||
Muskoka Initiative Review | sub | E | D | The ED will conduct a review of Canada’s contribution to the Muskoka Initiative in order to report back to cabinet in 2015. | |||||||||||||
Support to Branch-led Evaluations (incl. tools, training, standing offers, consultant referrals, etc.) | - | 25k | 100k | ongoing | E/S | E/S | E/S | E/S | E/S | E/S | Provision of support and advice to program branches on Branch-led evaluations. | Evaluation activities for the upcoming FY. | |||||
Dissemination / Learning Approach | - | 20k | 0 | ongoing | C | C | C | C | C | C | Evaluation activities for the upcoming FY. | ||||||
Evaluation Committee Secretariat | - | 10k | 100K | ongoing | C | C | C | C | C | C | Evaluation activities for the upcoming FY. | ||||||
Other: travel, training, translation, miscellaneous | - | 200K | ongoing | C | C | C | C | C | C | Evaluation activities for the upcoming FY. ||||||
Total Resources | 690K | 2.6M | |||||||||||||||
Total Program Expenditures covered by program evaluations | 2,166.77 | 57.27 | 2,109.50 | ||||||||||||||
Total of CIDA’s Grants and Contributions for FY 2012-2013 | 2,884.82
Evaluation Commitments to TBSFootnote 17
The table below presents specific evaluation commitments indicated in the TB Submissions that were approved between FY 2006-2007 and FY 2010-2011. It excludes evaluation commitments already completed.
TB Submission (abridged title) | Lead* | Evaluation Commitments made to TBS and TB Conditions |
---|---|---|
Note: * PB: Program Branch; ED: Evaluation Division. Footnote 18: The Afghanistan Evaluation was postponed to FY 2011-2012 in order to ensure that sufficient information was available to conduct a summative evaluation. | |
Corporate/Agency Wide | ||
1. Muskoka Initiative | ED | Thematic evaluation to be completed by 2015. A review of the initiative to be completed by 2014. |
2. CIDA’s Humanitarian Assistance Program and Organizations and Pakistan Humanitarian Response (Dec. 2010) | ED | Humanitarian Assistance program evaluation was completed in 2012. |
Country Programs | ||
3. Afghanistan Program | ED | Summative evaluation to be completed by FY 2009-2010Footnote 18 (postponed to FY 2012-2013). |
4. Burkina-Faso | PB | Summative evaluation to be completed by 2015. |
5. Canada Investment Fund for Africa | ED | Summative evaluation to be completed by FY 2013-2014. |
6. Pakistan (Debt Relief) | PB | Summative evaluation to be completed by FY 2013-2014. |
7. Senegal (Education Sector) | PB | Summative evaluation completed in 2011. |
8. Tanzania (Education Sector) | PB | Meta-evaluation to be completed by FY 2012-2013. |
9. Palestinian Authority’s Justice Sector | PB | Project delayed; evaluation timelines to be re-determined once re-scoping is approved. |
Multilateral Programs | ||
11. UN Central Emergency Response Fund (CERF) | PB | An independent five-year summative evaluation was completed in 2011. |
12. Micronutrient Initiative | PB | An independent formative evaluation was completed in 2011. Summative evaluation to be completed by 2014. |
13. Canadian Foodgrains Bank | PB | Summative evaluation to be completed by FY 2011-2012. (Evaluation completed in 2010 due to accelerated disbursements.) |
14. GAVI Alliance; Global Partnership for Education (GPE), formerly “Education for All Fast Track Initiative” | PB | Evaluations to be planned and managed through a) the GAVI Evaluation Advisory Committee; and b) the GPE Monitoring and Evaluation Steering Committee. |
15. Canadian International Food Security Research Fund | PB | Summative Evaluation to be completed by FY 2014-2015. |
16. World Food Program (approved July 2011) | PB | MOPAN review of WFP to be completed in 2014-2015. The Review of the World Food Programme’s Humanitarian Assistance and Development Effectiveness was completed in 2012. Also to be covered by the next Humanitarian Assistance program evaluation (2017-2018). |
17. East Africa Drought | ED | Also to be covered by the next Humanitarian Assistance program evaluation (2017-2018). |
18. Global Malaria Program (WHO) | PB | Summative evaluation to be completed by FY 2015-2016. |
19. Scaling-up Nutrition through Integrated Life Saving Interventions (UNICEF/Helen Keller International (HKI)) | PB | Summative evaluation (UNICEF component) to be completed by FY 2013-2014. Summative evaluation (HKI component) to be completed by FY 2014-2015. |
Partnerships with Canadians Programs | ||
20. Jules and Paul Émile Léger Foundation | PB | Formative evaluation was rejected in 2011 due to quality issues. Summative evaluation to be completed by FY 2013-2014. |
21. Canadian Francophonie Scholarship Program | PB | Formative evaluation was completed in 2012. Summative evaluation to be completed by FY 2014-2015. |
22. Volunteer Cooperation Program | PB | Formative evaluation was completed in 2012. Summative evaluation to be completed by FY 2013-2014. |
23. International Youth Internship Program (IYIP) | PB | Formative evaluation to be completed by FY 2012-2013. (in progress) |
24. Canadian International Food Security Research Fund (Phase 1) | PB | Summative evaluation to be completed by FY 2014-2015. |
25. Canada Fund for African Climate Resilience | PB | Summative evaluation to be completed by FY 2014-2015. |
26. Support to Aga Khan Foundation Canada’s initiatives entitled ‘Partnership for Advancing Human Development in Africa and Asia’ | PB | Formative evaluation to be completed by FY 2013-2014. Summative evaluation to be completed by FY 2014-2015. |
27. Canadian International Institution for Extractive Industries Development | PB | Institutional assessment to be completed by FY 2014-2015. Summative evaluation to be completed by FY 2017-2018. |
28. Canadian International Food Security Research Fund (Phase 2) | PB | Summative evaluation to be completed by FY 2017-2018 (NB: TB Submission currently with OMINE). |
Annex 1: Establishing the Evaluation Universe
In FY 2012-2013, the Evaluation Division revised its evaluation universe to take into consideration the results of the Deficit Reduction Action Plan (DRAP). The number of units in the evaluation universe is based on CIDA’s program activity architecture (PAA) (FY 2013-2014), the Performance Measurement Frameworks approved by Treasury Board Secretariat, and the commitments made in Treasury Board Submissions.
Based on these considerations and following extensive consultation across the Agency, including with the Multilateral and Global Programs Branch (MGPB) and Geographic Programs Branch (GPB) management, the Evaluation Division arrived at a tentative Evaluation Universe of 52 units for coverage during the next five (5) years:
- 5 evaluation units related to Fragile Countries and Crisis-Affected Communities;
- 8 evaluation units related to Low-Income Countries;
- 8 evaluation units related to Middle-Income Countries;
- 1 evaluation unit related to Regional Programming;
- 22 evaluation units related to Global Engagement and Strategic Policy;
- 5 evaluation units related to Canadian Engagement; and
- 3 specific commitments made in the Treasury Board submission.
Table 3 below identifies these evaluation units. Through consultation with the Treasury Board Secretariat, it was determined that there is no obligation to evaluate sunsetting programs.
In planning the sequence of evaluations, the Evaluation Division consults widely across the Agency to ensure that user and corporate needs are being met and that ongoing programs are being evaluated every five years, as per Treasury Board Secretariat requirements. Importantly, the Evaluation Division ensures that the timing of evaluations feeds into key decision-making processes, such as the replenishment of multilateral funds.
Evaluation Unit Name | Number of Evaluation Units | Evaluation Risk Assessment |
---|---|---|
Fragile States and crisis-affected communities | ||
Afghanistan | 1 | High |
Haiti | 1 | High |
Sudan | 1 | Substantial |
West Bank/Gaza | 1 | High |
Humanitarian Assistance | 1 | High |
Sub-total | 5 | |
Low-Income Countries | |
Bangladesh | 1 | Medium |
Ghana and Ethiopia | 1 | Medium |
Mali | 1 | Medium |
Pakistan | 1 | Medium |
Senegal | 1 | Medium |
Vietnam | 1 | Medium |
Tanzania and Mozambique | 1 | Substantial |
Benin, Burkina Faso, DR Congo, Kenya, Nigeria, Mongolia | 1 | Medium |
Sub-total | 8 | |
Middle-Income Countries | |
Bolivia | 1 | Low |
Caribbean Regional | 1 | Medium |
Colombia | 1 | Low |
Honduras | 1 | Medium |
Indonesia | 1 | Low |
Peru | 1 | Medium |
Ukraine | 1 | Low |
Cuba, Guatemala, Nicaragua, Philippines, Sri Lanka, Egypt, Jordan, Morocco, South Africa | 1 | Low |
Sub-total | 8 | |
Regional Programming in Middle- and Low-Income Countries | |
Inter-American Program, Pan-African Regional Program, Southeast Asia Regional Program | 1 | Substantial |
Sub-total | 1 | |
Global engagement and strategic policy: Multilateral Strategic Relationships | ||
Asian Development Bank | 1 | Medium |
African Development Bank | 1 | Medium |
Caribbean Development Bank | 1 | Low |
Inter-American Development Bank | 1 | Medium |
World Bank | 1 | Medium |
International Fund for Agricultural Development | 1 | Medium |
UNICEF | 1 | Medium |
UNDP | 1 | Low |
UNFPA | 1 | Low |
WHO | 1 | Medium |
UNAIDS | 1 | Low |
UNWOMEN | 1 | Low |
GFATM | 1 | Medium |
GEF | 1 | Low |
CGIAR | 1 | Low |
Commonwealth institutions | 1 | Low |
Francophonie institutions | 1 | Low |
UNHCR | 1 | Medium |
WFP | 1 | Medium |
Sub-total | 19 | |
Global engagement and strategic policy: Multilateral and Global Programming | ||
Health Programming | 1 | Medium |
Sectors/Themes other than Health | 1 | Medium |
Sub-total | 2 | |
Global engagement and strategic policy: International Development Policy | ||
Corporate Social Responsibility and other policy-related evaluations | 1 | Medium |
Sub-total | 1 | |
Canadian engagement for development | ||
Economic Growth and Environmental Sustainability | 1 | Substantial |
Governance | 1 | Substantial |
Human Development | 1 | Substantial |
Knowledge Creation and Sharing | 1 | Substantial |
Engaging Canadians | 1 | Substantial |
Sub-total | 5 | |
Treasury Board Submissions Commitments | ||
Muskoka thematic evaluation | 1 | Substantial |
Review of the Muskoka Initiative | 1 | Substantial |
Canada Investment Fund for Africa | 1 | Medium |
Sub-total | 3 | |
Total | 52 |
Annex 2: Evaluation Risk Assessment
The Treasury Board Secretariat requires that government departments and agencies determine the proposed five-year evaluation schedule based on identified evaluation needs and risks. Therefore, TBS expects departments to undertake an evaluation risk-analysis exercise.Footnote 19
CIDA’s evaluation risk assessment is used for the following purposes:
- inform the allocation of financial and human resources for implementing each evaluation;
- inform the evaluation approach/methodology for each evaluation; and,
- inform the sequencing of evaluations to ensure an appropriate distribution of evaluation risks across the five-year plan.
CIDA’s evaluation risk framework (Table 4) was developed according to the Treasury Board Secretariat (TBS) guidelines.Footnote 20 Four criteria are included in the evaluation risk framework, namely reputational risk (political sensitivity), materiality, complexity, and logistical challenges (context).Footnote 21 Each of these criteria has a direct impact on the successful completion and dissemination of evaluations. The four criteria were weighted according to the rank of the associated corporate risk in CIDA’s Corporate Risk Profile.Footnote 22
The evaluation risk framework below (Table 4) describes the four evaluation risk criteria, their definitions, and the method of assessment.
Evaluation Risk Criteria | Definition | Assessment Method |
---|---|---|
Footnote 23: The Multidimensional Poverty Index (MPI) ranges from 0 to 1. Alkire, S., J.M. Roche, M.E. Santos and S. Seth (November 2011), ophi.qeh.ox.ac.uk | |
Reputational Risk | The potential of a particular evaluation to affect CIDA’s reputation and the confidence of stakeholders in CIDA’s ability to fulfill its mandate. | The Evaluation Division assessed political sensitivity according to programmatic priorities and stakeholder interest, rating each evaluation unit from low to high (see Table 5). |
Materiality | The amount of CIDA’s financial investments in a particular program. | Materiality was assessed using FY 2011-2012 year-end program expenditures, as provided by CFOB. |
Complexity | The difficulty of conducting an evaluation, related to the delivery modes used in the program and the availability of associated performance management data for evaluation purposes. | Complexity was assessed using the best judgment of the Evaluation Division based on delivery modes (including humanitarian assistance), number of program units, and availability of performance management data. |
Logistical Challenges | The general conditions within the evaluation context that may delay or otherwise impede the completion of the evaluation, including the socio-political situation, local infrastructure, security, conflict, and natural disasters. | Context risk was assessed using the Multidimensional Poverty Index (MPI),Footnote 23 with fragile and conflict-affected states rated as high risk (see Table 5). |
An evaluation risk criteria rating chart (Table 5) was also developed, as recommended by the Treasury Board Secretariat, to ensure consistent definitions of risk levels for each criterion. The evaluation risk criteria were matched with corresponding corporate risk(s) from CIDA’s Corporate Risk Profile. The evaluation risk criteria were weighted according to the rank of the associated corporate risk(s).
Evaluation Risk Criteria | Corresponding Corporate Risk(s) | Rank in Corporate Risk Profile | Weight | Low Risk (1) | Medium Risk (2) | Substantial Risk (3) | High Risk (4) |
---|---|---|---|---|---|---|---|
Footnote 24: Context risk assessments are based on the Multidimensional Poverty Index (MPI), which ranges from 0 to 1. Alkire, S., J.M. Roche, M.E. Santos and S. Seth (November 2011), ophi.qeh.ox.ac.uk |||||||
Reputational Risk | Reputation | 2 | 30% | Low political sensitivity associated with program failure | Medium political sensitivity associated with program failure | Substantial political sensitivity associated with program failure | High political sensitivity associated with program failure |
Materiality | Funding | 1 | 30% | Low level of financial investment (Less than $20M/year) | Medium level of financial investment ($20.1M-40M/year) | Substantial level of financial investment ($40.1M-80M/year) | High level of financial investment (Over $80.1M/year) |
Complexity | Modality, Perf.Mgnt. | 11, 7 | 20% | Low degree of program complexity (delivery channels, modality) | Medium degree of program complexity (delivery channels, modality) | Substantial degree of program complexity (delivery channels, modality) | High degree of program complexity (delivery channels, modality) |
Logistical ChallengesFootnote 24 | Socio-Political- GE, Institutional Capacity, Natural disasters | 8, 6, 5 | 20% | Context presents low risk (MPI 0.0-0.25) | Context presents medium risk (MPI 0.26-0.50) | Context presents substantial risk (MPI 0.51-0.75) | Context presents high risk (MPI 0.76-1.0 or Fragile and Conflict Affected State)
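To make the context banding in Table 5 concrete, the short Python sketch below maps an MPI value to a logistical challenges score of 1 (low) to 4 (high). This is a minimal illustration assuming only the bands shown above; the function name and the sample MPI values are hypothetical.

```python
# Minimal sketch, assuming the MPI bands in Table 5: scores run from
# 1 (low) to 4 (high); fragile and conflict-affected states score 4.

def context_risk_score(mpi, fragile_or_conflict_affected=False):
    """Map an MPI value (0.0-1.0) to a logistical challenges score."""
    if fragile_or_conflict_affected:
        return 4  # high risk regardless of MPI
    if mpi <= 0.25:
        return 1  # low
    if mpi <= 0.50:
        return 2  # medium
    if mpi <= 0.75:
        return 3  # substantial
    return 4      # high

# Hypothetical examples:
print(context_risk_score(0.32))                                      # 2 (medium)
print(context_risk_score(0.10, fragile_or_conflict_affected=True))   # 4 (high)
```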
As described in Table 5, each criterion was scored from low (1) to high (4). The scores from each risk category were weighted according to the ranking of the associated corporate risk in the Corporate Risk Profile, and a final weighted average was calculated for each evaluation unit. These weighted averages were then assessed against the Evaluation Risk Scale (Table 6) to determine the overall evaluation risk rating (a worked sketch of this calculation follows Table 6):
Evaluation Risk Scale
Low Risk: (0.0 - 1.9)
Medium Risk: (2.0 - 2.9)
Substantial Risk: (3.0 - 3.6)
High Risk: (3.7 - 4.0)
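The full calculation can be sketched as follows, assuming the weights in Table 5 and the thresholds in Table 6. The criterion scores in the example are hypothetical, and the small gaps between the published bands (for example, between 1.9 and 2.0) are resolved upward here as an assumption.

```python
# Minimal sketch of the overall evaluation risk rating, assuming the
# weights in Table 5 and the Evaluation Risk Scale in Table 6.

WEIGHTS = {
    "reputational_risk": 0.30,
    "materiality": 0.30,
    "complexity": 0.20,
    "logistical_challenges": 0.20,
}

def overall_risk_rating(scores):
    """Weighted average of criterion scores (1-4) mapped to a risk rating."""
    weighted_average = sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)
    if weighted_average <= 1.9:
        rating = "Low"
    elif weighted_average <= 2.9:
        rating = "Medium"
    elif weighted_average <= 3.6:
        rating = "Substantial"
    else:
        rating = "High"
    return weighted_average, rating

# Hypothetical evaluation unit: high reputational risk and materiality (4),
# substantial complexity (3), and high logistical challenges (4).
example = {"reputational_risk": 4, "materiality": 4,
           "complexity": 3, "logistical_challenges": 4}
average, rating = overall_risk_rating(example)
print(f"Weighted average: {average:.2f} -> {rating} risk")
# Weighted average: 3.80 -> High risk
```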
Using the methods outlined above, each evaluation unit was assessed against the four criteria (reputational risk, materiality, complexity, and logistical challenges).