Consultancy: Evaluation of Development Account Project

Negotiable / year

JOB DETAIL

Result of Service
The primary tasks of the consultant are the following:
- Desk review of key reference documents related to the project
- Development of a full methodology, which can include developing/refining key evaluation questions, identification of indicators and measurements, etc.
- Inception report
- Development of initial results and recommendations, and briefing of the project team and management
- Engagement with project staff, participants and beneficiaries
- Drafting of the evaluation report

The consultant should explore human rights, disability and gender in the design, data collection and analysis, and present relevant data wherever available and meaningful. The evaluation will also review these issues through a specific evaluation question: “What, if any, tangible results have been achieved through the integration of mainstreaming issues into work under the project?” This approach may be further developed and refined by the consultant in the development of the methodology.

Expected outputs and delivery dates
1. Development of methodology/inception report, revised with the project team, and annotated outline of the report: 15 July 2024 (see Annex 2 for a suggested outline)
2. Completion of interviews and data collection: 1 September 2024
3. Draft report: 1 October 2024
4. Final report: 1 November 2024

Performance Indicators
Compliance with the terms of reference, including timeliness and quality of the deliverables, as assessed by DESA/SD:
• Quality of the consultation and data collection process;
• Clarity of presentation of the evaluation report and recommendations;
• Usefulness of the evaluation process;
• Receptiveness/responsiveness to feedback.

The evaluation will be conducted in accordance with the principles outlined in the UNEG ‘Ethical Guidelines for Evaluation’ (http://www.unevaluation.org/document/detail/102). Evaluators should demonstrate independence, impartiality, credibility, honesty, integrity and accountability to avoid any bias in their evaluation. Evaluators must address ethical issues in the design and implementation of the evaluation, including procedures to safeguard the rights and confidentiality of information providers. The evaluator will follow the standard Code of Conduct, which should be carefully read and signed.
Work Location
Remote
Expected duration
1 June – 1 November 2024
Duties and Responsibilities
There is an urgent need to strengthen the capacity of national statistical systems to leverage the use of administrative data for statistical purposes, in order to fill gaps in the data available to policy and decision makers to monitor progress and implement the 2030 Agenda, and to address emerging challenges such as the COVID-19 crisis. When good-quality administrative systems are in place and their information is regularly updated, they can reliably and continuously provide a full picture of key aspects of a country’s population or economy. Re-use of data collected by government agencies and ministries in the course of their regular work often reduces the response burden on citizens and businesses and makes the statistical system more resilient to disruptions created by natural disasters and other emergencies. Data from existing administrative registers are, however, not primarily collected for the purpose of producing official statistics, but rather to fulfill other functions, such as legal or regulatory compliance, delivery of government services, etc. Many countries are still struggling with the lack of infrastructure and of technical and institutional arrangements for the efficient exchange and processing of administrative data and metadata for the production of official statistics.

The UN Statistics Division (UNSD) of the Department of Economic and Social Affairs (DESA) therefore initiated a project that aims to address the challenges currently faced by many national statistical systems, linking up with other ongoing initiatives such as the Collaborative on use of administrative data for statistical purposes. It takes a cross-cutting approach and supports 8 countries in their work to improve and increase the use of administrative data sources for statistical purposes, with a different area of focus in each country. The goal is to help improve the utilization of administrative data sources for the implementation of the 2030 Agenda and for monitoring of the Sustainable Development Goals, which would be demonstrated by participating countries having improved collaboration among government entities, developed proposals for improved legal frameworks and established new processes for data sharing and quality assurance. The learnings from countries have been, and will continue to be, used to inform other countries and to share experiences across them. The general information is also being transformed into an e-learning course and general guidance materials that will aim at complementing the more thematically focused materials and tools already available.

The 8 project countries are:
– Africa: Cameroon, Namibia and Tanzania
– Asia and the Pacific: Bhutan, Maldives and Sri Lanka
– Latin America: Chile and Dominican Republic

The countries were selected based on the following criteria:
1. Demonstrated interest and ongoing work on inter-governmental collaboration and sharing of administrative data to help increase the availability, timeliness and quality of SDG indicators.
2. An interest in working on an area that is also useful in a wider context linked to the use of administrative data for statistical purposes, so that the countries can function as practical case studies and serve as examples for other countries.
3. Regional representation: as countries tend to compare themselves with countries in their own region, three regions have been included: Africa, Asia and the Pacific, and Latin America.
4. An overall assessment of potential options, with an aim to cover different thematic areas (e.g., health, population, business, environment) and cross-cutting challenges (e.g., legal frameworks and trust, harmonization of standards, technical interoperability).

Funded under the 13th tranche of the Development Account (DA13) with a budget of 624,000 USD, the project started in March 2021 and will be operationally completed in December 2024, with main activities implemented by June 2024.

The evaluation will be conducted for the purposes of accountability and learning. For accountability, it will provide independent evidence regarding the efficiency and effectiveness of DESA’s statistical capacity-building work. For learning, the evaluation will aim to inform management decisions on future priorities in this field, as well as on different approaches, including joint work with UN partners, optimizing strategies for enhancing developmental outcomes. It will rigorously investigate the successes and challenges encountered, aiming to enrich the decision-making process and thereby enhance the efficacy and impact of future projects.

The primary audiences of the evaluation are the DESA Capacity Development and Programme Management Office and staff of the Statistics Division. Secondary audiences include staff of other DESA Divisions as well as other key partners in the project. The evaluation will be shared with the project stakeholders (CDPMO, DA team, etc.), including beneficiary countries, upon their request.

The main purpose of this evaluation is to provide an independent assessment of the achievements of the project, through an analysis of the relevance, effectiveness, efficiency, sustainability and orientation to impact of the project. The evaluation will assess the extent to which the project’s outcomes were effectively and efficiently achieved, and the relevance of the project’s contributions.

Effectiveness: Evaluate the project’s achievements, taking into account the indicators of achievement provided in the project document, and provide an indication of whether the project is likely to have lasting impacts on the intended beneficiaries. Analyze the implementation strategies of the project with regard to their potential effectiveness in achieving the project outcomes and impacts, including unexpected results and factors affecting project implementation (positively and negatively).

Efficiency: Assess the overall value of the project activities and outcomes in relation to the resources expended, including, if possible, the value added by additional resources or substantive contributions, i.e., those beyond the original project’s budget or work plan.

Relevance: Assess the relevance and coherence of the project’s design with regard to country needs and how the project is perceived and valued by the target groups. Ascertain the significance of the contributions made by the project to beneficiary country individuals, institutions and other key stakeholders. This component should include an assessment of the quantity, quality and usefulness of the activities and outputs.

Sustainability: Assess the extent to which the benefits/results/activities will continue after the project has come to an end, from the perspective of beneficiary country individuals, institutions and other key stakeholders.

Gender and human rights perspectives: Examine to what extent gender and human rights issues have been addressed.

Coherence: Examine the project’s complementarity and coordination with other relevant interventions.

Impact: Assess to what extent the intervention has generated, or is expected to generate, significant positive or negative, intended or unintended, higher-level effects.

Furthermore, the evaluation will identify lessons learned, good practices and recommendations for the key stakeholders to improve implementation of project activities in general.

Work assignment
This evaluation will be conducted as an independent exercise, based on documentation related to the project and on online communication, including interviews and e-mails, with key individuals from the UN implementing organizations, the beneficiary countries and project stakeholders, who are expected to provide information, opinions and assessments to the consultant (henceforth, the “Evaluator”) upon request. The evaluation will be undertaken from 1 June to 1 November 2024 (part time). The Evaluator will liaise with the DESA Statistics Division (DESA/SD) and the DESA Capacity Development Programme Management Office (DESA/CDPMO) for logistics and administrative issues, while conducting the evaluation independently. The draft report prepared by the Evaluator will be delivered to DESA/SD, which will also share it with CDPMO for comments. All comments on the draft report will be compiled by DESA/SD and transmitted to the Evaluator with suggestions for additions or modifications.

The evaluation will include:

A desk review of project documents including, but not limited to:
(a) The project document, reports and other outputs produced by the project, activity reports, financial reports of DESA/CDPMO, progress reports, and selected relevant correspondence (see Annex 1 for a detailed list of documents to be reviewed);
(b) Other project-related material produced by the project staff, partners, or beneficiary country counterparts.

Interviews with key individuals from the UN implementing organizations, the beneficiary countries and other project stakeholders. These meetings will be conducted online. In addition, the Evaluator will have the possibility to join project meetings, including experience exchange meetings, where he/she can also engage and ask questions. The Evaluator shall determine whether to seek additional information and opinions from other persons connected to the implementation of the project.

The following evaluation questions have been identified at this stage of the evaluation. The Evaluator should identify which questions will be reviewed in the inception report. The questions below will be assessed considering the objective, indicators of achievement, planned activities and outputs as set forth in the project document. The evaluation will focus on the following main questions:
1. Did the project strengthen national capacities in the project countries in establishing and implementing effective collaboration between agencies holding administrative data and the national statistical office, potentially leading to increased evidence-based policy formulation, monitoring and evaluation at the national level?
2. Did the project identify and make recommendations about the key entry points, during the duration of the project, to impact relevant social policy and programme development and implementation?
3. Did the project strengthen the national capacity of National Statistical Offices and other agencies of the National Statistical Systems to increase the use of data collected for administrative purposes in official statistics production and dissemination, particularly for SDG indicators and assessment of the impact of COVID-19 on society?
4. Did the project effectively ensure the participation of country representatives in project activities?
5. Did the project promote South-South cooperation to share knowledge and experiences?
6. Did the project strengthen intra-governmental collaboration with a focus on data sharing to increase the availability, quality and timeliness of disaggregated data for SDG indicators?
7. Did the project increase the availability of disaggregated SDG indicators?

Effectiveness:
1. What are the achievements of the overall project objectives/outcomes?
2. Is the monitoring and evaluation system results-based, and does it facilitate adaptive management of the project?
3. How have contextual and institutional risks, as well as positive factors external to the project, been managed by the project management?

Efficiency of resource use and coherence:
1. Have resources (financial, human, technical support, etc.) been allocated strategically to achieve the project outputs and outcomes?
2. How well coordinated were the implementing entities in carrying out joint activities at the project level?

Impact orientation and sustainability:
1. To what extent have targeted countries been able to make use of knowledge products/tools to improve their work and enhance results?
2. Which project-supported tools have been institutionalized by partners, or have the potential to be, and/or replicated by external organizations?
3. Is the project contributing to expanding the knowledge base and building evidence regarding the project outcomes and impacts?
4. How can aspects of the project that proved successful be scaled up and replicated after the project ends?

Gender and human rights perspectives:
1. To what extent were gender and human rights perspectives integrated into the design and implementation of the project?
2. How can gender and human rights perspectives be better included in the design and implementation of future projects?
3. To what extent did the project promote gender equality and nondiscrimination?

The methodology of the review will be determined by the consultant, in cooperation with DESA/SD. The methodology should provide robust evidence to support analysis that responds to the evaluation questions and sub-questions elaborated above. It should provide the framework for analysis (e.g., using a theory of change), define the indicators and data to be used for assessment (in relation to the criteria), the data collection and processing methods, and the analytical tools (e.g., statistical analysis). In order to use the strongest evidence available and maximize the credibility of the analyses, it is recommended to draw on a wide range of data sources that can be triangulated with each other. Given the nature of the project, which focused on capacity building and intra-governmental collaboration, the following methods should be considered in the evaluation:
- Document review of work processes, outputs, documents, job descriptions, partnership agreements, previous evaluation results, strategies, meeting minutes and work plans.
- Direct observation of relevant virtual meetings, processes and experience exchanges.
- Semi-structured or structured interviews with staff, internal and external partners, Member State representatives, beneficiaries and other stakeholders.
- Secondary analysis of monitoring and programme data, including performance, financial and other data available.
- Content analysis and expert review of key activities and/or outputs.
- Case studies of project outcomes, to identify positive factors that should be continued in future work and negative factors that should be avoided.
Qualifications/special skills
- An advanced university degree (Master’s degree or equivalent) in statistics, economics, development studies, international relations, public administration, or a related field is desirable. A Bachelor’s degree in combination with two additional years of qualifying work experience may be accepted in lieu of an advanced degree.
- A minimum of seven (7) years of progressively responsible experience in the social sciences, including statistics or economics.
- Experience in project or programme evaluation.
- Good analytical, writing and interpersonal communication skills.
- Solid knowledge of statistical production, ideally with a particular focus on the use of administrative data and quality management.
- Knowledge of the Sustainable Development Goals and national efforts to implement them.
Languages
Fluency in oral and written English is required. Knowledge of French or Spanish is desirable.
No Fee
THE UNITED NATIONS DOES NOT CHARGE A FEE AT ANY STAGE OF THE RECRUITMENT PROCESS (APPLICATION, INTERVIEW MEETING, PROCESSING, OR TRAINING). THE UNITED NATIONS DOES NOT CONCERN ITSELF WITH INFORMATION ON APPLICANTS’ BANK ACCOUNTS.
New York, United States