Job Title: EMOS Learning Review Consultancy
Organization: QED Group, LLC
Project Name: USAID Environmental Management for the Oil Sector (EMOS) Project
Duty Station: Kampala, Uganda
About QED:
The QED Group, LLC is a full-service international development firm that provides practical solutions to social problems through sound analysis, proven management techniques, and creative implementation. We focus our efforts on two core practice areas: Monitoring & Evaluation and Knowledge Management. We work with U.S. agencies and overseas governments, international donors, private-sector clients, and PVOs/NGOs in more than 80 countries around the world. Key clients include USAID, the U.S. Department of State, the Centers for Disease Control and Prevention, the U.S. Department of Agriculture, and the U.S. Trade and Development Agency.
About EMOS Learning Review:
Environmental Management for the Oil Sector (EMOS), funded by the U.S. Agency for International Development, was developed to assist in building capacity to address the environmental threats emanating from development of an oil industry in Uganda’s biodiverse Albertine Rift. The overall aim of the project is to improve Ugandan understanding of oil and gas development and prepare the country to address, and also prevent, damaging impacts on biodiversity and ecosystems.
Through EMOS, the contractor, Tetra Tech, is working toward three goals:
- Strengthen capacity of Government of Uganda institutions to manage the environmental impacts of the oil and gas sector
- Develop academic programs to manage and respond to the environmental impacts of the oil and gas sector
- Increase knowledge of the oil and gas sector for Ugandan civil society to participate in decision-making processes
In April 2016, USAID/Uganda worked closely with Measuring Impact to facilitate a 3-day workshop to design a learning review for the EMOS Activity. The workshop involved (1) reviewing and updating EMOS’ results chains, first developed at the initiation of the Activity in 2014; (2) using the results chains to draft specific questions for testing assumptions; and (3) defining information needs, indicators, and units of analysis for each learning question. The workshop produced a set of focus questions within the three result areas; from this initial set, the learning agenda for the EMOS learning review was drawn.
Three months later (July 2016), the USAID/Uganda Natural Resource Management team seeks to undertake a learning review of the EMOS Activity, with specific attention given to three identified learning questions. These questions are:
1. What, if any, barriers are preventing EMOS-supported academic programs from obtaining national accreditation?
2. To what extent (or how) are personnel trained in TEV using those skills to develop management plans and conduct ESIAs in the Albertine Rift?
3. To what extent (or how) do EPI and DLGs access data from the National Biodiversity Data Bank and use the data to develop management plans?
The prioritization of these questions maps closely with discussions and decisions made during the design of the learning review. Each question was selected based on the extent to which data and data sources are currently available to answer it, its potential to address (or accelerate) implementation and the achievement of results, its broader implications for programming within the sector, and key considerations for M&E and CLA efforts by EMOS.
Purpose of the Learning Review: The purpose of this learning review is to provide information (learning) and recommendations to USAID/Uganda’s Natural Resource Management team and the Implementing Partner (IP) in support of program planning and management for the remaining life of the EMOS Activity. Additionally, addressing the questions posed will help to advance the documentation of EMOS’ implementation experience to date, capturing contextual shifts, collaboration experience, decisions taken, and adaptations made. The learning review will emphasize the ‘how’ and ‘why’ around the core focus/interest areas identified, as the exercise will help to test critical assumptions and foster a better understanding of the Activity’s theory of change, plus the necessary and sufficient conditions required for successful USAID programming in the oil and gas (energy) sector in Uganda.
Approach and Methodology: Per the learning review design matrix, this exercise will employ document review, key informant interviews, site/location visits where possible, and a stakeholders’ dialogue (workshop). Use of a short survey instrument may be considered. Additional tools such as stakeholder mapping, visual problem structuring and analysis, or data use maps shall be incorporated as needed. The overall aim is to undertake a review and learning approach that fosters shared issue analysis and recommendation setting for USAID, Tetra Tech, and sector stakeholders.
Period of Performance and Deliverable(s): The learning review is expected to help guide EMOS’ upcoming efforts for annual work plan development and the refinement of its activities around capacity development in the oil and gas sector for the next 18 months. Therefore, a completion timeline of early September for this learning review is ideal, with a level of effort and key milestones outlined below.
Illustrative Level of Effort in Days
No. Activity (Duration)
1. Desk Review* (5 Days)
2. Interviews, stakeholder meetings, and site visits (as needed)* (10 Days)
3. Findings presentation, report writing and finalization (5 Days)
Total LOE not to exceed 20 days
*Can happen concurrently
Key Milestones
USAID/Uganda approval of the SOW (and Consultant) by August 19, 2016
Inception Report by August 29, 2016
First draft of the report by September 14, 2016
Debrief/presentation by September 16, 2016
Final report and completed AAR by September 23, 2016
Presentation of the findings from this learning review shall take the form of an interactive reflection, synthesis, and discussion with the team(s) at USAID/Uganda and with the Implementing Partner (as appropriate). The presentation is not expected to take the form of a lecture or to rely primarily on PowerPoint slides; it should instead be an interactive, collaborative working session in which a reflection on the process of conducting the learning review (i.e., getting to answers), what has been learned, and the recommendations and their implications are unpacked, honed, and reviewed for significance and coherence. This session is anticipated to last up to 90 minutes, with 40 minutes of presentation, 30 minutes of question and answer, and a final 20 minutes to confirm next steps toward use of the findings and finalization. A draft report from the Learning Review shall be circulated no later than 3 working days in advance of this session.
The final report from the Learning Review shall not exceed 25 pages (excluding annexes of data analyses, highlights and/or syntheses of notes from interviews/group discussions, and other ‘evidence’). A 2-page briefing document highlighting core/critical findings, conclusions, and recommendations shall be produced once feedback on the draft final report is consolidated. This 2-pager will accompany the full final report as the main deliverable of this learning review.
Quality Assurance: Quality assurance efforts as part of this review will include rigorous documentation of interviews and the consistent use of templates for qualitative data collection and organization. A ‘getting to answers’ matrix will map out how analytical questions and triangulation processes will guide the generation of findings from the learning review questions. A structured framework for translating findings into conclusions and then into recommendations will be used so that the logic, flow, and interpretation of evidence are clear.
At least three (once-weekly) debriefs and synthesis sessions among the learning review team will take place across the review period to (1) provide a check-in on progress, and (2) allow time to discuss and make any adjustments or modifications needed to the methodology, analytical approach, or execution timeline. Internal (Learning Contract) peer review and editing support will be used to ensure completeness, coherence, clarity, and professional packaging of the final deliverables. A mid-point check-in and an internal end-line debrief (at the completion of report drafting) with the USAID Point(s) of Contact will yield active feedback and early confirmation that learning review products are in line with expectations. USAID’s standard evaluation report checklist will guide the structure and presentation of the final report.
Target Audiences: The primary audience for wider sharing of the lessons and recommendations includes the Natural Resource/Environmental Management team at USAID/Uganda, as well as the Program Office and the Monitoring, Evaluation and Learning Specialists of the various technical teams. Other technical offices at the Mission can also benefit from a joint review of the methodology and approach used to conduct this learning review. Stakeholders/collaborators in the oil and gas sector in Uganda can make use of the findings and recommendations in their policy development processes and capacity building plans. USAID/Uganda implementing partners may also find the process of conducting the learning review helpful as they develop collaborating, learning and adapting approaches as a more formal part of their program management efforts.
Qualifications, Skills and Experience: A single qualified Consultant is proposed to conduct this review. It is anticipated that the selected consultant shall have:
- An advanced (Master’s) degree or higher in environmental sciences, public administration or other field related to development studies with a focus on capacity building approaches
- A minimum of ten years’ proven experience conducting program assessments, evaluations and/or action research
- Proven skills in quantitative and qualitative data collection and analysis techniques, including strong synthesis, and writing and presentation capabilities
- An appreciation for, and the ability to incorporate, reviews of data quality and improvements to monitoring & evaluation processes as part of assessments/evaluations/research
- Proven familiarity with approaches to constructing and reviewing theories of change, logic models and causal pathways
How to Apply:
All suitably qualified and interested candidates who meet the above criteria should send their CV, cover letter, and Bio Data Form (available for download) expressing interest to thelearningcontract.recruit@gmail.com. Please indicate the title of expertise in the subject line.
Deadline: 10th August 2016 by 5:00 PM