NGO Jobs in Kenya - World Vision

Position: Regional Monitoring & Capacity Building Coordinator

Location: Karen, Nairobi
Responsibilities
  • Planning in the Programme Cycle, 15%
  • Facilitate the development of a programme design that builds partner capacity in a way that ensures sustainability. Make recommendations for log frame development based on relevant information. Support programme managers in developing the logic of the intervention, as required
  • Facilitate a participatory review of logical flow and consistency within the hierarchy of objectives and assumptions. Contextualise indicators drawing on community conversations as appropriate
  • All APs under the Region have effective design documents completed on time
  • Identify key areas for monitoring according to the log frame. Identify and contextualize methodologies for appropriately measuring indicators using participatory methods where possible
  • Functional monitoring framework, M&E Plans and tools in place
  • Work proactively with key staff and stakeholders to develop a detailed monitoring plan for the technical programme, ensuring clarity of roles and responsibilities
  • Monitoring plans put in place and in use; evidenced by schedules and reports
  • Share monitoring and evaluation plans with key stakeholders and incorporate their feedback as appropriate. Seek approval for monitoring and evaluation plan from program design team lead as necessary
  • Approved M&E Plans in use (showing the contribution of key stakeholders)
  • Support planning for effective Surveys (Evaluations and Baselines) for APs transitioning or entering new phases within the Region
  • 100% execution of all the planned surveys and assessments
  • Support Area programs to plan and coordinate M&E activities
  • AP-specific M&E Plans well adhered to
  • Lead the process of annual planning and budgeting by all the APs and Grants depending on their implementation calendar and phases
  • 100% timely completion and submission of the Annual Plans for Grants meeting donor requirements
  • Technical Support to all DME and Capacity Building (CB) in the Region, 20%
  • Provide support to AP teams and partners in data processing, consolidation and analysis across the local programming areas.
  • Technically sound and costed AP processes and Monitoring Plans in place
  • Provide oversight on scheduling and budgeting for monitoring processes according to the needs of the technical programmes (TPs).
  • Support Programmes/Projects to ensure real-time monitoring and appropriate data storage for timely utilization by staff and partners to show progress and issues at Area Program, regional and national level.
  • Local programming data/information readily available for reference in decision making and programme evaluation/surveys
  • Work closely with the area programme managers and teams to co-design technical programme M&E plans that are relevant to the local programming context.
  • TP M&E Plans in place and in use
  • Provide constructive/technical support to area programme managers on monitoring data generation to ensure that information produced is relevant and useful to the different stakeholders.
  • Up-to-date and retrievable monitoring data in the M&E system
  • Share lessons learned from technical programmes with area programmes to enhance local planning and implementation. Suggest modifications that might be necessary at the local level based on evidence and lessons learned
  • Consolidated lessons from review of monitoring data documented and used in improving TP programming
  • Support the establishment and ensure use of quality monitoring systems and processes within the Region
  • Established and usable monitoring systems in place
  • Visit field offices to monitor GIS operationalization in programs/projects and collect data
  • GIS field visit report with recommendations and actions
  • Quality Assurance and Management through the Programming cycle, 20%
  • Ensure the sustained implementation of all DME-related processes including LEAP/HORIZON, STEP, GIS/WVDPA/HAP/SPHERE at the Area Program and Regional operational levels;
  • Up-to-date monitoring databases (within the portfolio of projects) and in use in reporting and decision making
  • Ensure that DM&E-related activities – assessment, design, monitoring, baselines, evaluations, transition, and documentation – are successfully implemented as per standards, disseminated to staff and used to inform future M&E processes and decision making.
  • DM&E key components and processes meet the quality standards of practice
  • In relation to the AP Designs: Verify that collaboration with partners is appropriate to their capacity. Ascertain that the specific needs of the most vulnerable children are addressed. Verify the inclusion of collaboratively developed sustainability and transition plans with associated agreements with partners
  • Design documents meet the LEAP and relevant donor-specific standards
  • Ensure a harmonised, timely and comprehensive routine and non-routine monitoring system for quality data as per the Technical Programme needs at the county and regional level
  • Joint/Integrated Quality monitoring completed and reported
  • Review Area Program monitoring data to ensure that it is accurate, relevant and contributes to national office strategy and CWB.
  • Up to date monitoring system (Horizon)
  • Ensure real time mapping of projects using GIS in the area programs
  • Up to date mapping of all project assets and profile
  • Develop periodic data verification and data quality improvement plans for the region
  • Functional data verification plan in place that is aligned to the National Data Quality Management guidelines
  • Ensure that GIS business processes, standards and policies developed in coordination with other WVK entities are effectively and consistently used by operations staff
  • GIS mapping adheres to the recommended processes and meets the standards and policies of practice
  • Ensure thorough data cleaning and processing prior to analysis and that descriptive and appropriate statistical analysis/tests are applied according to the analysis plan/indicator detail sheets, with support from NO DME specialists and consultancy teams.
  • All progress and impact measurements are completed using appropriate statistical analysis and plans
  • Ensure that appropriate methods are used for sorting and analysing qualitative data, according to the analysis plan/indicator detail sheets, with support from the NO M&E specialists or consultancy teams.
  • Analysis summaries and appropriate factsheets are generated and shared with relevant stakeholders
  • Work with the Area Program manager to ensure active critical reflection and interpretation of the findings in context at the community and regional level.
  • Survey report meetings completed on time and disseminated to relevant stakeholders
  • Final report filed in appropriate repositories
  • Reporting of Progress and Performance, 10%
  • Facilitate/socialize Area Program teams’ understanding of requirements for a management report, according to LEAP/WV procedures. Guide the teams in the efficient use of the data management system to access relevant and necessary data for report generation
  • 100% completion and timely submission of the AP Management Report as per guidelines and schedules
  • Review the analysed and interpreted data from across the area programmes in the region to collate learning and use the monitoring data developed during the reporting period as the basis for the programme report. Facilitate review sessions of participatory monitoring data to identify key messaging for the report
  • Documented learning and monitoring data and reports used in reporting
  • Coordinate collection of feedback from technical report reviewers to Area Program managers. Check the reports for accuracy and quality to ensure they meet LEAP reporting requirements. Provide technical support to program managers on uploading reports to Horizon
  • All the AP and Grants reports in the Region are reviewed and feedback provided
  • Support documentation of special donor reports or documentation of best practices or success stories
  • Well-documented reports meeting donor and standard reporting requirements
  • Lead the process of semi-annual and annual reporting by all the APs and Grants depending on their implementation calendar and phases
  • 100% timely completion and submission of the Semi/Annual Reports meeting donor requirements
  • Capacity Building, 10%
  • Support implementing staff to have required capacity (skills, knowledge & attitude) in DM&E processes and utilize it for enhanced programming.
  • Staff engaged in training, mentoring, coaching and practical field experience.
  • Promote an enhanced culture of learning, innovation and practice of LEAP/DME discipline and Empowered World View.
  • Learning schedule adhered to and reports documented
  • Lead in the implementation of new program effectiveness initiatives to staff within the region e.g. Horizon Wave 2, WVDPA, EWV
  • 100% roll out of the new initiatives as per standards and schedules
  • Support building the capacity of implementing staff to integrate the Programme Accountability Framework within Area Programs
  • All targeted staff trained on PAF and follow-up reports on application completed
  • Support Local Institutions capacity building initiatives
  • Institutional assessment reports in place
  • Up to-date guidelines and procedures
  • Advisory support provided to Regional M&E teams on capacity building of local institutions.
  • Support Programs/Projects sustainability and transition (S&T) planning and implementation
  • Programs/Projects S&T plans in place and being implemented.
  • Research, Surveys (Evaluation/baselines) and documentation, 15%
  • Participate actively in scoping discussions with key stakeholders, in an analytical and well-informed way to inform program evaluation TOR. Build capacity of field level teams in understanding the relevant information for the basic information sheet.
  • Approved TORs and schedules for surveys meeting the DME Standards
  • Support the Area Program teams in logistics planning for the training of data collectors and the data collection in the field in accordance with the ToR. Critically analyse and approve plans as necessary to respond to field reality, in collaboration with the evaluation lead and/or Area Program manager
  • Sound Methodology and tools applied in the scheduled surveys/measurements and other studies
  • Coordinate the review and provision of feedback on standard tools in consultation with the DME Coordinator (Research and Documentation) and field staff/partners.
  • Participate in the field testing of tools to ensure the evaluation lead has adopted a technically sound methodology to ensure accuracy in data collection. Provide critical review of, and approve, the final tools to be used for the evaluation after feedback has been received
  • Co-facilitate training for data collectors and data entry clerks, in a way that supports effective learning. In collaboration with the Area Program manager, support the smooth running of the data collection process. Supervise the consultant and data entry clerks during data collection and data entry, with due care and attention to ensure maximum data quality
  • Data collection meets acceptable standard/agreed on and documented methodology and quality assurance procedures
  • Support AP storage of collected data securely as per policies relating to data protection for all programs at the regional level. Deal with and/or escalate technical issues and problems proactively as they arise, adapting plans as appropriate. Participate actively in reflection sessions and provide constructive critical feedback as necessary
  • Data storage and retrieval meeting the policy standards
  • Support data analysis, interpretation and validation with relevant stakeholders according to the indicator analysis plan
  • Robust/rigorous analysis according to the indicator detail sheets – Analysis Summary
  • Participate in feedback processes, with community, staff and partners in a context-sensitive way. Co-facilitate reflection and learning with staff and partners to refine interpretation of the findings. Review and provide considered feedback on the baseline or evaluation report, as requested
  • Well documented reflection and learning report on the surveys/study findings
  • Provide input for products appropriate for different audiences to share the findings and recommendations, as requested. Participate in reflection with DME team to review and refine learnings about the evaluation process
  • Documented evidence of feedback/input provided on the various DME products
  • Contribute to identification, protocol setting and monitoring of action research integrated into Technical Programmes or their models
  • Up-to-date implementation of action research reflected in regular tracking reports
  • Strategic Coordination and collaboration, 5%
  • Establish and maintain links with external organizations for collaboration, networking, resource sharing, materials development, and learning activities in DM&E
  • Evidence of functional partnerships with learning institutions on potential research areas for evidence-based programming, sustained with running agreements/MoUs
  • Ensure WVK participation in learning/research forums within the county and national platforms/forums for wider experience gathering and sharing of practice
  • Active learnings for replication from participation in external learning/research forums
  • Any other duty, 5%
Qualifications
  • Bachelor’s degree in a relevant field from a recognized university, preferably in Social Sciences (Project Planning and Management, Statistics, Development Studies, Monitoring and Evaluation, Impact Assessments and Accountability) or related studies. A postgraduate degree will be an added advantage.
  • Minimum of 5 years’ active experience in project cycle management
  • Extensive conceptual understanding of, and demonstrated practical command of, implementing program design/log frame approach, management and evaluation principles;
  • Experience in integrated Programme/Projects design and associated tools development
  • Demonstrated ability to train and build capacity of other staff and grassroots institutions for effective Programme delivery
  • Must have knowledge and practical experience in Research, organizational learning and documentation and have good writing and editing skills
  • In-depth knowledge and understanding of WV working systems, policies and standards will be an added advantage
  • Certification in appropriate DME systems and software: SPSS, STATA, Epi Info, ENA for SMART, R, Microsoft Excel, among other qualitative and quantitative analysis techniques
  • Certifications in monitoring, evaluation, data analysis, project management and/or documentation/knowledge management, Programme Management for Development Professionals
  • Ability to design and coordinate surveys (feasibility studies/assessments, baselines, evaluations) using both online and offline platforms
  • Good level of proficiency in data analysis techniques/software and data management
  • Ability to design M&E tools, surveys, surveillance systems, and evaluations
  • Strong interpersonal skills and managerial capacity (preferred)
  • Ability to build capacity of staff on relevant technical fields
  • Proficiency in written and spoken English, with effective written and verbal communication.
  • Good interpersonal, organizational and management skills
  • Ability to solve complex problems and to exercise independent judgment
  • Experience in production of high quality briefs and reports
  • Experience in publications of articles/journals
How to Apply

Position: Records Management Assistant
Location: Nairobi
Job description
To facilitate the Finance Group in maintaining proper records and filing.
Responsibilities
  • Maintain proper and accurate filing of all finance documents.
  • Facilitate both internal and external auditors
  • Document archiving
  • Maintain proper and accurate filing for all finance vouchers
  • Reconciliation of the field digital files
  • Digitization of files
Qualifications
The following may be acquired through a combination of formal or self-education, prior experience or on-the-job training:
  • Education: A Degree in Records Management, Business Administration or a related study.
  • Must demonstrate hands-on experience of filing/documentation and familiarity with the various filing systems.
  • Experience: A minimum of 2 years work experience in records management or related administrative roles
  • Past experience with accounting computer packages, especially Sun Systems
  • Experience working in grant related programs.
  • Good interpersonal skills and a team player
  • Work environment: Nairobi-based with occasional travel to countries with active conflict within the East Africa region.
How to Apply