Request for Proposals - Digital Impact Alliance

Request for Proposals: Survey Design and Baseline Implementation (May 30, 2017)

This document summarizes the requirements for a proposed consultancy to plan and deliver an ecosystem-wide survey that will establish a baseline status report on DIAL's mission. It is expected that the selected applicant will serve as consultant and thought partner in exploring the digital ecosystem, developing a project-long survey/data collection methodology, and implementing the baseline survey during July to September of 2017. Upon successful completion of this contract, subsequent contracts may be awarded for continued study and research partnership with DIAL and its Department of Investment Evaluation, as they measure progress toward creating a more efficient and effective digital development ecosystem.

DIAL Background Summary

Digital technology is having a profound impact on society, enhancing our ability to solve longstanding global development challenges. Creating an inclusive digital society would ease our ability to communicate with everyone and allow new opportunities for innovative services to flourish. Mobile phones have become commonplace, with 3.7 billion unique subscribers worldwide, nearly half of the world's population. For the most vulnerable, though, the digital divide exists and is growing. Persistent challenges that slow awareness and adoption of digital technology and services in the developing world include the limited reach of technical infrastructure and software maturity, the misalignment of financial incentives, uncertain policy environments, and scarce technical capacity. These factors impede the scale and speed of delivering digital services to millions of people, preventing them from realizing the full potential of better health, education, and economic opportunities. This imbalance must change. When we achieve a digital society that serves everyone, we have the potential to improve the lives of millions of people around the world.

The Digital Impact Alliance (DIAL) was formed to bring the public and private sectors together to help realize an inclusive digital society that could connect everyone to life-enhancing and life-enabling technology. DIAL is staffed by a global team of technology researchers, developers, investors, negotiators, and policymakers. It is supported by world-class foundations and development agencies and guided by a board of leading emerging market entrepreneurs, technologists, and development experts. With this leadership, DIAL is uniquely positioned to serve as a neutral broker in this ecosystem, bringing together government, industry, and other development stakeholders to discover and promote new solutions to old problems.
DIAL's hypothesis is that a more efficient and effective digital ecosystem leads to more inclusive digital societies, and this hypothesis shapes DIAL's vision and mission, described below. DIAL's vision is to realize a more inclusive digital society for the underserved in developing markets, in which all women, men and children, regardless of gender, geography, or social or economic status, benefit from the life-enabling services available in an effective digital ecosystem.

30 May 2017

IE-01.1 Survey Implementation (Baseline)


DIAL's mission is to overcome systemic barriers to enhance the collective efforts of donors, governments, industry and others in the digital ecosystem. To strengthen these digital ecosystems, DIAL focuses its efforts on convening the community, generating insights in key gap areas, discovering what works through demonstration models, and subsequently advocating for the adoption of proven practice. DIAL is focusing its work in three areas:

• Platforms and Services: DIAL addresses reach and capacity challenges, working to help digital service providers design and deploy their services faster, at a lower cost, and to a wider audience in developing markets.

• Data for Development: DIAL accelerates shared value scenarios in technology and economic models, working closely with mobile network operators (MNOs) and data holders. DIAL also supports discussions tackling sensitive privacy and security questions that hinder the public sector and development community's access to and use of data to improve communications and services provided to the people they serve.

• Insights and Impact: DIAL produces, curates, and disseminates evidence-based good practices, packaged in easy-to-understand guidance, so that governments, technology companies, the development community, and other implementers can quickly use it to inform ongoing efforts to fund, design, and deploy digital services to more people. DIAL then convenes these groups that share advocacy agendas to drive collective impact.

DIAL's Measurement Approach

DIAL's founders have encouraged DIAL to take risks and continuously adapt its approach. This mandate requires a sophisticated and flexible approach to monitoring, evaluation and learning (M&E) that allows DIAL to manage adaptively while still working toward consistent, long-term outcomes. DIAL's M&E approach is based on Developmental Evaluation principles,1 linking together three key components that combine a focused, rigorous framework with critical reflection and learning cycles. These key components are:

• a Results Framework (with clear outcomes and indicators identified, which will remain constant over DIAL's lifetime and through which DIAL is measuring its success),
• Investment-Level M&E (designed to project-manage and link investments to the results framework, while allowing for flexibility, change and adaptation), and
• a Learning Agenda (to ensure periodic, critical reflection on hypotheses, context, and progress toward mission).

DIAL strives to build principles of accountability and learning into all three of these components (for more information, see the attached document, Investment Evaluation Approach: How the Digital Impact Alliance (DIAL) Measures its Efforts, Learns from Them and Adapts to Change).

Indicators. As part of its Results Framework, DIAL has identified primary and intermediate outcomes and corresponding indicators (see Attachment A), but has also defined a set of indicators reflecting its strategic goal of influencing how the digital ecosystem functions. To identify meaningful metrics at the ecosystem level, DIAL is borrowing from market systems work, best signified by the Springfield Institute's Making Markets Work for the Poor (M4P) approach.2 This work uses the concept of "crowding in": the process through which "interventions catalyze, or bring other players and functions into the market system so that it works better for the poor." This can be represented by breadth (more actors, more transactions, etc.) or reach (new areas or markets). A complete set of these results statements and indicators, with preliminary definitions and proposed data sources, is provided in Attachment B.

DIAL anticipates that these indicators will largely be measured through surveys among members of the digital ecosystem, or 'Digital Service Providers' (DSPs). These DSPs represent four main segments of this ecosystem: NGOs/CSOs, funders, governments, and the private sector (technology developers and providers). Each group has a unique function (or set of functions) with respect to how it engages with the ecosystem: funding, designing, developing, or deploying digital services.

1 Patton, Michael Q. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. Guilford Press: London, 2011.

Scope of Work

1. Purpose. The purpose of this consultancy is to find a partner to help devise and implement the data collection/survey strategy to measure the ecosystem-level outcomes of the results framework. The consultant will be expected to build on DIAL's prior work, including early versions of similar studies and the existing results framework, to define and implement this strategy. This initial project will serve as the first phase of a long-term research approach to help measure DIAL's progress, including future surveys and contribution to other elements of DIAL's Investment Evaluation approach.

2. Goals and Deliverables. We expect the work to entail the following key components:

• Landscape Analysis: Conduct a preliminary desk review and research to ensure a shared understanding of the ecosystem: the stakeholder map, actors, key influencers, critical dynamics, most important participants for survey data collection, and opportunities for measuring change. It is important to note that much exploratory work has been done in the two years since DIAL's inception. Interviews with many key members of the industry and ecosystem, research on systems and barriers, extensive strategic planning, and a 'pilot' baseline study have all been completed and should be used extensively at this stage. All materials will be made available to the consultant, and DIAL staff and leadership will be available to make recommendations and shape the analysis. This phase will result in a report that contains a clear definition of the ecosystem actors and a strategy for identifying and reaching each group, an overview of the barriers and behaviors DIAL is attempting to address, and any additional recommendations for measuring them.

2 Springfield Institute. (2008). A Synthesis of the Making Markets Work for the Poor (M4P) Approach. Bern, Switzerland: Swiss Agency for Development and Cooperation (SDC).




• Design: Recommend a survey methodology/approach for the life of DIAL, based on knowledge from the above research, including recommended methods (assuming mixed quantitative and qualitative), frequency (e.g., baseline/endline, panel surveys, or other approaches), sampling frame (based on the above research and for each of the four identified segments of the digital ecosystem noted above), sample size recommendations, etc. This includes development of the data collection tools (see Attachment C for a preliminary, illustrative list of survey questions) and mechanisms. Methods and their tools should be appropriate to the sector, using best practices and approaches from the digital/tech industry, including web- and mobile-based applications, user-friendly design, appropriate incentives and communication strategies, etc. (Partnering with survey technology specialists, including those operating from the global south, is an option.)

• Implementation and Analysis: Work with DIAL to implement a 'baseline' version of the survey to determine indicator status and key data points relevant to future implementation, and produce a quality analysis of the results. This will include implementing all relevant tools; collecting and analyzing the qualitative and quantitative data; and presenting the results in ways that are interactive and useful and can be disaggregated by a range of possible criteria (organizational type, sector, function, geography, gender, etc.). This should also take advantage, where appropriate and useful, of state-of-the-art digital tools, including dashboards, visualizations, maps, and/or other creative and effective dissemination methods. A final report on the 'state of the ecosystem' will be published and widely shared.
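To make the disaggregation requirement concrete, the following is a minimal sketch of the kind of breakdown DIAL expects from the baseline data. The field names and sample values here are hypothetical, not a prescribed schema; the real fields will come from the data collection tools developed under this contract.

```python
from collections import Counter

# Hypothetical baseline survey records (one dict per responding DSP).
responses = [
    {"org_type": "NGO",        "geography": "Kenya",   "reaches_underserved": True},
    {"org_type": "funder",     "geography": "India",   "reaches_underserved": False},
    {"org_type": "government", "geography": "Kenya",   "reaches_underserved": True},
    {"org_type": "private",    "geography": "Nigeria", "reaches_underserved": True},
    {"org_type": "NGO",        "geography": "India",   "reaches_underserved": True},
]

# Topline indicator: # of DSPs reaching underserved populations,
# disaggregated by organizational type.
by_org = Counter(r["org_type"] for r in responses if r["reaches_underserved"])
print(by_org)
```

The same pattern extends to any of the other disaggregation criteria (sector, function, geography, gender) by swapping the key used in the `Counter`.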

Upon successful completion of this work, it is hoped that a long-standing partnership between the consultant and DIAL will continue, with additional and related expert consultation and implementation over the life of DIAL.

Deliverables, with illustrative dates (based on a kick-off of July 1, 2017):

1. Summary report (July 14, 2017): This secondary research/review should use DIAL materials, plus any additional research needed to fill gaps or update understanding, to compile a clear description of DIAL's understanding of the digital ecosystem, including a breakdown of the key sets of actors, important relationships and dynamics, and a review of barriers, behaviors, and indicators of change.

2. Recommended survey methodology (July 21, 2017): A proposal that details data collection methods (quantitative/qualitative), definition of the sampling frame, recommendations on sample size and selection process, frequency of data collection, and recommendations on strategy and incentives.

3. Set of quantitative and qualitative data collection tools (August 4, 2017): Defining and testing any required survey tools, and question guides for qualitative data collection.

4. Execute the research (August 30, 2017): Execute the set of quantitative and qualitative data collection tools. When the research process is finished, the consultant should share an initial raw data set and interview notes.

5. Presentation of findings (September 15, 2017): An in-person (or virtual) presentation with key staff to generate discussion, ask additional questions, and refine the final analysis.

6. Final report (October 1, 2017): A report with findings, baseline status for Results Framework indicators, and key recommendations will be provided. Time and resources allowing, additional dissemination methods may be requested (e.g., blog, dashboard, briefings, webinar, etc.).

7. Final data set (October 1, 2017): It is expected that all data and metadata collected will be the property of DIAL and will be provided to the Director of Investment Evaluation at project closing.

3. Project timeline. DIAL is in its first full year of implementation (though Year 3 of its initial planning) and has at least three more years remaining. We are seeking a partner to work with us over multiple years, but this contract should take approximately three to four months, between June and October of 2017. DIAL proposes one month for the initial research and development of the survey methodology and one to two months for implementation of the survey and analysis. The start date will be contingent upon final contracting and availability.

4. Role of Consultant. The consultant will work in close coordination with DIAL's Director of Investment Evaluation, as well as senior leadership, to approve methods, tools, and final products on a work-for-hire basis.

5. Performance Measurement. The deliverables of this project are linked inextricably to the Results Framework, DIAL's strategic goal, and its intermediate and primary outcomes. This project will be critical to DIAL's ability to understand and describe its success. As noted, successful completion of this contract could lead to follow-on contracts in subsequent years of DIAL's implementation. 'Successful completion' requires definition and includes the following criteria:

1. Each deliverable has approval from relevant DIAL staff and will include a period of review and sign-off to indicate satisfaction with the work.
2. The response rate for any quantitative survey meets or exceeds industry norms.
3. An acceptable survey and interview dissemination plan is developed by the consultant and approved by DIAL.
4. The quality of the work (ecosystem report and final survey report) is determined to be credible by our peers, and publishable within the sector.
5. The final negotiated schedule is adhered to, with a window of variance not to exceed 4 weeks total.
6. Raw data is received in a format accessible to and readable by DIAL's Director of Investment Evaluation.

6. Intellectual Property (IP) considerations. DIAL's mission is to create public goods that enable a more efficient digital economy for everyone's common benefit. To serve this goal in partnership with other organizations and individuals, DIAL funds the development of important hardware and software, databases, computer protocols, and useful industry standards.

Intellectual property ("IP") is at the heart of all things creative and inventive. DIAL's IP policy is shaped by the policies of our key funders: the Bill and Melinda Gates Foundation (BMGF), the United States Agency for International Development (USAID), and the Swedish International Development Cooperation Agency (Sida). DIAL's work products thus must comply with BMGF's "Open Access" policy. Further, any IP we fund should be licensed for free use worldwide. This is accomplished through open source and Creative Commons licensing and by open standards, unencumbered by restrictive copyrights and patents.

The scope of work for this project and its deliverables will, as such, abide by DIAL's intellectual property (IP) policy and its donors' compliance requirements. If special considerations are required, we will consider those on a case-by-case basis with selected vendors.

As required by its donors, DIAL is committed to "Global Access". As such, DIAL will ensure that knowledge and information gained from any project and any deliverable produced will be promptly and broadly disseminated under a Creative Commons license, and any funded developments will be made available at an affordable price to:

1. People most in need within developing countries, and/or
2. In support of the U.S. educational system and public libraries, as applicable


Proposal submission requirements

Proposal submissions, which may be created in Word, PowerPoint, or a combination of the two, must include the following components. Respondents may include additional elements as needed.

• Proposed approach
  – Demonstrate understanding of the project objectives
  – Describe approach and methodologies, as applicable
  – Describe the project management approach, including the timeline and any recommended updates to the timeline provided above, including level of effort on the part of the DIAL team (e.g., to participate in scoping and requirements workshops, iteration junctures, etc.)
• Subject Matter Expertise
  – Staff and team structure
    ▪ Identify the team structure, including roles, responsibilities, and level of effort of staff and any sub-contracted resources (note that partnering or sub-contracting is acceptable)
    ▪ Provide rationale and background on any sub-contracted firms or individuals
  – Relevant experience
    ▪ Demonstrate the firm(s)' and key participants' experience relative to the scope of work (including partners/subcontractors)
    ▪ Provide at least 3 examples of similar work
• Value
  – Provide a detailed budget, including assumptions, costs, and level of effort for staff and any sub-contractors. The consultant may find it helpful to document assumptions per workstream as appropriate, for example:
    ▪ Research Study
    ▪ Quantitative Survey Implementation
    ▪ Qualitative Interviews
    ▪ Analysis and Reporting
  – Provide a professional fees budget, including cost and level of effort per staff member
  – Provide a separate line item for any sub-contractors
  – Provide an expenses budget by type of expense (e.g., travel, research, etc.). Travel estimates should indicate the anticipated destination and duration of each trip
• References
  – Provide names and email addresses of at least two prior clients willing to discuss their experiences working with you.

Submission format and timeline

• All submissions are due on June 20, 2017 by 6:00 pm EDT.
• We expect submissions to be in the 10-15 page range but will not penalize submissions that are above or below this range.
• Questions and clarifications will be communicated to Respondents on or before June 12, 2017, with a kind request for prompt turnaround on the part of the Respondents.
• The selected Respondents will be notified on June 30, 2017 by 6:00 pm EDT.
• Please send all EOIs and email submissions to [email protected]




• In case Respondents encounter a problem submitting, please contact Barbara Willett at bwillett@digitalimpactalliance.

Questions and answers

Please forward any questions to [email protected] by June 12, 2017. DIAL will make every effort to respond to questions within 24 hours, and may choose to share the questions and answers from these bilateral discussions with other Respondents.

Evaluation Process

DIAL will review all written proposals, and may request a phone or in-person interview and/or an updated submission to address questions or provide clarification. The evaluation committee will score each criterion on a 1-5 scale, weighted as follows:

1. Approach (10 points)
   – The proposed approach shows an understanding of the objectives and a clear plan for achieving them.
2. Subject Matter Expertise (35 points)
   – Appropriate level of understanding of the key stakeholders and dynamics within the digital ecosystem
   – Experience working with large-scale quantitative surveys
   – Experience with advanced data collection, analysis, and visualization tools
3. Project Management (10 points)
   – Demonstrated understanding of the proposed scope of work
   – Achievable action plan that will deliver the project on time and on budget
   – Thoughtful risk identification and mitigation strategies
4. Capabilities and Experience (25 points)
   – Demonstrated firm experience with similar projects and in international development contexts
   – Team members with demonstrated skills and experience on similar projects and activities
   – High-quality sub-contractors and external advisors, if relevant, especially with knowledge and experience in the global south/developing countries
   – Appropriate access to resources and knowledge centers
5. Value (20 points)
   – The proposed pricing is within budget
   – The proposed pricing demonstrates a competitive price and good value for the money

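The RFP does not spell out how the 1-5 scores combine with the category point weights, so the following is one illustrative interpretation, not DIAL's stated method: each criterion's 1-5 score is scaled to a proportional share of that category's point allocation, and the shares are summed.

```python
# Category weights from the evaluation criteria (points sum to 100).
WEIGHTS = {
    "approach": 10,
    "subject_matter_expertise": 35,
    "project_management": 10,
    "capabilities_and_experience": 25,
    "value": 20,
}

def weighted_total(scores: dict[str, int]) -> float:
    """Scale each 1-5 score to its category's point weight and sum.

    A score of 5 earns the full point allocation; lower scores earn a
    proportional share. This normalization is an assumption, not a rule
    stated in the RFP.
    """
    return sum(WEIGHTS[c] * (scores[c] / 5) for c in WEIGHTS)

# Example: a strong but not perfect proposal.
example = {
    "approach": 4,
    "subject_matter_expertise": 5,
    "project_management": 3,
    "capabilities_and_experience": 4,
    "value": 4,
}
print(weighted_total(example))
```

Under this reading, a proposal scoring 5 on every criterion earns the full 100 points.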

Intent and disclaimer

This RFP is issued with the intent to identify a consultant to deliver the results described herein. DIAL will rely on the Consultant's representations being truthful and as described, and assumes it can be confident in the Consultant's ability to deliver the product(s) and/or service(s) proposed in response to this RFP. If DIAL amends the RFP, copies of any such amendments will be sent to all Respondents.

Contract terms

DIAL will negotiate contract terms upon selection. A copy of the contract terms and conditions will be provided to finalists. All contracts are subject to review by UN Foundation's Business Services Budget Reporting (BSBR) team. Once a draft contract is reviewed by BSBR, DIAL's Grants Manager will contact the Consultant. The project will start upon the execution of the contract. The contract will outline terms and conditions, scope, budget, and applicable flow-down terms.

Release

Consultant understands that DIAL has chosen to solicit an RFP for consulting services, and that Consultant's response does not guarantee that DIAL will enter into a new contract with Consultant or continue any current contract(s) with Consultant. Consultant agrees that DIAL may, in its sole discretion:

• Amend or cancel the RFP, in whole or in part, at any time
• Extend the deadline for submitting responses
• Determine whether a response does or does not substantially comply with the requirements of the RFP
• Waive any minor irregularity, informality or nonconformance with the provisions or procedures of the RFP
• Negotiate with all consultants UNF deems acceptable
• Issue multiple awards
• Copy the responses

This RFP is not an offer to contract. DIAL assumes no responsibility for Consultant's cost to respond to this RFP. All responses become the property of DIAL. The Consultant, by submitting a response to this RFP, waives all right to protest or seek any legal remedies whatsoever regarding any aspect of this RFP.

Consultant represents that it has responded to the RFP with complete honesty and accuracy. If facts provided in Consultant's response change, Consultant agrees to supplement its response in writing with any deletions, additions, or changes within ten (10) days of the changes. Consultant will do this, as necessary, throughout the selection process. Consultant understands that any material misrepresentation, including omissions, may disqualify it from consideration for a contract award.

Consultant understands it may receive proprietary and confidential information from DIAL during the RFP process ("Confidential Information"). Consultant agrees to not use Confidential Information for any purpose other than its participation in the RFP process and to not reveal Confidential Information directly or indirectly to any other person, entity, or organization without the prior written consent of DIAL. Consultant further agrees to exercise all reasonable precautions to maintain the proprietary and confidential nature of Confidential Information.


Attachment A: DIAL’s Results Framework


Attachment B: Results Framework Indicator Plan Strategic Goal: The digital ecosystem can more efficiently and effectively produce and adapt digital platforms and services and share data and insights targeted at accelerating the rate at which any developing country can achieve an inclusive digital society. Level

Strategic Goal:

Result Statement/Purpose

Indicator

Indicator Definition

Data Source

Depth of System Change (impact on those we reach directly and how they apply and innovate with new services, as well as the' crowding-in' of other providers who see this as a viable market opportunity)

# of DSPs reaching underserved populations (and types of services provided)

DSPs include tech companies, NGOs, funders, governments. Vulnerable populations can refer to the country (emerging markets, developing countries) or the specific target group (women, poor, disabled). 'Types of services' refers to what range of services a provider makes available, and across what vertical markets. We will seek to understand who (type of DSP), what (type of service), for whom (what populations are served), as well as exposure to DIAL-related activities.

research on ecosystem and system mapping, surveys and interviews with DSPs

Breadth of System Change (how change is reflected in outreach and inclusion)

# of people from underserved populations reached by DSPs (and types of populations)

DSPs include tech companies, NGOs, funders, governments. 'Reach' in this case represents the customer base or scale and by definition needs to include one or more category of 'underserved populations.' Will attempt to capture the total number of underserved populations served by DSPs surveyed, to see growth and change in number and types of people served.

research on ecosystem and system mapping, surveys and interviews with DSPs

Funder Behavior/Collaboration (enabling environment investment)

# of new digital development 'collective investments' that are implemented

'Collective Investments' may include joint funding, shared investments, partnerships, or other examples of collaboration among actors. 'That are implemented' implies that funding should have changed hands and activities begun on the ground. This may also include new, relevant investments into existing work, such as key sustaining funding for ODK or DHIS2.

Funder survey/interviews, surveys and interviews with DSPs

Gendered Perspective ensuring that DIAL's work enhances gendered perspectives within the ecosystem

# of DSPs/partners demonstrating improvement in incorporating gendered perspectives into programmatic and operational digital initiatives (and types of improvements)

'Incorporating gendered perspectives' can be reflected in a variety of ways, including improved inclusion of women and girls in digital services provided (programmatic), as well as policies, initiatives, or other activities that demonstrate a commitment to addressing gender in digital services (operational). The type of improvement may ultimately be more important than the total number of DSPs showing improvement, but further detail about these improvements, the types and extent of them, will be available.

Surveys and interviews with DSPs, program records, etc. In the T4D projects, gender participation trends can be captured in both development and consumer communities.

30 May 2017

IE-01.1 Survey Implementation (Baseline)

12

Level

Result Statement/Purpose

Indicator

Indicator Definition

Data Source

Primary Outcome

PO 1 - By FY19, providers of digital development services can design and deploy their services faster, at a lower cost and to a wider audience

# of DSPs improving their service delivery to underserved populations (and types and degree of improvement)

'Improving their service delivery' may include one or more types of improvements, including reduced cost of delivery, time to market, enhanced customer base, decreased down-time, or other measures of efficiency or effectiveness. This indicator represents a topline aggregated # of DSPs, but the data behind it will include more detail about the ways in which DSPs improve, the degree or extent to which they do, and will also be linked to the type of initiative that has improved their performance (virtual aggregator, messenger, open source platform, etc.). This will also include a specific look at what DSPs have specifically addressed gender and added a gendered perspective to their service delivery, including being able to report a more diverse customer base, etc.

DSP surveys, interviews, program records

Intermediate Outcome

IO 1.1 - By FY18, developers of digital development services leverage standard digital development software components for designing and deploying their services

# of DSPs using existing T4D software components to design and deploy their Digital Services

'Existing Tech for Development (T4D) software components' refers to technologies, software, platforms, or other tools (DIAL-associated or otherwise) that enable DSPs to build on what has been done, rather than building from scratch and creating new silos. (We may need to develop a specific list of components that fit into the T4D characterization. This we will do when we learn more about the trends and tools in use).

% of Platform Service Center clients who are satisfied with services provided

'Clients' are software projects receiving services from the Platform Service Center. Satisfaction will be determined based on a center-defined likert scale, to characterize quality of service.

DSP surveys (responses disaggregated by type of standard tech stack/service). Funder surveys will also provide window into shifting investment trends as this takes root. Satisfaction surveys among clients. Will include looking at met and unmet needs.

# successful graduates of DIAL capacity building programs.

'Successful' means completing required training, coursework, testing, or other assessment standards as defined through the centers. Disaggregated by gender, as well as source of participant (new graduates, current DSP employees, etc.). Follow-up with graduates will continue as they engage their services with DSPs.

Records of capacity building programs, training record (including where they come from, where they go next)

# of DSPs who are satisfied with the DIAL trained technologists that they have engaged.

'Engaged' can mean hired directly, contracted, or otherwise employed for their services by a DSP. 'Satisfied' means that trained technologists are providing acceptable services to DSPs and the work has improved as a result. This will likely involve a likert-scale to assess quality, as compared to a simple yes/no response, as well as qualitative information to help inform future trainings.

Surveys and interviews with DSPs about quality of work received, based on records from capacity centers and follow-up with graduates.

# of markets served by standards-compliant aggregators

'Markets' refers largely to countries (or other market-based or geographic segments). 'Aggregators' are created to provide digital service providers with streamlined, standards-compliant access to channels.

Program records (we will be working directly with them, so we will have access to that information)

# of actors using messenger platforms to deliver services to underserved populations

Industry actors include tech developers, messenger platforms, and others that service providers can leverage for improved delivery. Where possible, measurement of service delivery will also examine the type of services and the number of people reached.

Program records (we will be working directly with them, so we will have access to that information)

Intermediate Outcome

IO 1.2 - By FY18, providers of digital development services have streamlined access to channels that allow them to reach more users in select countries

30 May 2017

IE-01.1 Survey Implementation (Baseline)


Level

Result Statement/Purpose

Indicator

Indicator Definition

Data Source

Primary Outcome

PO 2 - By FY19, public service delivery and development programs have improved access to, understanding of, and use of data for development (D4D)

# of D4D actors engaging (using and/or sharing) with digital data (mobile, satellite imagery, etc.) as measured by known cases

'D4D' (Data for Development) actors include industry, developers, MNOs, service providers, NGOs, governments, etc. 'Engaging' refers to using or sharing digital data, depending on which actor is engaging. 'Known' cases are those that DIAL supports directly or learns of indirectly through dedicated trainings, guidance, policy efforts, surveys, or other means. Findings will also explore the types of usage and the outcomes of this usage.

Program records (we will be working directly with them, so we will have access to that information)

# of public service providers employing data and analytics from digital sources in their decision making

# of D4D actors who report improved understanding of and action on responsible data policies and practice

# of demonstration projects illustrating the value of digital data for decision-making, documented and disseminated

'Public service providers' is inclusive of the broad development community outside of DIAL's immediate partners.

Various sources, including program records, surveys, key informant interviews, etc.

D4D actors include industry, developers, MNOs, service providers, NGOs, governments, and others engaged in D4D activities.

Various sources, including program records, surveys, key informant interviews, etc.

'Demonstration projects' are those completed by (or funded/supported by) DIAL, with results documented and shared (e.g., a case study for each). These projects involve working with MNOs and service providers to illustrate how valuable digital data can be in making decisions about services to underserved populations. Use and replication of these projects and their results will be monitored through higher-level outcome indicators. Taking a gendered perspective will not always be feasible or appropriate, but one or more of the demonstration projects should enable one.

Program Records - M&E results from demonstration projects, including case studies for each project.

# of users of digital data platforms/partnerships catalyzed by DIAL

# of knowledge products promoting responsible use of digital data for improved services to underserved populations prepared (and type of product)

Users can refer to those downloading information, accessing the platforms, requesting data or information, etc. Access methods for 'using' are TBD as the tools and resources evolve. 'Knowledge products' include tools, guidance, advocacy materials, economic models and/or global policy guidance. These tools will be categorized and monitored to examine direct use (what types are most used, most positively received), as well as application to higher-level improvements (understanding which resources most influenced DSP service improvements).

Program records - platform analytics, website analytics, download rates, etc. We will monitor use across tools and topics to understand more about what is and isn't being used and how these resources are being received.

# of portfolios of public good assets for D4D packaged

'Public good assets' refers to APIs, code, methodologies, etc. related to improved and publicly available assets for D4D.

Program records, download rates, etc. We will monitor use across tools and topics to understand more about what is and isn't being used and how they are being received.

Intermediate Outcome

IO 2.1 - By FY19, in select countries, providers of public services and development agencies can access and use insights and indicators based on mobile and digital data

# of demonstration projects explicitly addressing gendered perspectives

Program Records - M&E results from demonstration projects

Intermediate Outcome

IO 2.2 - By FY19, stakeholders have access to common public goods (e.g., APIs, tools, knowledge products) on how to embed data and analytics in their decision making


Primary Outcome


PO 3 - By FY19, governments, funders and implementers adopt emerging good practice when they fund, design and deploy digital services

Intermediate Outcome

IO 3.1 - By FY19, DSPs access emerging good practice for more efficient and effective design and deployment of digital services

Intermediate Outcome

IO 3.2 - By FY19, funders access emerging good practice and increasingly collaborate for impactful digital development investment

Intermediate Outcome

IO 3.3 - (Tentative) By FY19, policy makers access evidence-based policy and regulatory best practice for digital development


# of unique users of common public goods created on generating D4D

Users can refer to those downloading information, accessing the platform, requesting data or information regarding any of the public goods produced. Access methods for 'using' are TBD as the tools and resources evolve.

Program records, download rates, etc. We will monitor use across tools and topics to understand more about what is and isn't being used and how they are being received.

# of governments, funders and implementers who have implemented the PDD (Principles for Digital Development)

Beyond endorsing the PDD, governments, funders, and implementers must actively implement them in some way: inclusion in M&E or program proposal/RFP guidelines, public discussion/presentation, or other overt demonstrations of how the PDD are actually being used.

Funder survey, secondary research (websites, publications, RFPs, etc.)

# of DSPs reporting improved capacity to serve underserved populations (as a result of DIAL interventions)

# of guidance documents and learning products on more effective and efficient design and deployment of digital services generated and disseminated

Whether through use of DIAL knowledge products or use cases, adoption of the PDD, or partnership with other ecosystem actors, there is a shift in how members of the ecosystem view their ability and/or willingness to provide services to underserved populations.

DSP surveys, interviews, feedback mechanisms

These tools (knowledge products such as trainings, papers, toolkits, etc.) will be categorized and monitored to examine direct use (what types are most used, most positively received), as well as application to higher-level improvements (understanding which resources most influenced DSP service improvements). Each product will contain content specifically oriented toward understanding, enabling, or employing a gendered perspective on the subject.

Program records, download rates, etc. We will monitor use across tools and topics to understand more about what is and isn't being used and how they are being received.

# of governments, funders and implementers who have endorsed the PDD

% of target funder participation in DIAL donor events and activities

# of citations of DIAL research, insights or support as a meaningful input to policy making/program delivery

Endorsing the PDD doesn't necessarily mean implementing them (implementation is captured at higher levels of the results framework), but it suggests awareness of and agreement with the concept.

The PDD website and program records track endorsements.

Targeted funders are those directly invited to an event hosted by DIAL, and participation means attendance.

Program records (registration, participation, event evaluation)

tbd

tbd


Attachment C. Sample Ecosystem Survey Questions

Following are suggestions for questions that might be included in the 'ecosystem-wide' survey, including those needed to obtain baseline data for results framework indicators and those that will help track perceptions and behaviors. These are only suggestive; a final set will be developed as a result of the state-of-the-ecosystem study and survey planning discussions with DIAL team members.

Indicator-Related

Exploratory:
• Does your organization provide (sell, distribute, deploy) a product or service that operates over mobile (cellular) networks or the internet? (One example of such a service is M-Pesa in Kenya.) Does your organization provide a product or service that is currently in use (sold, deployed, or operational) in a developing market or a development market?
• What are the barriers to serving these populations? Ask explicitly: is your organization able to serve underserved populations? Why or why not? (If already serving, why: financial, mission, social responsibility? If not, what are the barriers?)
• If not now, does your organization intend to serve underserved populations? Why or why not? (Perhaps differentiate between business opportunity and social responsibility.)
• What is your current user base (total; by population, e.g., male/female, age segments, urban/rural, economic brackets, disability)?
• What types of services do you provide (by user group)?
• Have you collaborated with others in funding, planning, designing, or deploying services to these markets? With whom?
• Do you have policies in place, or understand regulations, around responsible data practice? What kind, in place how long, how implemented? How successful? Any negative consequences?
• Are you aware of the Principles for Digital Development? Has your organization endorsed them? What department is responsible for/implicated in their operationalization (e.g., IT, M&E, BusOps, Program)? How have they impacted your organization?

Technical
• Do you use or fund open-source or other collaborative digital platforms, services, tools, or apps? What kind?
• Do you do any of the following to ensure you offer/support/improve digital services for women and girls?
  - data disaggregation
  - analysis of data by gender
  - gender-segmented approaches
  - gender policies
  - other
  Please describe, and discuss the impact of this work on your services.
• How long does it take to get a new service to market?
• How much does it cost to launch?
• How long does it take to reach profitability?
• If selling services, number and type of customers
• If providing services, number and type of beneficiaries




• What would help improve your efficiency? What are key barriers?

Funders:
• How many digital services have you funded? How many were successful? How many beneficiaries were reached?
• How many are sustainable? How many were repurposed? How many are locally sustainable?

NGOs:
• How many digital services have you implemented (new, repurposed, open source, other)? How many beneficiaries were reached?
• How many of these services are sustainable or locally sustainable, or can be repurposed by the organization? At what cost per $$?
• When funding/designing/developing/deploying digital services, what existing tools or components have you used (provide list), and what have you built from scratch?
• Do you use digital data made available from other sources? What sources? For what? For whom? If not, why not?

Perceptions and Behaviors
• How effective is the 'digital ecosystem' in enabling you to provide quality services, or to serve underserved populations?
• How easy is it to deploy services, to reach the underserved, to make a difference?
• How are outcomes captured? How do organizations measure success among their beneficiary populations, or for their services?
