Review

Benefits and Challenges of Making Data More Agile: A Review of Recent Key Approaches in Agriculture

1 The Committee on Sustainability Assessment (COSA), Philadelphia, PA 19147, USA
2 International Center for Evaluation and Development, Sakumono JWCP+XJ7, Ghana
3 GDi Partners (GDi), New Delhi 110065, India
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(24), 16480; https://doi.org/10.3390/su142416480
Submission received: 24 August 2022 / Revised: 28 October 2022 / Accepted: 10 November 2022 / Published: 9 December 2022
(This article belongs to the Special Issue Farming 4.0: Towards Sustainable Agriculture)

Abstract

Having reliable and timely or ongoing field data from development projects or supply chains is a perennial challenge for decision makers. This is especially true for those operating in rural areas, where traditional data gathering and analysis approaches are costly and difficult to operate while typically requiring so much time that their findings are useful mostly as learning after the fact. A series of innovations that we refer to as Agile Data are opening new frontiers of timeliness, cost, and accuracy, leveraging a range of technological advances to do so. This paper explores the differences between traditional and agile approaches and offers insights into costs and benefits by drawing on recent field research in agriculture conducted by diverse institutions such as the World Bank (WB), World Food Program (WFP), United States Agency for International Development (USAID), and the Committee on Sustainability Assessment (COSA). The evidence collected in this paper about agile approaches—including those relying on internet and mobile-based data collection—helps to define a contemporary dimension of data and analytics that can support better decision-making. Providing a theoretical, applied, and empirical foundation for the collection and use of Agile Data can offer a means to improve the management of development initiatives and deliver new value, as participants or beneficiaries are better informed and can better respond to a fast-changing world.

1. Introduction

There are many calls for better quality data and statistical modernization to guide sustainability investments and policies [1,2]. The broad ambition of the 2030 Agenda creates the need for an unprecedented range of reasonably high-quality statistics at different levels: sectoral, subnational, national, regional, and global [3]. Traditional data sources such as household surveys differ in terms of coverage, frequency, objective, timeliness, and questionnaire design. This presents an important challenge for the monitoring of the Sustainable Development Goals (SDGs) [4]. Further, in contrast to administrative data, household survey data have often been collected less frequently, and by multiple organizations and institutions, each with a different focus and different capacities. In some cases, the national surveys are implemented by governments, and in others, they are administered under the auspices of an international organization. Even this simple difference can generate substantial data gaps in terms of comparability, standardization, and the levels of disaggregation [4]. Beyond the high costs of traditional data gathering, the classical methods adopted in household surveys, such as recall questions and self-reported measures, frequently generate significantly erroneous estimates of land [5], yields [6], farm labor [7,8], and fertilizer [9]. These facts signal the need for standardization and improvements in our monitoring systems and strategies to produce data in ways that are more reliable and more timely.
In this paper, we discuss recent data science developments that can transform not only the speed but also the accuracy and functionality of data collection and processing. We highlight the benefits and trade-offs of traditional non-agile data, including time lags, high costs, and measurement complexities. The term non-agile is used simply to distinguish a contrasting approach; we do not imply a pejorative judgment. We present and characterize some emerging, technology-enabled solutions with Agile Data systems. We discuss the most recent tools introduced by the World Bank [9,10,11,12,13,14], USAID [15,16,17], CGIAR [18,19], the World Food Program [20] and Acumen [21,22,23], focusing on how they present many of the features of agile approaches to data, and also how some do not manage to attain valuable agile attributes.
This analysis serves to characterize the main features of Agile Data, especially its ability to deliver real-time, high-quality data that can serve to improve interventions as they are unfolding. Costs are a salient feature and can be low because significant automation and standardization can be utilized across an array of context-appropriate technologies (mobile surveys, apps, chatbots, etc.) that reduce the need for enumerators travelling to the field. The approach is adaptive, and with leading partners such as the International Center for Evaluation and Development (ICED) in Africa and GDi Partners in South Asia, we are testing whether it can accommodate a wide range of types of information related to program interventions, outcomes, compliance, well-being, and resilience. Similarly, within diverse applications such as those of the International Coffee Organization (ICO), the Gates Foundation’s Sustain Africa (fertilizers), and the World Poultry Foundation (WPF), the processes are being adapted for easy and large-scale use.
Combined with new geospatial data technologies and artificial intelligence (AI), Agile Data offers place-specific insights into an array of concerns ranging from new practice adoption to human rights and deforestation. Agile Data also creates additional value when conducted in tandem with traditional technologies. With appropriate incentives, farmers tend to provide more timely and accurate responses during the data collection effort, generating more functional knowledge. This relational approach to data—what the Committee on Sustainability Assessment (COSA) with the Ford Foundation [24] calls Data Democracy—enables policy makers to better address the SDGs and pressing political agendas with timely and extensive monitoring frameworks to meet the demand for collection, processing, and dissemination of data by and within countries [4].

2. Materials and Methods

This paper builds on the most current work around agile, semi-agile and non-agile approaches to data to distill optimal practices and embedded knowledge already in use among data scientists and development practitioners. The vision of Agile Data was expressed by Owen Barder in 2013 [25] in reference to the software industry. This vision has been more recently adapted to the international development context [17,26]. In contrast with the waterfall model typically followed by the development community—meaning, based on implementation in sequence, just as waterfalls cascade in one direction—an agile, adaptive and iterative approach is based on feedback loops through performance metrics, as in Figure 1. The agile model relies on multiple rapid iterations of “design, build, test” to adapt the project design, whereas the waterfall model is often based on the implementation of a prepared master plan [17].
Combining the Agile project design and monitoring approach introduced by the literature with the innovation brought by digital technologies in data gathering, we obtain an Agile approach to data. In practice, what we define as Agile Data is a monitoring, evaluation and learning (MEL) data-driven approach that can improve outcomes and learning in development through an adaptive project design and monitoring combined with rapid deployment of surveys, data processing and analysis. In this approach, data are collected using short-duration or low-volume inquiries that can be conducted with high frequency and at relatively low cost.
In Section 3.1, we compare the waterfall versus the Agile model, isolating characteristics of the Agile Data compared to the non-agile data through a systematic review of the literature. Section 3.2 of the paper contains a review of recent key lessons learned about Agile approaches to data in the field of agricultural development, highlighting the needs for statistical modernization and standardization. Through this review, the paper contributes to the characterization of an Agile Data system, helping development practitioners to build better data approaches to deliver greater knowledge efficiencies and potentially better impacts.
COSA Definition: Agile Data is a Monitoring, Evaluation and Learning (MEL) approach that provides timely insights to facilitate adaptive learning and improve investment or intervention outcomes by rapidly deploying short-duration surveys that can be conducted at various frequencies and at relatively low cost. It applies targeted and context-appropriate field technologies such as IVR, apps, chatbots, or SMS and employs human or artificial intelligence to provide automated data validation, analysis, and feedback loops to users.
Used in rural development programs or supply chains, it is configured to deliver higher-quality, real-time data, reducing survey fatigue among beneficiaries. It differs from most monitoring and evaluation by actively engaging data subjects more purposefully for more accurate information and mutual iterative learning during an engagement rather than after its completion.
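To make the feedback-loop element of this definition concrete, the following minimal sketch (with hypothetical field names, units, and thresholds, not COSA’s actual implementation) shows how an incoming short-survey response could be automatically validated and turned into an immediate message back to the respondent.

```python
# Minimal sketch (not COSA's implementation) of the automated validation and
# feedback loop described above: an incoming short-survey response is checked
# against simple plausibility rules and a message is generated for the respondent.
# Field names, units, and thresholds are illustrative assumptions.

from dataclasses import dataclass, field
from typing import List


@dataclass
class SurveyResponse:
    farmer_id: str
    plot_area_ha: float       # self-reported plot area
    harvest_kg: float         # self-reported harvest for the season
    flags: List[str] = field(default_factory=list)


def validate(resp: SurveyResponse, max_yield_kg_per_ha: float = 12000) -> SurveyResponse:
    """Apply range and consistency checks; attach flags instead of rejecting outright."""
    if resp.plot_area_ha <= 0:
        resp.flags.append("non-positive plot area")
    elif resp.harvest_kg / resp.plot_area_ha > max_yield_kg_per_ha:
        resp.flags.append("implied yield above plausible maximum")
    return resp


def feedback(resp: SurveyResponse) -> str:
    """Compose the message sent back to the respondent (e.g., via SMS or IVR)."""
    if resp.flags:
        return f"Thanks! Please confirm your figures: {', '.join(resp.flags)}."
    yield_kg_ha = resp.harvest_kg / resp.plot_area_ha
    return f"Thanks! Your reported yield is {yield_kg_ha:.0f} kg/ha."


print(feedback(validate(SurveyResponse("F-001", plot_area_ha=0.5, harvest_kg=900))))
```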

3. Results

In the current complex and changing contexts of large-scale development programs and supply chains, we identify the potential benefit of shifting toward more Agile Data approaches and isolate six attributes that distinguish Agile from non-agile data utilizing a review of the recent literature. To support this hypothesis, we present evidence from field experiences of different organizations working on Agile approaches globally, highlighting the factors that are not yet developed and that will be necessary to take full advantage of these approaches. These include statistical modernization and standardization at both the semantic and structural data levels, together with the emergent opportunities to embrace the potentially considerable benefits of a more farmer-centric data approach. In particular, we illustrate the last point with an approach that we call Data Democracy: by building on a farmer’s own first-hand information, an Agile Data approach can not only reduce the noise often present in multi-actor or multi-stage data channels (that gather, clean, and interpret), but also position farmers in a more direct learning and exchange function, using a continuous flow of data throughputs to attain potentially unprecedented levels of understanding.

3.1. Non-Agile Data versus Agile Data

Much of the current data flow in development programs is traditionally structured as non-agile. It is a plan-driven approach with sequentially dependent steps such as analysis and planning, designing, pilot testing, and deployment for use [27]. The non-agile approach to data has been widely used and validated to collect accurate information on agriculture in complex environments including those characterized by small farm size, remote plots, multiple cropping systems, and poorly demarcated land boundaries [28,29]. According to Sourav et al. (2020) [30], large volumes of data are generated using traditional systems for data collection in agricultural contexts, but it is challenging to process the data using traditional data analysis. Non-agile approaches are disadvantaged by the costs associated with both the data gathering and the data analysis in the data generation process. Other examples in the literature show how the long process associated with data processing and analysis does not generate timely information, limiting the capacity of farm management methods to respond effectively to in-field variability in growing conditions or incorporate real-time information about weather conditions in managing agricultural activities [31,32].
The ultimate value in Agile Data for development is the Agile mechanism that enables teams to gather data quickly and more frequently. This approach allows us to understand key metrics faster, thus delivering quick learning through feedback loops and a greater possibility to respond to change and succeed [14,20,33,34,35]. An Agile approach allows for adapting questions through modules and across different data collection periods, iteratively building on learning from answers in early interviews. Recent studies in the field of extension services in agriculture, food security, and resilience show that an Agile, mobile phone-based approach to disseminating information as a service for smallholders can have a positive impact in helping farmers to face shocks and stressors [36,37,38,39], improve nutrition checks and actions [40], promote farm management practices [41], deliver advice through an automated advisory service [42,43], and use speech-based services as a viable way of providing information to low-literacy farmers [44].
1. Measurement errors due to recall questions embedded in low-frequency surveys
At the micro level, non-agile data collection approaches can lack consistency and accuracy related to recall questions. Garlick et al. (2020) [45] argue that the recall method used for farm surveys overestimates farm labor per person per plot through recall bias that creates a countervailing effect on hours of farm labor at higher levels of aggregation, thereby overestimating the labor productivity of household-operated farms. Similar examples can be found in estimations of common measures of agricultural production and productivity [46]. The World Bank (2020) [10], using data from the Living Standard Measurement Study Integrated Surveys on Agriculture (LSMS-ISA) in Malawi and Tanzania, showed that with longer recall periods, farmers report higher quantities of harvest, labor, and fertilizer inputs, indicating the presence of non-random measurement error. According to Dillon et al. (2018) [47], Fraval et al. (2018) [7], Kilic et al. (2021) [48], and Carletto et al. (2013) [49], self-reported measures of land size and yields generate bias in the estimation and create non-credible data observations. The unreliability of farmer-level observations such as yield, labor, and land measurements has decision-making implications for households and can substantially understate agricultural production and labor productivity [50].
Recent studies show that high-frequency phone surveys of the same respondents are extremely useful for limiting the bias in the collection of agricultural data. Examples include labor inputs and the harvesting of continuous crops such as cassava, for which long recall periods are highly inaccurate [51,52,53]; water quality measurement in aquaculture management [54]; plot size and productivity [55,56,57,58]; and the enforcement of labor contracts [59]. The recent integration of active artificial intelligence (AI) is further facilitating the interpretation of land use data and is making it more commonplace to access data that are no more than a few days old. Satellite data are also used to provide precise yield estimates. Benami et al. (2021) [52] indicate that remote sensing data and spatial modeling allow users to estimate crop yields as well as monitor them at scale at the requisite frequency and timelines; hence, many researchers are exploring improved algorithms that enable measurement of agricultural yield from space [28,60]. Carter et al. (2017) [61] and Flatnes et al. (2018) [62] validated the use of maize and rice yield estimates from higher-resolution satellite data. A study conducted by McCarthy et al. (2021) [39] in Uganda compared maize yield estimates generated from satellite data and crop models against farmer self-reports (surveys), subplot crop cuts, and full-plot crop cuts. The results show that remotely sensed yields captured over half of the variability observed in the full crop cut data for pure stand (i.e., not intercropped) plots > 0.10 hectares. These results point to a promising possibility of eventually using inexpensive, publicly available earth observation data combined with crop models to characterize key field conditions such as yields and yield losses.
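As an illustration of how the “share of variability captured” is typically quantified, the following sketch computes the squared correlation between remotely sensed yield estimates and crop-cut yields for pure-stand plots above 0.10 ha. The data layout and values are hypothetical, not the McCarthy et al. dataset.

```python
# Illustrative sketch (hypothetical data) of quantifying how much crop-cut yield
# variability is captured by remotely sensed estimates for pure-stand plots > 0.10 ha,
# here as the squared correlation between the two yield series.

import numpy as np

# Each record: (plot_area_ha, intercropped, crop_cut_yield_kg_ha, remote_yield_kg_ha)
plots = np.array([
    (0.25, 0, 1800, 1650),
    (0.40, 0, 2200, 2050),
    (0.08, 0, 1500, 900),   # excluded: below 0.10 ha
    (0.30, 1, 1700, 1200),  # excluded: intercropped
    (0.55, 0, 2600, 2500),
    (0.35, 0, 2000, 2150),
])

mask = (plots[:, 0] > 0.10) & (plots[:, 1] == 0)   # pure-stand plots > 0.10 ha
crop_cut, remote = plots[mask, 2], plots[mask, 3]

r_squared = np.corrcoef(crop_cut, remote)[0, 1] ** 2
print(f"Share of crop-cut yield variability captured: {r_squared:.2f}")
```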
High-frequency panel remote surveys are also useful to track the socioeconomic and health impacts of shocks such as COVID-19 and Ebola [50,63,64] and enable users to collect timely information on shocks, stressors, and associated resilience strategies [59,65,66]. Hoogeveen et al. (2020) [64] showed that when a systemic shock occurs, it tends to affect most of the different actors involved in food supply chains (food producers, retailers, transporters, etc.) and prevents most of them from operating efficiently [64,67]. In the case of such shocks, timely information can facilitate rapid communication between the different actors of the supply chain and help diminish or prevent supply chain disruptions [5,19,50,68,69]. At the macro level, where decision-makers focus on the bigger picture of policy implementation and impact, non-agile data approaches can weaken the policy response.
2. Low comparability at portfolio level due to limited interoperability (non-standardized metrics)
Non-agile data are usually focused on specific geographies and on assessing or reporting on an intervention or related issues of interest. Even though this approach enhances in-depth assessment within limited scopes and geographies, the definitions of measurement metrics for generating data are often inflexible for comparison with similar interventions and issues of interest such as other programs/projects, crop and livestock types, and geographical areas. The slowness to adopt common metrics has limited the ability to learn from field experience and has diminished the potential for more effective policymaking [70]. Blundo et al. (2018) [71] and Wanjala et al. (2017) [40] surmised that a conscious layout of an impact pathway, even ex ante, can facilitate identification of data needs for each anticipated milestone to inform prompt decision-making. However, using non-agile data approaches can be time consuming and cumbersome as tools to facilitate learning at key operational levels. Specifically, the integration of micro, meso, and macro levels becomes a particular challenge within the temporal confines of many programs. This fact adversely affects learning and use for decision-making along reporting hierarchies, as well as the ability to repurpose the data to meet other needs or outcomes [32].
According to Carletto 2021 [1] (p. 720), “the limited integration and interoperability of agricultural data has contributed to making today’s agricultural data less relevant to tomorrow’s policy challenges. Improving data integration and interoperability across data sources would greatly contribute to overcoming the limitations of individual data sources in achieving the temporal and spatial resolution needed for many applications”. One area of data governance that is very important for developing e-agriculture (electronically facilitated information related to agriculture) is creating standards to harmonize the ways data are collected, processed, stored, and shared. To maximize the benefits of digital technology use, there needs to be some way to ensure the consistent collection, exchange, and dissemination of accurate information across boundaries, both sectoral and geographic. Without such consistency, there is a real risk of misinterpretation of information, and incompatibility of data structure and terminology [72].
3. Limited inclusivity due to lack of both a farmer-centric approach and open data principles
To ensure that a key development issue such as inclusivity (gender, youth, minorities, the poorest, etc.) is promoted as part of an intervention requires regular and consistent monitoring. Similarly, implementer and beneficiary satisfaction about inclusion needs to be tracked in real time. Saner et al. (2018) [67] argue for participation-based and inclusive monitoring to be fundamental components of managing the SDG implementation process to ensure transparency. Non-agile data approaches can be used to capture inclusivity, but they often limit the extent to which decision-makers can be furnished with those data in time to respond to the issues identified. According to Lamanna et al. (2019) [48] and UN (2020) [15], the common practice of interviewing the head of the household has generated significant bias and limited progress toward achieving gender empowerment and inclusivity as per the SDGs.
According to COSA (2022) [24] and Schroeder et al. (2022) [43], Agile Data should also be farmer-centric or embrace Data Democracy principles, providing the right set of incentives to farmers to make them more engaged in the data collection process. In practice, there should be an open relationship and exchange of data with farmers in a mutually beneficial data ecosystem between the public and the private sector [39]. As suggested by Schroeder et al. (2022) [43], policy makers should employ a clear legal framework that recognizes “a general principle of access to privately held data of public interest”. This claim of public interest is based on the public or collective contribution to the value of certain private data assets. These can include identifying and building on competitive spaces for sharing private sector data—where private actors realize the value of open data in promoting innovation, cost-sharing, and value chain efficiencies—and combining with other datasets for new or expanded insights. The public sector could also create public–private partnerships by, for example, co-financing research and development with private-sector firms.
4. High costs of conducting face-to-face surveys in the field
As with data collection in general, the issues related to the practical application of mechanisms to collect qualitative, quantitative, or mixed data are interrelated. Theobald and Diebold (2018) [66] posited that weaknesses in non-agile data collection and management include interface problems related to project planning; controlling, reporting and approval; contracting and budgeting; process requirements; tooling and infrastructure; coordination and integration; and staffing. In general, it is not easy to obtain good cost estimates for household surveys, since funders tend to keep cost information confidential. Data for Development (2015) [73] compares the costs of different high-quality surveys based on computer-assisted in-person interviews (CAPIs). The analysis shows a per-survey cost that ranges from about USD 450,000 to 1,700,000 (Table 1).
Data for Development (2015) [73] and Dabalen et al. (2016) [33] highlight that “a typical complex, multi-topic household survey that is in the field for a year might cost around USD 140–150 per household—excluding technical assistance in sampling and data entry—and collect data on responses to roughly 3000 questions or about USD 0.06 per question, compared with USD 0.20 per question in a mobile phone survey”.
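A back-of-the-envelope calculation clarifies the trade-off implied by this quote: mobile surveys cost more per question but far less per interview, because they ask far fewer questions. In the sketch below, the face-to-face figures come from the quote, the mobile interview length is an assumed illustration, and small discrepancies with the quoted per-question figure are due to rounding in the source.

```python
# Back-of-the-envelope comparison of the figures quoted above. The face-to-face
# values come from the quote (about USD 150 per household, roughly 3000 questions);
# the mobile survey length (25 questions at USD 0.20 per question) is an assumption.

face_to_face_cost_per_household = 150.0
face_to_face_questions = 3000

mobile_cost_per_question = 0.20
mobile_questions = 25            # assumption: a short Agile-style module

ftf_per_question = face_to_face_cost_per_household / face_to_face_questions
mobile_per_interview = mobile_cost_per_question * mobile_questions

print(f"Face-to-face: ~USD {ftf_per_question:.2f}/question, USD {face_to_face_cost_per_household:.0f}/interview")
print(f"Mobile:        USD {mobile_cost_per_question:.2f}/question, ~USD {mobile_per_interview:.0f}/interview")
# Mobile costs more per question but an order of magnitude less per interview,
# which is what makes frequent, short surveys affordable.
```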
Digital technology and new survey modes can reduce these costs. Agile Data relies on fast and easily accessible methods of data collection based on remote technologies, such as computer-assisted telephone interviews (CATI), interactive voice response (IVR), mobile applications and sensors. Remote technologies seek to replace, at least partially, face-to-face interviews with voice calls from live operators (CATI), SMS, pre-recorded messages (IVR), or mobile and chat applications, enabling more frequent data collection at a lower cost. These technologies are often combined with remote sensing, which collects physical data that can be integrated into a geographic information system (GIS).
Table 2 presents an analysis of attributes of the different technologies associated with the modes of survey administration. CATI and remote surveys in general are more cost-effective for data collection, since mobile phone calls are usually cheap and can produce ready-to-analyze data in near real time [14,33,74,75]. Utilizing both SMS and direct phone calls, preliminary results from a pilot initiative in Botswana suggest that phone assessments can provide valid information, under certain conditions, at a fraction of the cost of face-to-face interviews [76].
Data for Development (2015) [73] attempts to compare these costs with the cost of a standard face-to-face survey, but there is no obvious and easy way to accomplish this given the differences in sample sizes, the frequency of data collection, the complexity of questions, and the number of questions per module. A recent IPA (2020) [77] study makes clear that self-administered modes—IVR and automated SMS—cost less to implement than CATI because they do not require the same personnel costs (human interviewers and supervisors), as shown in Figure 2. Further advantages of IVR compared to other communication channels, such as SMS and most mobile phone applications, are that voice messages or surveys can be recorded in different local languages and accessed on demand, and farmers can easily follow the voice message even if they do not know how to read. Results from a study in Ghana conducted by CGIAR (2022) show that farmers are willing to use mobile phones to receive agricultural information [24]. However, they prefer voice channels over text, which may be related to low education and literacy levels.
The positive attributes of Agile Data approaches compared to non-agile data are summarized in Table 3.

3.2. The Status of the Literature: Agile Data and Semi-Agile Approaches

In recent years, there has been a proliferation of semi-agile approaches to data. The term "semi-agile approach" refers to approaches that share characteristics with the Agile approach to data but have incomplete attributes. The World Bank, USAID, CGIAR, the World Food Program, Acumen, and many other development practitioners have used new digital technologies to introduce tools that inform decision-making in ways that were not possible before, supporting sustainable development outcomes through greater efficiency, agility, and performance.
In the following section, we review some of these tools in relation to the survey modes and some of the six Agile Data attributes. The characteristics of each tool in relation to these agile attributes are summarized in Table 4.
First Attribute: Diverse technology modes. The technologies used to capture the data are diverse and may also include using advanced statistical methods to reduce required sample sizes and using tools that can support rapid data collection: cell phones and tablets for survey implementation, SMS, mobile apps and IVR technology for remote data collection, and geospatial imagery from satellites. For example, CGIAR’s innovative 5Q approach, developed in 2015 and refined in 2021, uses smart-question trees (5Q-SQTs) to capture a farmer’s perceptions, monitor the effects of implemented activities, or evaluate adoption, among other purposes [18,75]. Questions are asked through IVR, and they are linked in a tree structure by branches and decision nodes, connecting a respondent’s answer choice to the subsequent question block. Another example is USAID’s Real Time Data For Adaptive Management (RTD4AM), which has heavily invested in mobile platforms and IVR technology [40]. USAID’s M-Posyandu project, for example, uses a smartphone application of the same name to gather real-time mobile data to improve the efficiency and quality of nutrition service decision-making and achieve national nutrition goals [90]. USAID’s Listening Post uses an interactive rural radio platform to provide broadcasts and radio mini-series on specific topics [91]. Listeners, mainly farmers, are then invited to participate in polls, ask questions, and offer opinions. The Listening Post project in Tanzania is a pilot initiative funded by the Bill and Melinda Gates Foundation and undertaken by Farm Radio International (FRI) [91]. In order to gather and analyze the mobile phone-based feedback, the project uses Uliza—a tool built with interactive voice response (IVR) by FRI. The system is built around IVR developed by Voto Mobile, and it enables listeners to vote in polls, leave messages, and request information.
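The following minimal sketch illustrates the kind of question-tree structure described above, in which each node holds one short question and an answer choice selects the next node. The questions and branching are hypothetical examples, not CGIAR’s actual 5Q content.

```python
# Minimal sketch of a smart-question tree of the kind described above: each node
# holds one short question and maps an answer choice to the next node. The
# questions and branching here are hypothetical, not CGIAR's actual 5Q content.

QUESTION_TREE = {
    "q1": {"text": "Did you plant the improved seed this season? (1=yes, 2=no)",
           "next": {"1": "q2a", "2": "q2b"}},
    "q2a": {"text": "Did the improved seed change your yield? (1=higher, 2=same, 3=lower)",
            "next": {"1": None, "2": None, "3": None}},
    "q2b": {"text": "What stopped you? (1=cost, 2=availability, 3=other)",
            "next": {"1": None, "2": None, "3": None}},
}


def run_ivr_session(answers):
    """Walk the tree with a pre-recorded list of keypad answers (simulating IVR input)."""
    node_id, asked = "q1", []
    for answer in answers:
        node = QUESTION_TREE[node_id]
        asked.append((node["text"], answer))
        node_id = node["next"].get(answer)
        if node_id is None:
            break
    return asked


for question, answer in run_ivr_session(["2", "1"]):
    print(f"Q: {question}\nA: {answer}")
```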
Second Attribute: Short and high frequency surveys. The World Bank and Acumen mainly use CATI technology, but by introducing modifications to the survey length. These surveys allow for near real-time survey data collection and make it possible to quickly cover wide areas. For example, WB’s Rapid Response Phone Surveys (RRPS) are quick surveys administered through CATI to households, businesses or firms, with each interview typically lasting less than 20 min [9]. Further, WB’s Survey of Well-Being via Instant and Frequent Tracking (SWIFT) is a low-cost, low-frequency survey (annual), to collect welfare information from project beneficiaries, as well as to monitor a project’s contributions to extreme poverty and shared prosperity by providing timely feedback to project teams. SWIFT requires only 7–10 min for each household interview, a few minutes for processing the data, and costs less than USD 100,000 per country to implement [13]. In Tanzania, SWIFT is being used to fill a critical data gap in mobile penetration across income groups by combining questions on mobile phone uptake and usage with consumption estimates.
The World Bank recently transformed its well-known, traditional, multi-country and multi-round face-to-face Living Standards Measurement Study—Integrated Surveys on Agriculture (LSMS-ISA) program into a high-frequency monthly phone survey following the outbreak of the COVID-19 pandemic [10]. The proposed High-Frequency Phone Survey (HFPS) in each of the five LSMS-ISA countries (Uganda, Ethiopia, Nigeria, Malawi, Tanzania) will track the responses to and economic impacts of COVID-19 by conducting monthly phone interviews with a national sample of households that had been interviewed during the latest round of the LSMS-ISA-supported national longitudinal household survey and/or an alternative, recent, nationally representative, cross-sectional survey that may also be available. Each month, the HFPS households will receive a core set of questions primarily to capture the economic impacts of COVID-19, and these questions will be complemented by rotational questions on select topics that will be introduced each month and kept to an agreed length. Within the core set of questions administered in each country, a selection of these will be comparable across countries. The monthly interview with each HFPS household will not exceed 20 min.
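The survey design described above—a fixed core module plus rotating topical modules within a 20-min interview budget—can be sketched as follows; the module names and per-module durations are illustrative assumptions rather than the actual HFPS instrument.

```python
# Sketch of the monthly questionnaire assembly described above: a fixed core
# module plus rotating topical modules, kept within a 20-minute interview budget.
# Module names and per-module durations are illustrative assumptions.

CORE = ("COVID-19 economic impacts", 8)          # (module, minutes)
ROTATION = [("agriculture", 10), ("education", 9), ("food security", 10), ("coping strategies", 8)]
TIME_BUDGET_MIN = 20


def questionnaire_for_month(month_index: int):
    modules, minutes = [CORE], CORE[1]
    rotating = ROTATION[month_index % len(ROTATION)]
    if minutes + rotating[1] <= TIME_BUDGET_MIN:
        modules.append(rotating)
        minutes += rotating[1]
    return modules, minutes


for m in range(3):
    mods, total = questionnaire_for_month(m)
    print(f"Month {m + 1}: {[name for name, _ in mods]} ({total} min)")
```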
Acumen, inspired in part by the principles emerging from the Lean Research Initiative that also included MIT D-Lab and the Fletcher School at Tufts University in 2015, developed an approach to what they termed LeanData [21,92,93]. This approach has been based on telephony to essentially reduce the time and costs of data gathering. For example, it captured information on poverty via the Progress out of Poverty Index (PPI—now Poverty Probability Index). They further developed Toolkits on Gender and Climate Resilience with applications for customers in a Resilient Agriculture Fund [22,23,94].
Third Attribute: Agile and adaptive design and monitoring. Most of the tools developed by the main development practitioners seem to have embraced the logic of Agile project management over the waterfall model. Feedback loops are indeed present in the WFP’s Mobile Vulnerability Analysis and Mapping approach (mVAM), WB’s Iterative Beneficiary Monitoring (IBM), USAID’s Rapid Feedback MERL (RF-MERL), USAID’s RTD4AM, and CGIAR’s 5Q approach. These approaches complement traditional M&E methods by increasing the frequency of stakeholder consultation to understand how project activities are performing, providing timely information for corrective action. USAID’s RF-MERL in Tanzania, for example, has implemented regular “learning checks” in which all partners come together to reflect on the findings, brainstorm ways to refine implementation, and iterate accordingly, in order to strengthen community engagement in children’s learning. With USAID’s M-Posyandu, counselors can input monthly information about children through the mobile phone application, which automatically processes growth measurements. The system also flags nutritional risk, allowing counselors to tailor health messages for parents in real time. All measurements are stored in electronic health records that are available in real time or nearly real time at sub-district and district levels, where they trigger responses by health care officials and NGO staff. Counselors who used mobile phones were more likely to provide feedback on their sessions, and the system accelerated the process of nutrition data collection and improved data accuracy by 80%.
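A minimal sketch of this kind of automated growth monitoring and risk flagging is shown below. It assumes a weight-for-age z-score computed elsewhere from WHO growth reference tables and uses the standard WHO cutoffs of −2 and −3; it is an illustration, not the actual M-Posyandu logic.

```python
# Minimal sketch (not the actual M-Posyandu logic) of automated growth monitoring:
# a counselor records a monthly measurement, the system classifies nutritional risk
# from a weight-for-age z-score, and the record is pushed to a district dashboard.
# The z-score is assumed to be computed elsewhere from WHO growth reference tables;
# the -2 / -3 cutoffs are the standard WHO thresholds for (severe) underweight.

from datetime import date


def classify_risk(weight_for_age_z: float) -> str:
    if weight_for_age_z < -3:
        return "severely underweight"
    if weight_for_age_z < -2:
        return "underweight"
    return "normal"


def monthly_record(child_id: str, weight_for_age_z: float) -> dict:
    record = {
        "child_id": child_id,
        "date": date.today().isoformat(),
        "waz": weight_for_age_z,
        "risk": classify_risk(weight_for_age_z),
    }
    if record["risk"] != "normal":
        record["alert"] = "notify district health officer"   # real-time feedback loop
    return record


print(monthly_record("CH-042", weight_for_age_z=-2.4))
```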
The Institute of Development Studies conducted a detailed research study exploring whether and how the USAID Listening Post could support adaptive management processes [95]. The research found that Listening Post has demonstrated its potential to collect real-time feedback from farmers that could be used to aid decision-making and improve accountability in agricultural development initiatives, helping to ensure they are more responsive to farmers. WB’s IBM has been tested for a wide range of topics, from school meals and fertilizer subsidies to free medical care, and in different countries, including Mali, Niger, and Nigeria. IBM’s feedback mechanism led to notable improvements in project implementation: more students receive school meals, more farmers receive fertilizer vouchers, and more women have access to social protection than would have been the case without IBM. It helps to monitor project investments and the reach of beneficiaries, supporting social inclusion. CGIAR’s 5Q uses simple sets of questions, part of a logic-question-tree structure, implemented in multiple rounds, thus reducing the burden on respondents while rapidly and frequently providing feedback to project implementers.
Fourth and Fifth Attributes: Farmer-centric and open data principles. Although these tools often implement an agile feedback mechanism and deploy data in dashboards, such as WB’s RRPS, CGIAR’s 5Q and USAID’s RTD4AM, they show limits concerning open data principles, data ownership and a farmer-centric approach. The data infrastructure process, in the form of database ownership and data management systems, is often not clear, except for the case of WB’s LSMS, in which data are public. Further, experience from the field shows that it is difficult to share data back directly with respondents. The research conducted by IDS found that USAID’s Listening Post has demonstrated its potential to collect real-time feedback from farmers that could be used to aid decision-making. However, it also concludes that “closing the feedback loop”—ensuring that a farmer’s comments, questions and concerns are responded to—is a challenge for the Listening Post. Some results have been reached by WFP through the Mobile Vulnerability Analysis and Mapping (mVAM) approach. This tool is used to collect data on households’ vulnerability and food security [20]. There are three modes of data collection embedded in this tool: CATI, through live calls from a call center; SMS surveys; and interactive voice response (IVR). In Somalia, the operators who place outgoing calls also take incoming calls from beneficiaries all over the country. In the Democratic Republic of Congo, WFP set up an IVR system to respond to questions from beneficiaries. WFP is currently working on two fronts: webpages and chatbots. Through chatbots, respondents are contacted on Telegram via their smartphones and asked a series of questions about their food security and livelihood situations just as they would be by phone, SMS, or on other mVAM modalities. WFP is also working with Facebook’s Free Basics platform. With Free Basics, people can access relevant information for free via their internet-enabled phones. WFP works in over 50 countries, and Free Basics is currently being piloted in Malawi, where people can access weekly market price data, market news, and a polling function that allows users to take simple surveys and provide feedback, which mVAM collects on a website.
Sixth Attribute: Interoperability. There is so far limited clarity about how these tools ensure consistent data collection, exchange and the dissemination of accurate and standardized information. In some cases, when the data collection efforts are repeated over multiple years, as in the case of LSMS, the benefits of digital technologies are maximized since there are specific standards to harmonize the ways data are collected, processed, stored, and shared.

4. Discussion

As a primary goal of this article, we examined the changing pathways through which digital technologies and adaptive methods can accelerate the gathering and use of data, particularly for development objectives. Reviewing the recent applications of agile approaches in the field of agricultural development, we offer evidence to define the salient characteristics that distinguish Agile Data through six crucial attributes.
As developers and users of such approaches, we note that there are still limitations and challenges and, in this section, we discuss those and propose potential solutions that could be tested so as to further develop what we believe could be useful data tools, especially in difficult or remote regions.

4.1. Lack of Digital and Physical Infrastructure: Potential Bias and Solutions

First, it is clear that technological innovation is central to enabling such Agile Data approaches. Second, the low cost and relative scalability allow access to much greater numbers of beneficiaries and thus permit a richer understanding and the ability to segment groups (by age, gender, income, etc.) to observe the effect of different treatments, capacities, or conditions. Agile Data can thus serve to assess the inclusivity of participation, particularly in regard to gender or minorities.
However, there is a substantial challenge arising from the different levels of access to digital technologies that, if not properly addressed, can exclude some of the most vulnerable, manifest as a data or sampling bias, and possibly deepen the digital divide [96,97,98].
For example, the potential of Agile Data to conduct high-frequency inquiries requires that data collection happen via a digital medium. However, the completeness or quality of the data collected can be somewhat dependent on respondents’ access to mobile devices. According to Jarvis et al. (2015) [75], mobile-cellular subscriptions had grown in the five years prior to their study, especially in Asia and Africa. However, a gender gap still exists in low- to middle-income countries (LMICs), where women are on average eight percent less likely to own a mobile phone than men and use a smaller range of mobile phone services, such as SMS and internet access [99]. Mobile user data, not surprisingly, also show that the proportion of people who use smartphones to access the internet decreases with age and increases with educational attainment and household income [89]. In particular, literacy is crucial for the use of many digital technologies. Farmers in developing countries and smallholders in general may lack the skills and knowledge to reap the full benefits of digital applications that are available to them [100]. The fact that only a fraction of the population uses mobile phones, and this subpopulation may not have the same characteristics or behaviors as the population of interest, can easily generate a sampling bias [25,101,102]. In other words, the sample could systematically under-represent some groups, notably migrants, younger individuals, the poor, women, and people who do not have the skills or the capacity to use the technology offered [77,79].
Some studies have directly provided telephones to respondents in order to fill a critical data gap in mobile penetration across income groups and to understand how they access digital technology and use phones across socio-economic groups within the country [103,104]. Given the high costs of a single in-person survey, this can be cost-effective. The results have been used for evidence-based policy recommendations to increase access to mobile phone and internet technology for the poorest segments of the population.
Agile Data approaches can counter some of the exclusionary aspects of the digital divide and foster uptake and inclusivity with technologies whose simpler, more intuitive interfaces help reduce resistance [105]. Further, new technologies such as IVR and other voice-based applications could help to overcome some of the bias inherent in participants’ levels of literacy. These types of efforts can help to overcome the lack of technological capacity among some users in the field and, therefore, facilitate adoption and use of these data approaches.

4.2. Adoption of Digital Technologies and Related Incentives: Toward a Farmer-Centric Approach Characterized by Data Democracy

A selection bias may occur based on the level of interest of the participants in the medium of data collection—especially with relatively novel mobile apps where the participants can tend to self-select. This opens some possibilities of unintended exclusions or even multiple inclusions of the same party, leading to sample frame errors and skewed results. The World Bank (2020) [11] proposes to use stratified sampling to overcome sampling bias and a system of incentives to increase participants’ response rates. According to the World Bank (2020) [11], stratification based on forecast (ex ante) characteristics helps to balance the sample, and some of the literature notes that incentives for on-farm adoption of digital technologies are based on their perceived costs and benefits [43].
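A minimal sketch of the proportional stratified sampling idea referenced above is shown below; the strata and frame counts are hypothetical, and a real design would also adjust for expected response rates.

```python
# Sketch of proportional stratified sampling of a phone-survey frame: the sample
# is allocated across ex ante strata (e.g., region x gender of respondent) so that
# smaller groups are still represented. Strata and population counts are hypothetical.

import random

frame_counts = {                # stratum -> number of phone numbers in the frame
    ("north", "female"): 1200,
    ("north", "male"): 1800,
    ("south", "female"): 700,
    ("south", "male"): 1300,
}
target_sample = 400
total = sum(frame_counts.values())

allocation = {s: round(target_sample * n / total) for s, n in frame_counts.items()}
print(allocation)   # e.g., {('north', 'female'): 96, ('north', 'male'): 144, ...}

# Drawing the sample within one stratum (IDs would come from the survey frame):
north_female_ids = [f"NF-{i:04d}" for i in range(frame_counts[("north", "female")])]
sample = random.sample(north_female_ids, allocation[("north", "female")])
```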
There is some consensus that monetary incentives, the most widely tested, increase response rates by reducing refusal rates, but do so with diminishing returns over time and even as the size of the incentives increases [75,82,106]. We note from a prior COSA and World Bank effort in Indonesia that monetary incentives can be insufficient to ensure that accurate data are provided by respondents. Garbage in, garbage out is a clear concern that can undermine other efforts or methods to ensure data rigor. COSA and its partners, ICED and GDi, will test whether farmer-submitted data are more accurate when those same data are analyzed and offered back as useful benchmarks. For example, knowing one’s input-use efficiency (e.g., labor or fertilizer costs per kilo or ton produced) relative to other farmers in the same zone and growing the same crops can offer valuable insights, but only if the data submitted to the algorithm are accurate.
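The benchmarking feedback described here can be sketched as follows: a farmer’s fertilizer cost per kilogram produced is compared with the median of peers in the same zone growing the same crop. All records and values are hypothetical.

```python
# Sketch of the benchmark feedback described above: compute a farmer's fertilizer
# cost per kg produced and compare it with peers in the same zone and crop.
# All records are hypothetical illustrations.

from statistics import median

records = [  # (farmer_id, zone, crop, fertilizer_cost_usd, output_kg)
    ("F-001", "zone-A", "maize", 120, 2400),
    ("F-002", "zone-A", "maize", 150, 2000),
    ("F-003", "zone-A", "maize", 90, 2250),
    ("F-004", "zone-B", "maize", 110, 1800),
]


def benchmark(farmer_id: str) -> str:
    me = next(r for r in records if r[0] == farmer_id)
    peers = [r for r in records if r[1] == me[1] and r[2] == me[2] and r[0] != farmer_id]
    my_cost = me[3] / me[4]
    peer_median = median(r[3] / r[4] for r in peers)
    position = "below" if my_cost < peer_median else "above"
    return (f"Your fertilizer cost is {my_cost:.3f} USD/kg, {position} the "
            f"median of {peer_median:.3f} USD/kg for {me[2]} growers in {me[1]}.")


print(benchmark("F-001"))
```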
Behavioral drivers of adoption (e.g., risk aversion) and socio-economic characteristics such as land and farm size, together with the capacity of the technology to satisfy farmers’ needs, are crucial features to be considered in the adoption of any particular digital technology [43,95,107].
Embracing a farmer-centric approach that directly benefits farmers and ensures their data ownership and access offers potentially longer-lasting opportunities to empower their active participation in the data effort. Farmers are often reluctant to adopt digital technologies due to a lack of trust between them and the third-party actors who collect, aggregate, and share data. In other words, unclearly defined data ownership, access, and control rights could lead to data misuse, eroding farmers’ trust in digital technologies and discouraging their adoption [43]. The European Parliamentary Research Service and the German Agricultural Society both hold that the farmer owns the data originating from his or her farm [47,74]. Although data ownership is important, we need to also guarantee crucial access to data and productive data use by farmers [108].
Perhaps the most valuable corollary feature of Agile Data is that it can incorporate a system of creative incentives to participate in the data collection efforts and to provide accurate data. Data providers or beneficiaries can be rapidly engaged with benchmarking and use of their data to tailor functional knowledge within a process known as Data Democracy. When functioning as open digital platforms and anonymized databases, they can help ensure that the voice of the farmer is heard directly.

4.3. Data Protection Challenges: Need for Clearly Defined Data Rights

Agile Data can contribute to more timely and more informed decision making with targeted management feedback loops that may include sensitive information.
However, since digital technologies can collect new types and large amounts of farm data, often with geotagging or other identifiers, it is important to limit access and safeguard farmers’ rights to their data. Laws addressing the ownership or use of data from digital agriculture are frequently either missing or inadequate, particularly in low- and middle-income countries. Farmers who use digital technologies may tend to share disproportionately more of their data, sometimes inadvertently, which can exacerbate privacy issues and reduce their control. Clearly defined data rights could encourage technology adoption, while unclearly defined data ownership, access, and control rights could lead to data misuse, eroding farmers’ trust in digital technologies and thus potentially slowing or discouraging their adoption.

5. Conclusions

In this paper, we have offered a more systematic understanding and characterization of Agile Data based on an analysis of the evidence in recent literature and our own development of the concepts. This is part of an effort to help apply the principles more widely to improve how data can better serve farmers and rural communities as well as the organizations, such as supply chains, development organizations and governments, that rely on or invest in those farms and communities.
We have further specifically offered a definition of Agile Data and highlighted how an Agile Data approach can allow monitoring, evaluation and learning (MEL) to be based on consistent standardized data, undertaken regularly in the field to understand individual projects and to better target project beneficiaries through active feedback loops [31]. This approach works across categories, crops, and geographies, whether for food security, poverty reduction, climate adaptation, gender inclusion, resilience or income generation. It also offers timely and frequent access to low-cost information that can help overcome conditions of limited time and resources for data.
Nevertheless, we highlighted concerns related to the digital divide and potential failures to be inclusive that need to be actively considered if these approaches are to reduce those barriers to equity and advancement. We also caution that the proliferation of novel data approaches can be detrimental if not well-governed from the perspectives of both privacy and data protection. It is a topic that must be actively addressed by government, policy makers, and programs. Implementing appropriate data security measures will be important, as will inclusive communication to foster trust in the system that is necessary for its acceptance and growth. We propose that further research is needed in four key areas:
  • To understand the effect of these digital technologies on inclusivity
  • To determine data accuracy, provenance, and veracity
  • To determine the extent to which approaches can be designed to be interoperable with advanced data approaches such as those employed by the LSMS and CGIAR centers
  • To better understand the potential and limitations of the range of benefits related to a farmer-centric approach, including the progression to shared local or regional data ecosystems.

Author Contributions

Conceptualization, E.S., D.G., D.A. and A.B.; methodology, E.S. and R.B.; validation, E.S., D.G. and D.A.; formal analysis, E.S.; investigation, E.S.; resources, D.G.; writing—original draft preparation, E.S., D.G., R.B., T.A.N.W., D.A., A.B. and R.M.; writing—review and editing, D.G. and E.S.; visualization, E.S.; supervision, E.S.; project administration, E.S.; funding acquisition, D.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Bill and Melinda Gates Foundation grant number INV-031160.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Carletto, C. Better data, higher impact: Improving agricultural data systems for societal change. Eur. Rev. Agric. Econ. 2021, 48, 719–740. [Google Scholar] [CrossRef]
  2. Food and Agriculture Organization. Investing in Data for SDGs: Why Good Numbers Matter; FAO: Rome, Italy, 2019. [Google Scholar]
  3. UN. The Sustainable Development Goals Report; United Nations: Rome, Italy, 2021. [Google Scholar]
  4. UNECE. Measuring and Monitoring Progress toward the Sustainable Development Goals; UNECE: Geneva, Switzerland, 2020. [Google Scholar]
  5. Fraval, S.; Hammond, J.; Wichern, J.; Oosting, S.J.; De Boer, I.J.M.; Teufel, N.; Lannerstad, M.; Waha, K.; Pagella, T.; Rosenstock, T.S.; et al. Making the most of imperfect data: A critical evaluation of standard information collected in farm household surveys. Exp. Agric. 2019, 55, 230–250. [Google Scholar] [CrossRef] [Green Version]
  6. Dillon, A.; Lakshman, R. Land Measurement Bias: Comparisons from Global Positioning System, Self-Reports, and Satellite Data. SSRN Electron. J. 2018, 10, 2139. [Google Scholar]
  7. Gaddis, I.; Oseni, G.; Palacios-López, A.; Pieters, J. Measuring Farm Labor: Survey Experimental Evidence from Ghana; World Bank Policy Research Working Paper; The World Bank: Washington, DC, USA, 2019. [Google Scholar]
  8. Wollburg, P.; Tiberti, M.; Zezza, A. Recall Length and Measurement Error in Agricultural Surveys; World Bank Working Paper; The World Bank: Washington, DC, USA, 2020. [Google Scholar]
  9. World Bank. Conducting Rapid Response Phone Surveys (RRPS) to Fill Data Gaps; The World Bank: Washington, DC, USA, 2020. [Google Scholar]
  10. World Bank. Capitalizing on the World Bank LSMS-ISA Program for High-Frequency Phone Surveys on COVID-19; The World Bank: Washington, DC, USA, 2020. [Google Scholar]
  11. World Bank. High Frequency Mobile Phone Surveys of Households to Assess the Impacts of COVID-19; Guidelines on Sampling Design; The World Bank: Washington, DC, USA, 2020. [Google Scholar]
  12. World Bank. Iterative Beneficiary Monitoring (IBM) as a Cost-Effective Tool for Improving Project Effectiveness; The World Bank: Washington, DC, USA, 2019. [Google Scholar]
  13. World Bank. Survey of Well-Being via Instant and Frequent Tracking (SWIFT): Estimating Consumption for Household Poverty Measurement; A Rapid Assessment Tool; The World Bank: Washington, DC, USA, 2015. [Google Scholar]
  14. World Bank. The World Bank Listening to LAC (L2L) Pilot Final Report; The World Bank: Washington, DC, USA, 2012. [Google Scholar]
  15. USAID. Rapid Feedback Monitoring, Evaluation, Research and Learning; USAID: Arlington, VA, USA, 2021.
  16. USAID. Lab Evaluation, Research and Learning Plan; Evidence Brief; USAID: Arlington, VA, USA, 2019.
  17. USAID. Bridging Real-Time Data and Adaptive Management: Case Study Report; USAID: Arlington, VA, USA, 2017.
  18. CGIAR. The 5Q Approach for Gender Data on Empowerment in Climate Adaptation Projects: Case Study in Ghana; CGIAR: Washington, DC, USA, 2022. [Google Scholar]
  19. CGIAR. Collecting Development Data with Mobile Phone: Key Considerations from a Review of the Evidence; CGIAR: Washington, DC, USA, 2017. [Google Scholar]
  20. World Food Program. Remote Food Security Monitoring. Introduction to Mobile Vulnerability Analysis and Mapping; The World Bank: Washington, DC, USA, 2017. [Google Scholar]
  21. Acumen. The Lean Data Field Guide; Acumen: Durham, UK, 2015. [Google Scholar]
  22. Acumen; Unilever. A Lean Data How to Guide. Understanding Gender Impact Phase 2; Acumen: Durham, UK; Unilever: London, UK, 2015. [Google Scholar]
  23. Acumen; Unilever. A Lean Data How to Guide. Understanding Gender Impact Phase 1; Acumen: Durham, UK; Unilever: London, UK, 2015. [Google Scholar]
  24. Committee On Sustainability Assessment. Data Democracy: How to Radically Alter the World of Small Farmers; Blog Series; COSA: Philadelphia, PA, USA, 2020. [Google Scholar]
  25. Barder, O. Science to Deliver, but “No Science of Delivery”. Center for Global Development Blog; SSRN: Washington, DC, USA, 2013. [Google Scholar]
  26. Sommer, A.F.; Dukovska-Popovska, I.; Steger-Jensen, K. Barriers towards integrated product development—Challenges from a holistic project management perspective. Int. J. Proj. Manag. 2013, 32, 970–982. [Google Scholar] [CrossRef]
  27. Bassil, Y. A simulation model for the waterfall software development life cycle. Int. J. Eng. Technol. 2012, 2, 742–749. [Google Scholar]
  28. Gourlay, S.; Kilic, T.; Lobell, D. A new spin on an old debate: Errors in farmer-reported production and their implications for inverse scale—Productivity relationship in Uganda. J. Dev. Econ. 2019, 141, 102376. [Google Scholar] [CrossRef]
  29. Carletto, C.; Gourlay, S.; Murray, S.; Zezza, A. Cheaper, Faster, and More Than Good Enough: Is GPS the New Gold Standard in Land Area Measurement? 2017. Available online: https://policycommons.net/artifacts/1291524/cheaper-faster-and-more-than-good-enough/1894488/ (accessed on 23 August 2022).
  30. Sourav, A.I.; Emanuel, A.W.R. Recent Trends of Big Data in Precision Agriculture: A Review. In Proceedings of the IOP Conf. Series: Materials Science and Engineering, Jakarta, Indonesia, 20 October 2020. [Google Scholar]
  31. Anderson, J.; Karuppusamy, R.; Neumann, P.E.; Miller, H.; Tamara, R. Smallholder Households: Distinct Segments, Different Needs; CGAP Focus Note: Washington, DC, USA, 2019; Volume 111. [Google Scholar]
  32. Carletto, C.; Zezza, A.; Banerjee, R. Towards Better Measurement of Household Food Security: Harmonizing Indicators and the Role of Household Surveys. Glob. Food Secur. 2013, 2, 30–40. [Google Scholar] [CrossRef]
  33. Dabalen, A.L.; Etang Ndip, A.; Hoogeveen, J.G.; Mushi, E.; Schipper, Y.; Engelhardt, J.V. Mobile Phone Panel Surveys in Developing Countries: A Practical Guide for Microdata Collection; World Bank Group: Washington, DC, USA, 2016. [Google Scholar]
  34. Velthausz, D.; Donco, R.; Skelly, H.; Eichleay, M. Mozambique Mobile Access and Usage Study: Computer-Assisted Telephone Interview (CATI) Survey Results. United States Agency for International Development: Washington, DC, USA, 2016. [Google Scholar]
  35. Zilberman, D.; Khanna, M.; Lipper, L. Economics of Sustainable Agriculture. Aust. J. Agric. Res. Econ. 1997, 41, 63–80. [Google Scholar] [CrossRef] [Green Version]
  36. Bauer, J.M.; Akakpo, K.; Enlund, M.; Passeri, S. Tracking vulnerability in real time. Mobile text for food surveys in Eastern Democratic Republic of Congo. Afr. Policy J. 2014, 9, 36. [Google Scholar]
  37. Catholic Relief Service. Measurement Indicators for Resilience Analysis (MIRA II Phase II); Final Report; Catholic Relief Service: Baltimore, MD, USA, 2017. [Google Scholar]
  38. Harvey, C.A.; Rakotobe, Z.L.; Rao, N.S.; Dave, R.; Razafimahatratra, H.; Rabarijohn, R.H.; Rajaofara, H.; MacKinnon, J.L. Extreme vulnerability of smallholder farmers to agricultural risks and climate change in Madagascar. Philos. Trans. R. Soc. B 2014, 369, 20130089. [Google Scholar] [CrossRef] [Green Version]
  39. McCarthy, H.; Potts, H.W.W.; Fisher, A. Physical Activity Behavior Before, During, and After COVID-19 Restrictions: Longitudinal Smartphone-Tracking Study of Adults in the United Kingdom. J. Med. Internet Res. 2021. Available online: https://www.jmir.org/2021/2/e23701/PDF (accessed on 23 August 2022).
  40. Wanjala, M.Y.; Iravo, M.A.; Odhiambo, R.; Shalle, N.I. Effect of Monitoring Techniques on Project Performance of Kenyan State Corporations. Eur. Sci. J. 2017, 13, 269. [Google Scholar]
41. Aker, J.C. Dial “A” for agriculture: A review of information and communication technologies for agricultural extension in developing countries. Agric. Econ. 2011, 42, 631–647. [Google Scholar] [CrossRef]
  42. Fabregas, R.; Kremer, M.; Schilbach, F. Realizing the potential of digital development. The case of agricultural advice. Science 2019, 366, eaay3038. [Google Scholar] [CrossRef] [Green Version]
  43. Schroeder, K.; Lampietti, J.; Elabed, G. What’s Cooking: Digital Transformation of the Agrifood System; Agriculture and Food Series; World Bank: Washington, DC, USA, 2021. [Google Scholar] [CrossRef]
44. Fabregas, R.; Kremer, M.; Schilbach, F. Realizing the Potential of Digital Development: The Case of Agricultural Advice. Science 2019, 366, eaay3038. [Google Scholar]
  45. Garlick, R.; Orkin, K.; Quinn, S. Call me maybe: Experimental evidence on frequency and medium effects in microenterprise surveys. World Bank Econ. Rev. 2020, 34, 418–443. [Google Scholar] [CrossRef]
46. Kilic, T.; Moylan, H.; Ilukor, J.; Mtengula, C.; Pangapanga-Phiri, I. Root for the tubers: Extended-harvest crop production and productivity measurement in surveys. Food Policy 2021, 102, 102033. [Google Scholar]
47. Fabregas, R.; Kremer, M.; Lowes, M.; On, R.; Zane, G. SMS-Extension and Farmer Behavior: Lessons from Six RCTs in East Africa; ATAI Working Paper; ATAI: Cambridge, MA, USA, 2019. [Google Scholar]
  48. Lamanna, C.; Hachhethu, K.; Chesterman, S.; Singhal, G.; Mwongela, B.; Ng’endo, M.; Passeri, S.; Farhikhtah, A.; Kadiyala, S.; Bauer, J.-M.; et al. Strengths and limitations of computer assisted telephone interviews (CATI) for nutrition data collection in rural Kenya. PLoS ONE 2019, 14, e0210050. [Google Scholar] [CrossRef] [Green Version]
  49. Carletto, C.; Savastano, S.; Zezza, A. Fact or artifact: The impact of measurement errors on the farm size–productivity relationship. J. Dev. Econ. 2013, 103, 254–261. [Google Scholar] [CrossRef]
50. Furbush, A.; Josephson, A.; Kilic, T.; Michler, J.D. The Evolving Socioeconomic Impacts of COVID-19 in Four African Countries; World Bank Policy Research Paper; The World Bank: Washington, DC, USA, 2021. [Google Scholar]
51. Arthi, V.; Beegle, K.; De Weerdt, J.; Palacios-López, A. Not your average job: Measuring farm labor in Tanzania. J. Dev. Econ. 2017, 130, 160–172. [Google Scholar] [CrossRef] [Green Version]
  52. Benami, E.; Carter, M.R. Can digital technologies reshape rural microfinance? Implications for savings, credit, & insurance. Appl. Econ. Perspect. Policy 2021, 43, 1196–1220. [Google Scholar]
53. Lau, C.Q.; Cronberg, A.; Marks, L.; Amaya, A. In Search of the Optimal Mode for Mobile Phone Surveys in Developing Countries: A Comparison of IVR, SMS, and CATI in Nigeria. Surv. Res. Methods 2019, 13, 305–318. [Google Scholar]
  54. Sampaio, F.G.; Araújo, C.A.S.; Dallago, B.S.L.; Stech, J.L.; Lorenzetti, J.A.; Alcantara, E.; Losekann, M.E.; Marin, D.B.; Leão, J.Q.A.; Bueno, G.W. Unveiling low to high frequency data sampling caveats for aquaculture environmental monitoring and management. Aquac. Rep. 2021, 20, 100764. [Google Scholar] [CrossRef]
  55. Abay, K.A.; Abate, G.T.; Barrett, C.B.; Bernard, T. Correlated non-classical measurement errors, ‘second best’ policy inference, and the inverse size-productivity relationship in agriculture. J. Dev. Econ. 2019, 139, 171–184. [Google Scholar] [CrossRef] [Green Version]
  56. Bevis, L.; Barrett, C. Close to the edge: High productivity at plot peripheries and the inverse size-productivity relationship. J. Dev. Econ. 2020, 143, 102377. [Google Scholar] [CrossRef]
  57. Carletto, C.; Gourlay, S.; Winters, P. From guesstimates to GPS estimates: Land area measurement and implications for agricultural analysis. J. Afr. Econ. 2015, 24, 593–628. [Google Scholar] [CrossRef] [Green Version]
  58. Desiere, S.; Jolliffe, D. Land productivity and plot size: Is measurement error driving the inverse relationship? J. Dev. Econ. 2018, 130, 84–98. [Google Scholar] [CrossRef] [Green Version]
  59. Casaburi, L.; Kremer, M.; Ramrattan, R. Crony Capitalism, Collective Action, and ICT: Evidence from Kenyan Contract Farming. PEDL Research Paper. 2019. Available online: https://www.atai-research.org/crony-capitalism-collective-action-and-ict-evidence-from-kenyan-contract-farming/ (accessed on 23 August 2022).
  60. Lobell, D.; Tommaso, S.; You, C.; Djima, I.; Burke, M.; Kilic, T. Sight for Sorghums: Comparisons of satellite- and ground-based sorghum yield estimates in Mali. Remote Sens. 2019, 12, 100. [Google Scholar] [CrossRef]
61. Carter, M.R.; de Janvry, A.; Sadoulet, E.; Sarris, A. Index Insurance for Developing Country Agriculture: A Reassessment. Annu. Rev. Resour. Econ. 2017, 9, 421–438. [Google Scholar] [CrossRef]
62. Flatnes, J.E.; Carter, M.R.; Mercovich, R. Improving the Quality of Index Insurance with a Satellite-Based Conditional Audit Contract; Working Paper; Department of Agricultural and Resource Economics, University of California: Berkeley, CA, USA, 2018. [Google Scholar]
63. Amankwah, A.; Gourlay, S. Household Agriculture and Food Security in the Face of COVID-19: Evidence from Five Sub-Saharan Countries. In Proceedings of the International Association of Agricultural Economists Virtual Conference, New Delhi, India, 17 August 2021. [Google Scholar]
64. Hoogeveen, J.; Pape, U. Data Collection in Fragile States: Innovations from Africa and Beyond; Palgrave MacMillan Edition; Springer Nature: Cham, Switzerland, 2020. [Google Scholar]
65. Food Security Information Network (FSIN). Quantitative Analyses for Resilience Measurement; FSIN: Singapore, 2016. [Google Scholar]
  66. Theobald, S.; Diebold, P. Interface Problems of Agile in a Non-Agile Environment. In Proceedings of the Agile Processes in Software Engineering and Extreme Programming 18th International Conference, Cologne, Germany, 22–26 May 2017; pp. 123–130. [Google Scholar]
  67. Saner, R.; Yiu, L.; Nguyen, M. Monitoring the SDGs: Digital and Social Technologies to ensure citizen participation, inclusiveness and transparency. Dev. Policy Rev. 2019, 38, 483–500. [Google Scholar] [CrossRef]
68. Leo, B.; Morello, R.; Mellon, J.; Peixoto, T.; Davenport, S.T. Do Mobile Phone Surveys Work in Poor Countries? Center for Global Development Working Paper; SSRN: Washington, DC, USA, 2015. [Google Scholar]
  69. Food and Agriculture Organization of the United Nations, International Telecommunication Union. E-agriculture Strategy Guide: Piloted in Asia-Pacific Countries; FAO; ITU: Bangkok, Thailand, 2016. [Google Scholar]
  70. Giovannucci, D. How New Metrics for Sustainable Agriculture Can Align the Roles of Government and Business; UN Global Sustainable Development Report Science Briefs; United Nations: Brussels, Belgium, 2015. [Google Scholar]
  71. Blundo, C.G.; Faure, G.; Hainzelin, E.; Monier, C.; Triomphe, B.; Vall, E. Impress Ex Ante. An Approach for Building Ex Ante Impact Pathways; CIRAD: Montpellier, France, 2018; 64p. [Google Scholar]
  72. Food and Agriculture Organization. Rima-II. Resilience Index Measurement and Analysis; FAO: Rome, Italy, 2016. [Google Scholar]
73. Data for Development: A Needs Assessment for SDG Monitoring and Statistical Capacity Development; Sustainable Development Solutions Network: New York, NY, USA, 2015. [Google Scholar]
  74. European Parliamentary Research Service. Precision Agriculture in Europe: Legal, Social, and Ethical Considerations; European Union: Brussels, Belgium, 2017. [Google Scholar]
  75. Jarvis, A.; Eitzinger, A.; Koningstein, M.; Benjamin, T.; Howland, F.; Andrieu, N.; Twyman, J.; Corner-Dolloff, C. Less Is More: The 5Q Approach; CIAT Scientific Report; International Center for Tropical Agriculture: Cali, Colombia, 2015. [Google Scholar]
76. Angrist, N.; Bergman, P.; Brewster, C.; Matsheng, M. Stemming Learning Loss during the Pandemic: A Rapid Randomized Control Trial of a Low-Tech Intervention in Botswana; CSAE Working Paper; SSRN: Rochester, NY, USA, 2020. [Google Scholar]
  77. IPA and Northwestern. Remote Surveying in a Pandemic: Research Synthesis; IPA and Northwestern: Washington, DC, USA, 2020. [Google Scholar]
  78. Pariyo, G.W.; Greenleaf, A.R.; Gibson, D.G.; Ali, J.; Selig, H.; Labrique, A.B.; Al Kibria, G.M. Does mobile phone survey method matter? Reliability of computer-assisted telephone interviews and interactive voice response non-communicable diseases risk factor surveys in low and middle income countries. PLoS ONE 2019, 14, e0214450. [Google Scholar] [CrossRef]
79. Benson, T. Can Mobile Phone-Based Household Surveys in Rural Papua New Guinea Generate Information Representative of Population Surveyed? IFPRI Project Paper; International Food Policy Research Institute: Washington, DC, USA, 2019. [Google Scholar]
  80. Dillon, B. Using mobile phones to collect panel data in developing countries. J. Int. Dev. 2012, 4, 518–527. [Google Scholar] [CrossRef] [Green Version]
  81. Kilic, T.; Moylan, H.; Koolwal, G. Getting the (Gender-Disaggregated) lay of the land: Impact of survey respondent selection on measuring land ownership and rights. World Dev. 2021, 146, 105545. [Google Scholar] [CrossRef]
  82. Özler, B.; Cuevas, P.F. Reducing Attrition in Phone Surveys; World Bank Blogs; The World Bank: Washington, DC, USA, 2019. [Google Scholar]
  83. De Beer, J. Ownership of Open Data: Governance Options for Agriculture and Nutrition. GODAN (Global Open Data for Agriculture and Nutrition). 2017. Available online: https://www.godan.info/sites/default/files/documents/Godan_Ownership_of_Open_Data_Publication_lowres.pdf (accessed on 23 August 2022).
  84. Young, A.; Verhulst, S. Aclímate Colombia: Open Data to Improve Agricultural Resiliency. Open Data’s Impact. 2017. Available online: http://odimpact.org/files/case-aclimate-colombia.pdf (accessed on 23 August 2022).
  85. Open Data Institute. How Can We Improve Agriculture, Food and Nutrition with Open Data; Working Paper; Open Data Institute: London, UK, 2015; Available online: https://theodi.org/article/improving-agriculture-and-nutrition-with-open-data (accessed on 23 August 2022).
86. Gabella, C.; Durinx, C.; Appel, R. Funding Knowledge Bases: Towards a Sustainable Funding Model for the UniProt Use Case. F1000Research 2017, 6 (ELIXIR), 2051. [Google Scholar]
  87. DLG. Digital Agriculture: Opportunities, Risks, Acceptance; A DLG Position Paper; DLG: Frankfurt, Germany, 2018. [Google Scholar]
88. UN. UN Common Guidance on Helping Build Resilient Societies; United Nations: Rome, Italy, 2020. [Google Scholar]
  89. Food and Agriculture Organization. Evaluation of the Information on Nutrition, Food Security and Resilience for Decision Making (INFORMED) Programme; FAO Programme Evaluation Series; FAO: Rome, Italy, 2021. [Google Scholar]
90. Rinawan, F.R.; Susanti, A.I.; Amelia, I.; Ardisasmita, M.; Widarti; Dewi, R.K.; Ferdian, D.; Gusdya, W.; Purbasari, A. Understanding mobile application development and implementation for monitoring Posyandu data in Indonesia: A 3-year hybrid action study to build “a bridge” from the community to the national scale. BMC Public Health 2021, 21, 1024. [Google Scholar]
91. Farm Radio International. The Listening Post; Farm Radio International: Ottawa, ON, Canada, 2020. [Google Scholar]
  92. Hoffecker, E.; Leith, K.; Wilson, K. The Lean Research Framework: Principles for Human-Centered Field Research; D-Lab: Cambridge, MA, USA, 2015. [Google Scholar]
  93. Leith, K.; McCreless, M. Lean Research Field Guide; MIT D-Lab: Cambridge, MA, USA, 2018. [Google Scholar]
  94. Green Climate Fund. Funding Proposal-Acumen Resilience Agriculture Fund (ARAF); Green Climate Fund: Incheon, Korea, 2018. [Google Scholar]
95. IDS. Exploring the Potential for Interactive Radio to Improve Accountability and Responsiveness to Small-Scale Farmers in Tanzania; Institute of Development Studies: Brighton, UK, 2016. [Google Scholar]
  96. ITU. Measuring the Information Society Report; ITU: Geneva, Switzerland, 2018. [Google Scholar]
  97. Keusch, F.; Bähr, S.; Haas, G.C.; Kreuter, F.; Trappmann, M. Coverage Error in Data Collection Combining Mobile Surveys with Passive Measurement Using Apps: Data from a German National Survey. Sociol. Methods. Res. 2020. [Google Scholar] [CrossRef] [Green Version]
  98. Kilic, T.; Moylan, H. Methodological Experiment on Measuring Asset Ownership from a Gender Perspective; World Bank Technical Report; The World Bank: Washington, DC, USA, 2016. [Google Scholar]
  99. GSMA. Connected Women: The Mobile Gender Gap Report 2020; GSMA Association: London, UK, 2020. [Google Scholar]
  100. Beegle, K.; Carletto, C.; Himelein, K. Reliability of recall in Agricultural data. J. Dev. Econ. 2012, 98, 34–41. [Google Scholar] [CrossRef] [Green Version]
101. Leo, B.; Morello, R. Practical Considerations with Using Mobile Phone Survey Incentives: Experiences in Ghana and Tanzania; Center for Global Development Working Paper; Center for Global Development: Washington, DC, USA, 2016. [Google Scholar]
102. National Research Council. 8 Potential Sources of Error: Nonresponse, Specification, and Measurement. In Estimating the Incidence of Rape and Sexual Assault; Kruttschnitt, C., Kalsbeek, W.D., House, C.C., Eds.; The National Academies Press: Washington, DC, USA, 2014. [Google Scholar]
103. Ballivian, A.; Azevedo, J.P.; Durbin, W. Using Mobile Phones for High-Frequency Data Collection. In Mobile Research Methods: Opportunities and Challenges of Mobile Research Methodologies; Ubiquity Press: London, UK, 2015; pp. 21–39. [Google Scholar]
104. Hughes, S.; Velyvis, K. Tips to Quickly Switch from Face-to-Face to Home-Based Telephone Interviewing; Mathematica (Blog); JPAL: Cambridge, MA, USA, 2020. [Google Scholar]
105. Eitzinger, A. Data Collection Smart and Simple: Evaluation and Meta-Analysis of Call Data from Studies Applying the 5Q Approach. Front. Sustain. Food Syst. 2021, 5, 727058. [Google Scholar] [CrossRef]
106. Kasy, M.; Sautmann, A. Adaptive Treatment Assignment in Experiments for Policy Choice. Econometrica 2021, 89, 113–132. [Google Scholar]
  107. Sabates-Wheeler, R.; Devereux, S.; Mitchell, T.; Tanner, T.; Davies, M.; Leavy, J. Rural Disaster Risk: Poverty Interface; Institute of Development Studies, University of Sussex: Brighton, UK, 2008. [Google Scholar]
  108. World Bank. Economic Effects of COVID-19. Rapid Surveys of Rural Households in India; IDinsight and Development Data Lab: Washington, DC, USA, 2021. [Google Scholar]
Figure 1. Waterfall versus Agile model. Source: USAID (2017) [17]. Reprinted/adapted from Ref. [17].
Figure 2. Average cost per completed survey (USD). Source: IPA (2020) [76]. Reprinted/adapted from Ref. [76]. a A portion of these estimates do not include fixed costs and underestimate total survey costs. b Automated SMS utilizes technology that can schedule messages rather than rely on human interviewers to send messages to participants. c Manual SMS utilizes human interviewers to send messages to participants. 1 Kenya (32); Lebanon (43); Madagascar (7); Malawi (7, 49); Mozambique (53); Nepal (55); Peru (3); Senegal (7); Sierra Leone (42); South Africa (18); Tanzania (7); Togo (7). 2 Afghanistan (40); Bangladesh (19); Ethiopia (40); Ghana (24, 31, 54); Mozambique (40); Peru (3); Sierra Leone (42); Uganda (19); Zimbabwe (40). 3 Liberia (17); Peru (3). 4 Nepal (55).
Table 1. Average cost per survey type (USD) *.

| Type | DHS | MICS | LSMS | Labor | Agricultural | Supp. |
| --- | --- | --- | --- | --- | --- | --- |
| Operations | 800,186 | 716,040 | 1,235,852 | 331,204 | 1,117,303 | 319,002 |
| Field | 805,027 | 340,985 | 495,427 | 133,128 | 431,135 | 125,974 |
| Total | 1,605,213 | 1,057,025 | 1,731,279 | 464,333 | 1,548,438 | 444,977 |

“Operations” consists of training, transport, personnel, and data processing. “Field” support refers to technical assistance, administration, and other costs. “Type” refers to DHS: Demographic and Health Survey; MICS: Multiple Indicator Cluster Survey; LSMS: Living Standards Measurement Study; “Supp.” refers to supplemental surveys to measure progress toward the SDGs. * Source: Data for Development: A Needs Assessment for SDG Monitoring and Statistical Capacity Development.
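Because Table 1 reports each “Total” as the sum of its “Operations” and “Field” components, the breakdown can be checked mechanically. The short Python sketch below is our own illustration (the variable names are not from the source); it recomputes the totals from the reported components, and two of them (Labor and Supp.) come out USD 1 lower than the printed totals, presumably because of rounding in the source report.

```python
# Illustrative check of Table 1 (our own sketch, not from the paper):
# "Total" should equal "Operations" + "Field" for each survey type.
survey_costs_usd = {
    # type:        (operations, field, reported_total)
    "DHS":          (800_186,   805_027, 1_605_213),
    "MICS":         (716_040,   340_985, 1_057_025),
    "LSMS":         (1_235_852, 495_427, 1_731_279),
    "Labor":        (331_204,   133_128,   464_333),
    "Agricultural": (1_117_303, 431_135, 1_548_438),
    "Supp.":        (319_002,   125_974,   444_977),
}

for survey_type, (operations, field, reported_total) in survey_costs_usd.items():
    computed_total = operations + field
    gap = reported_total - computed_total  # USD 0 or 1 (rounding in the source)
    print(f"{survey_type:<13} computed={computed_total:>10,}  reported={reported_total:>10,}  gap={gap}")
```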
Table 2. Modes of administering surveys and typical related attributes.

| Mode | Physical Set-Up | Hardware Requirements |
| --- | --- | --- |
| CAPI | Face-to-face operator | Mobile phone/computer |
| CATI | Live operator (call center) | Mobile phone |
| SMS | Automated and manual | Mobile phone |
| Mobile app | Automated | Smartphone or feature phone |
| IVR | Pre-recorded messages | Mobile phone |
| Sensors | Automated | Satellite |

Note: In the published table, three further attribute columns (Time Saving and Low Costs, High Frequency, Feedback Loops) are color coded per mode: Red = missing attribute; Green = attribute fully satisfied; Orange = attribute partially satisfied. The SMS row carries the additional qualifier “if manual”.
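The attribute columns of Table 2 (time saving/low costs, high frequency, feedback loops) are conveyed only by cell color in the published version. For readers who want to work with the comparison programmatically, the sketch below shows one possible machine-readable encoding; the class and field names and the three-level scale labels are our own assumptions, and the attribute values are deliberately left unset because they cannot be recovered from the text.

```python
# A possible machine-readable encoding of Table 2 (illustrative only; names are ours).
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Attribute(Enum):
    """Three-level scale used in the published table (green/orange/red cells)."""
    FULLY_SATISFIED = "green"
    PARTIALLY_SATISFIED = "orange"
    MISSING = "red"

@dataclass
class SurveyMode:
    name: str
    physical_setup: str
    hardware: str
    # Color coded in the source table; left as None here because the colors
    # are not recoverable from the text version.
    time_saving_low_cost: Optional[Attribute] = None
    high_frequency: Optional[Attribute] = None
    feedback_loops: Optional[Attribute] = None

MODES = [
    SurveyMode("CAPI", "Face-to-face operator", "Mobile phone/computer"),
    SurveyMode("CATI", "Live operator (call center)", "Mobile phone"),
    SurveyMode("SMS", "Automated and manual", "Mobile phone"),
    SurveyMode("Mobile app", "Automated", "Smartphone or feature phone"),
    SurveyMode("IVR", "Pre-recorded messages", "Mobile phone"),
    SurveyMode("Sensors", "Automated", "Satellite"),
]

if __name__ == "__main__":
    for mode in MODES:
        print(f"{mode.name:<11} setup={mode.physical_setup!r:<32} hardware={mode.hardware!r}")
```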
Table 3. Typical positive attributes of Agile and Non-Agile Data in international development.

| # | Agile | Non-Agile | Reference to the Literature |
| --- | --- | --- | --- |
| 1 | Diverse technology modes | Surveys administered face-to-face or by telephone = Higher costs | [3,4,12,33,73,77,78] |
| 2 | Short duration, and high frequency | Long duration of data gathering and processing = Less actionable knowledge. Low frequency = Measurement errors, non-timely information and greater attrition rates | [1,5,6,17,30,35,40,45,66,79,80,81,82] |
| 3 | Agile design and monitoring based on rapid feedback loops and adaptive behavior | Waterfall or linear management model is more static and less interactive, thus reducing flexibility and rapid learning or decision-making | [12,15,17,20] |
| 4 | Open data principles | Closed data ecosystem = Limited exchange of data to farmers and between the public and private sector | [74,83,84,85,86] |
| 5 | Farmer-centric approach to Data Democracy | Limited ongoing farmer engagement | [20,24,43,48,87,88] |
| 6 | Interoperability | Limited integration between different data types and sources. Non-standardized metrics that challenge verification and can limit the topics and levels of analysis set in the beginning | [1,40,71,89] |
Table 4. Semi-agile approaches to data collection in agriculture.

| Organization | Diverse Tech Modes | High Frequency | Short Duration | Agile Design and Monitoring | Farmer Centric | Open Data Principles |
| --- | --- | --- | --- | --- | --- | --- |
| World Bank | CATI | RRPS: Multiple rounds; LSMS-ISA: monthly | RRPS: 20 m; LSMS-ISA: 20 m; SWIFT: 7–10 m | IBM; LSMS-ISA | | RRPS: Interactive country dashboard |
| USAID | IVR, mobile app | RTD4AM: monthly or weekly | RTD4AM: rapid | RTD4AM; MERL | RTD4AM | RTD4AM |
| World Food Program | SMS, CATI, IVR, chatbot, Facebook | mVAM | mVAM: rapid | mVAM | mVAM dashboard | mVAM |
| CGIAR | IVR | 5Q: Daily | 5Q: 15 min | | | 5Q: Dashboard |
| Acumen | CATI | | | | | |

Note: Red = missing attribute; Green = attribute fully satisfied; Orange = attribute partially satisfied. Cells left blank here are conveyed only by cell color in the published table.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
