The Center for Data Innovation spoke with Ewan Oglethorpe, executive director of Data Friendly Space, a U.S.-based NGO that works across six continents to connect modern data systems with the humanitarian community. Oglethorpe discussed how effective data management can strengthen humanitarian response during a crisis.
Gillian Diebold: What is Data Friendly Space (DFS) and what was the catalyst for its founding?
Ewan Oglethorpe: At the core of DFS and its founding is the drive to bridge the gap between the level of technology in the international humanitarian community and that of the private technology sector. In 2015, I was working in the Bay Area as a data engineer for the personal styling company Stitch Fix when the Himalayan country of Nepal suffered several devastating earthquakes. At the time, my parents were living there and I traveled over on an altruistic whim to help out however I could. I left with a set of work gloves expecting to move rubble or something similar but ended up doing data management for the United Nations and other responding organizations. I was quite taken aback by the digital systems used by the humanitarian community, in particular by its overreliance on Excel and Google Sheets. I felt I could make a difference by developing more modern data solutions for responders to use, and this has been my guiding light ever since.
This drive led to the creation of DFS three years later. What started as a seedling has grown into a humanitarian force for good, with over 85 staff working in more than 10 countries as of 2021. We are a team ranging from the obligatory data nerds to designers, all with a common cause: to create tools and systems that assist the world's most vulnerable.
Diebold: How does DFS serve the humanitarian community?
Oglethorpe: We ultimately serve a global community by reducing the amount of time spent on managing data so that organizations can focus on their critical missions. DFS does this in three main ways. First, we make data systems that are welcoming for humans to use. We build custom software with a focus on modern design and interoperability between different applications and organizations. There is an unfortunate pattern in the humanitarian community where work is sometimes done in silos and data systems do not communicate with one another. By bridging this divide, we help humanitarian organizations save both time and money.
Second, we aim to tame the flood of qualitative data. A staggering amount of qualitative information (think news articles, reports, social media posts) is generated in humanitarian crises. This content contains rich information that can be vital in informing humanitarian response operations. However, the volume is simply too great to be used effectively. DFS provides teams of analysts working across English, French, Spanish, and Arabic who swim through these tides to produce actionable insights using our DEEP tool.
Lastly, we let the robots do the work when suitable. Many analysis and data-gathering processes in the humanitarian community are easily automatable, but the effort simply hasn't been put in yet. We focus primarily on developing natural language processing solutions that handle the more routine tasks in DEEP, so our analysts can use their minds for the human problems that an AI system can't yet solve. We also develop systems for semi-automated "human in the loop" information retrieval and extraction.
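To make the "human in the loop" pattern concrete, here is a minimal, hypothetical Python sketch: a simple classifier proposes a tag for each text excerpt, and anything below a confidence threshold is routed to a human analyst rather than auto-applied. The excerpts, tags, and threshold are illustrative assumptions, not DEEP's actual taxonomy or pipeline.

```python
# Hypothetical "human in the loop" triage: auto-apply confident tags,
# queue uncertain ones for analyst review. Illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny stand-in training set: excerpts already tagged by analysts.
excerpts = [
    "Flooding has displaced 2,000 families in the district",
    "Cholera cases are rising in the camp's northern sector",
    "Road access to the affected villages remains blocked",
    "Clinics report shortages of oral rehydration salts",
]
tags = ["displacement", "health", "logistics", "health"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(excerpts, tags)

CONFIDENCE_THRESHOLD = 0.6  # below this, a human decides

def triage(new_excerpts):
    """Auto-tag confident predictions; queue the rest for review."""
    probabilities = model.predict_proba(new_excerpts)
    for text, row in zip(new_excerpts, probabilities):
        best = row.argmax()
        tag, confidence = model.classes_[best], row[best]
        if confidence >= CONFIDENCE_THRESHOLD:
            print(f"AUTO   [{tag} {confidence:.2f}] {text}")
        else:
            print(f"REVIEW [{tag}? {confidence:.2f}] {text}")

triage(["Measles outbreak reported near the river crossing"])
```

The design point is the threshold: the machine only removes work it is confident about, and everything ambiguous still reaches a human.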
Diebold: Tell us more about DEEP. What are the technologies behind this tool?
Oglethorpe: The best way to think about DEEP is as a tool that lets the humanitarian community build a well-cataloged, easily accessible library for all of its qualitative information. Prior to DEEP, humanitarian analysts had to laboriously juggle spreadsheets (and maybe post-its) to sort through large amounts of qualitative data. While this worked to a certain degree, it fell quite short for collaborative tasks or when data needed to be stored and reused at a later point (the humanitarian community boasts a large hodgepodge of orphaned Dropboxes and Google Drives).
Powering DEEP is a standard web application stack written in Python and JavaScript. Our key technologies include Django, GraphQL, ReactJS, and D3.js. We host DEEP, and all of our applications, in AWS environments and use Docker for containers. DEEP is an open source project (we love FOSS!) and the code is available on GitHub, with contributors welcome! We're always mindful of security best practices, since personal information related to beneficiaries can sometimes be uploaded to the platform.
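For a flavor of what a Python/GraphQL backend in that style looks like, here is a minimal sketch using graphene, the library commonly paired with Django for GraphQL. The "lead" type and its fields are hypothetical stand-ins, not DEEP's real schema; in a real deployment the resolver would query the Django ORM rather than return static data.

```python
# A minimal, hypothetical GraphQL endpoint in the Django/graphene style.
import graphene

class LeadType(graphene.ObjectType):
    """A hypothetical 'lead': one source document in the library."""
    title = graphene.String()
    source_url = graphene.String()

class Query(graphene.ObjectType):
    leads = graphene.List(LeadType, search=graphene.String())

    def resolve_leads(root, info, search=None):
        # Static data keeps the sketch self-contained and runnable;
        # a real resolver would hit the database instead.
        data = [
            LeadType(title="Situation report 14",
                     source_url="https://example.org/sitrep14"),
            LeadType(title="Needs assessment, April",
                     source_url="https://example.org/needs-apr"),
        ]
        if search:
            data = [d for d in data if search.lower() in d.title.lower()]
        return data

schema = graphene.Schema(query=Query)
result = schema.execute('{ leads(search: "report") { title sourceUrl } }')
print(result.data)
```

One query language over a shared schema is what lets separate applications interoperate instead of each growing its own ad hoc export format.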
Diebold: How is DEEP aiding responses to the COVID-19 pandemic?
Oglethorpe: Humanitarian crisis response is often nothing short of a scramble against chaos. Conditions can change in an instant, and for sudden-onset disasters in particular, information that was valid yesterday may no longer be relevant today. Adding COVID to the mix makes matters all the more complex, as much of the face-to-face information gathering and communication has been limited or has stopped altogether.
DEEP helps mitigate these additional complexities by allowing for the systematic extraction of information from secondary data sources. In addition, data stored in static formats like PDFs can now be extracted and housed in an indexed, centralized location, which makes legacy data more actionable. Our work with DEEP to support COVID response has ranged from Colombia to Bangladesh, where we've provided analysis support in addition to developing new features for the platform. This work has allowed humanitarian responders to gain better contextual understanding and use data for informed decision-making.
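As an illustration of that extract-and-index step, here is a short Python sketch that pulls text out of PDFs with pdfminer.six and makes it searchable with SQLite's built-in full-text index. The file names and search term are hypothetical, and DEEP's actual pipeline is considerably more involved than this.

```python
# Hypothetical extract-and-index step: flatten PDFs to text, then
# store them in a full-text-searchable SQLite table.
import sqlite3
from pdfminer.high_level import extract_text

db = sqlite3.connect("reports.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS docs USING fts5(path, body)")

for path in ["sitrep_2020_04.pdf", "assessment_bogota.pdf"]:  # placeholder files
    body = extract_text(path)  # extract the PDF's text content
    db.execute("INSERT INTO docs (path, body) VALUES (?, ?)", (path, body))
db.commit()

# Legacy reports become queryable instead of sitting in static files.
for (path,) in db.execute("SELECT path FROM docs WHERE docs MATCH 'cholera'"):
    print(path)
```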
Diebold: In terms of digital capacity, where does the global aid community need the most support?
Oglethorpe: This question is hard to answer, as there is no shortage of areas where the aid community needs support. If I had to choose the main one, however, I would say it is digital literacy and capability at the field level, meaning the countries or places where humanitarian responses are occurring. There are numerous initiatives to create aggregated global data systems; however, if the data coming from the field is of low quality, then the merged whole remains substandard. I believe the solution to this problem lies in investing in issues ranging from education to political governance, along with a commitment to open data.
At a virtual level, a large number of organizations host their data in ways that are not machine-readable, that is, not exposed through an API or RSS feed, making it difficult to systematically access this information. The problem is not the existence of the data, but rather its accessibility. I believe more emphasis needs to be put on helping organizations with limited IT capacity or budget to improve their data infrastructure and make their data more readily usable.
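To show what machine-readable publishing buys in practice, here is a short Python sketch that consumes an RSS feed with the feedparser library; the feed URL is a placeholder, not a real humanitarian data source.

```python
# Systematic access via RSS: each update arrives as structured fields
# rather than a web page to be scraped. Placeholder URL below.
import feedparser

feed = feedparser.parse("https://example.org/updates.rss")  # hypothetical feed
for entry in feed.entries:
    print(entry.title, entry.link)
```

A few lines like these are all it takes to fold an organization's updates into a larger data pipeline, which is exactly what non-machine-readable publishing prevents.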