SwitchPoint 2018 was an inspiring, passionate, and delightful indulgence: Spending two days in lush Saxapahaw, North Carolina, thinking about big ideas and learning from leading experts in small group sessions in between performance art and live music shows felt like such a departure from Conference As Usual.
I felt space opening up in my brain to really process what the speakers were saying: the genuine act of listening, rather than the usual ruse of secretly answering emails on my phone while half paying attention and nodding at key pauses.
One of the most compelling speakers was Nathaniel Raymond, the Director of the Signal Program on Human Security and Technology at the Harvard Humanitarian Initiative. His talk, "Human Freedom in a Post-Normal World," was a timely warning about the abuse of data and the importance of strong standards and ethics when implementing digital and data-driven programming.
We must be especially mindful of this in development and humanitarian response, as the populations we serve are particularly vulnerable in terms of their marginalization, general lack of protections, and dearth of critical digital literacy (though one could well argue that most global populations are woefully in need of critical digital literacy education, including here in the United States).
Given the news of the day, this talk drew a rapt audience and a packed room at Nathaniel's breakout session on combating fake news.
A conversation about data ethics is not new to international development—Linda Raftree has been one of the earliest and most persistent voices on informed consent and data responsibility. But it all feels a little more urgent now.
Nathaniel's talk gave me real pause: Is it ethical to encourage projects to use Facebook in support of program activities? Are we endangering the people we are trying to help by encouraging mobile data collection and the use of digital technologies to track impact? What harm are we inadvertently causing as we try to do good?
Thankfully, nothing is that black and white. We can't keep folks from using Facebook—if that's where they are. Meeting users where they are is a key tenet of human-centered design, the Principles for Digital Development, and DAI's Digital Insights-based approach.
Digital data collection has such extraordinary upsides in terms of efficiency, cost, and impact that we can’t just toss the baby out with the bathwater. That being said, it is crucial that DAI and other digital development advocates think through a framework for data responsibility 2.0, one that is applicable and comprehensive in the era of social media, smartphones, machine learning, and big data.
Such a framework would ideally enable a go/no-go decision about a technology deployment, based on a set of rules and on answers to key questions about what we know, don't know, can verify, and can't verify about that technology, thus placing a crucial point of reflection and analysis as a final gate before development and rollout.
In our enthusiasm for ICT4D, let's not forget that at the end of the data is people. Protecting and advocating for people is our most important job.
This post originally appeared on Digital @ DAI.