Underserved or Oversurveilled: Can We Strike a Balance When Regulating Tech?

Chenai Chair interviews Neema Iyer about the importance of intersectionality in data governance decisions.

February 1, 2021
REUTERS/Thomas Mukoya

The power of digital technologies is something of a mixed blessing. Some see it as the silver bullet we can fire at our most deadly challenges; artificial intelligence (AI) and machine learning, for example, have been used in the response to the COVID-19 pandemic. Others see it as a curse; these technologies can also facilitate violence or the wildfire spread of disinformation. This power does not exist in a vacuum; it is located within structures determined by social, political and economic contexts. One only needs to look at where big tech companies are headquartered to guess the dynamics that might shape their policies or operations. Individuals also navigate technology at the intersection of their racial and gender identities, class, geographic location and education (to name a few variables).

The conversation on the African continent with regard to digital technology rightly focuses on connectivity and closing the digital gender gap. However, there is increasing engagement with data-based technologies and, in particular, technologies that rely on AI and machine learning. I mapped AI development in South Africa as part of a recent research project titled My Data Rights (an outcome of my Mozilla fellowship). As the research illustrates, AI technologies are on the rise, but the accompanying policy and legal frameworks — which would ensure that necessary safeguards are in place — still need to be developed. Without these regulations, technology-facilitated harm is a significant concern and online safety is at risk; consider, for example, intimate images of women created digitally without their consent.

Neema Iyer, the executive director of Pollicy — a technology consulting firm that works to improve government service delivery through improved civic engagement — is an expert on data governance and online safety. In this interview, I asked Neema about the understanding of data in the African context; the intersection of gender, data and technology; and what we can do to ensure safety online amid a global surge in technology development. This is a lightly edited version of our conversation.


How do you define data in the context of emerging technologies?

I would say that data is any kind of distinct piece of information, which can be stored as values of quantitative or qualitative variables and which can be read, processed or understood by a human or a machine or both. With emerging technologies, data now extends to our images, our speech, our daily patterns and so on.

In your work, when it comes to data use in new technologies — and its governance — what opportunities and areas of concern do you see?

The opportunities are endless. I think that the advancements in health care especially will be very interesting to watch. This is an area dear to my heart and has definitely been put in context, thanks to the ongoing pandemic. How can data be used to increase efficiencies in health care, from preventative care to diagnosis, treatment and recovery, in parts of the world where, for example, getting bitten by a dog could be an instant death sentence by rabies? There are massive opportunities for improving agriculture and supply chains. We already see significant advancements in how we move money around and so on. However, the other side of the coin is the area for concern. Who has access to and owns this data? How will they capitalize on this information? Will it become a new currency for control and subjugation? And, overall, will we be able to keep any parts of our lives private?

What do you think are some of the intersectional issues between data, technology and gender?

As in most contexts, technology and gender are “co-produced,” each one shaping the other. These constructions are determined by who designs the technology, who decides what purpose it will serve, who develops it and, ultimately, who uses it. For example, women have always been a target of surveillance, by their families, by the larger society, and now there are more sophisticated means to continue to monitor the lives of women, to attack them for not conforming to patriarchal structures, to silence them when they enter public spheres such as politics or media.


On the one hand, technology and the internet, most notably, have opened doors for women across all fields. It’s mind-blowing to think that, for example, a student studying at Makerere [University, in Kampala, Uganda] could tweet for advice from an expert in a university across the world. On the other hand, there are voices that are against this progress, that aim to shut the door on the progress that women are making on all fronts. As a result, you have significant violence against women in online spaces. You have discrimination that is embedded into algorithms. You have fewer women involved in the development of technology, due to hostile working environments. When you begin to think of the intersection of class, race, sexual preference and so on, it only gets more complicated.

Your study Alternate Realities, Alternate Internet uses feminist methodology to understand the issues of online gender-based violence. How did this approach shape your understanding of internet safety issues and dealing with data?

One of the key things that I took away from using feminist methodologies was to centre our respondents, the women we interviewed from the five different countries, as experts in their own experiences and to really focus on the root causes of systemic inequality and discrimination. One-third of all respondents said that they had experienced some form of online violence or harassment, and a large proportion of those women responded by deleting or deactivating their accounts, which we found to be a very worrying trend. If women do not fundamentally feel safe in online spaces, then we need to reconsider the entire internet. We need to pause and reassess everything.

Who do we need to have involved in the process of developing solutions for a safer internet and what are the challenges involved in their participation?

We need to include end-users from all walks of life and solicit feedback by understanding their experiences on the internet. This very process of feminist research can contribute to generating knowledge and collective action that will enable users to socially shape the future of data governance and internet use.

For the longest time, experts from the Philippines to Myanmar to Ethiopia sounded the alarm about the impact of unfettered social media on democracy, but it wasn’t until these issues imploded in the United States that people started to wake up — as in really wake up. Many technology companies that develop tools and technologies used by women do not take gendered needs and approaches into account. Community guidelines governing issues of safety often do not capture the nuances of online violence and lack the context of countries and regions in the Global South. The mechanisms for muting, blocking, reporting and seeking redress are often so convoluted that perhaps it is easier to just delete your online presence. It’s time to look at the power dynamics and systems of discrimination baked into what feedback global platforms actually decide to pay attention to.

What roles and responsibilities should the technical community take on in response to the opportunities and challenges that come with increased data use in technology?

I’d say that there are two main approaches. Firstly, look at things from different perspectives. There’s always a rush to solve problems with technology, whether it be sexy apps, platforms or portals. But these don’t address the root causes of the problems.

Secondly, the technical community needs to focus on decentralizing technology platforms. I remember just a few years ago when people would say with pride that they were working on the next Facebook, but that zest to create has dissipated, and social media is such a mess that people are wary of claiming they want to create another beast. But, all in all, we need a broader landscape of digital infrastructure. We can’t rely on a few technology giants to rule the world. The monopolies of today are impossible to compete with. They will either replicate your idea, buy you up or simply destroy you in one way or another.

Going back to the feminist principles of the internet, how do we focus on alternate economies that are not based on capitalist greed, but also on how to consistently use open-source software with care as a core value? Just earlier this month (January 2021), we saw people jump ship from WhatsApp to Signal, which is free, open-source and run by a non-profit organization. Will we see more of these alternatives pop up? I sure hope so.

Overall, in thinking about data use and governance, what are other issues that we need to be thinking of in the future that would work toward ensuring a safer internet?

A big one is how data use and governance impact systems of government. As we speak, Uganda is awakening from a five-day complete internet shutdown during the 2021 general elections. How do we ensure a safer internet when we can’t ensure the internet at all?

More worryingly, today, data is used to surveil and stifle opposition movements, to repress dissidents and to restrict freedom of movement. When there are no data protection laws in place, or any bodies that can implement such laws in an unbiased way, it impacts everything in your life. Not to sound dystopian and 1984-ish, but that’s the truth about the situation in many countries. And yet, technology companies are in a race to supply such software to dictators throughout the world, with nothing to stop them from doing so: biometric IDs, smart cities with built-in facial recognition, mandatory fitness trackers. It’s a constant battle to avoid the datafication of every aspect of life.

So, either governments turn off the internet and you’re left to stumble in the dark, or you’re connected but oversurveilled, with no laws to protect you. Sounds like a lose-lose to me.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Authors

Chenai Chair is the research manager focused on gender and digital rights at the World Wide Web Foundation. 

Neema Iyer is the founder of Pollicy, a civic technology company based in Kampala, Uganda.