At times when the future seems too uncertain to prepare for, or when things look too bleak, it is important for human rights practitioners to remember that some of our peers have likely already lived through similarly complex situations. Some of our colleagues around the world are positioned in places that allow them to see the waves coming before they arrive at our location. As such, learning from colleagues' past experiences can inform how we approach and prepare for the present and the future.
In this short section, we interview Grace Mutung’u (@Bomu), a lawyer with a background in information systems. We hope that in reading her reflections you gain some insights into what the near future might bring.
Q: How did you end up working on digital rights?
After high school, in the period before we typically attend university in Kenya, I was sent to Strathmore College to study information systems management. This was just after the Y2K craze and everyone was very curious about technology. When I joined the University of Nairobi law school, law seemed completely abstract. Since Kenya is part of the common law system, we end up reading a lot about what some Lord said in the 1800s, and it was quite difficult to relate to. I recall making the link between legal rulings and programming quite early on: the "if this, then that" logic pervasive in computer programming seemed very much like the types of tests judges were using to determine how a case should be decided. So for me, the association between the legal and information realms was built quite early on... Then came the public interest component. During my time at law school, Kenya went through a big debate about whether the Constitution should be reformed. At the university, we were all involved in some way or another through student movements. Part of the debate concerned how a constitutional reform would incorporate human rights. That triggered my interest. With my background in information systems I could see that, for example, much of the debate around rights such as freedom of expression would define how people would interact for decades to come with technology like the mobile phones and messaging systems that were becoming a thing at the time.
Q: What surprised you the most about the way technology is reshaping the way people relate to governments?
At the beginning, in the early 2000s, being in this space involved selling the case to governments for why they should adopt tech. That was also around the time we had the structural adjustment program, and our giant telco was being split up. There was a lot of discussion about how to get more companies operating within Kenya. There was a narrative about the government being behind the curve in terms of tech adoption. So, in practical terms, our work involved pushing governments to adopt email, for example. Little did we know that tech was a small piece in a massive infrastructure of control governments already had in place. You add some of these tech components and it creates a whole new relationship with people. So the story changed from adoption to let's also have some checks and balances. People could now see that governments already had a lot of power behind them, and that when coupled with some of the new technologies they could exercise such power in new and problematic ways. Now we have a mixed conversation. One part is access: making tech accessible to everyone. Governments have adopted tech quite a lot, especially in a country like Kenya where there are so many competing needs. Now the government often seems to be saying its services are unavailable to those who don't use certain technology. It's such a full circle when you compare it with the early 2000s.
To many of us, it now seems technology can be, and very much is being, used to maintain the unequal relationship between government and people. What might save us is not so much activism for digital rights, but what we had in that constitutional movement, which was not a tech movement but a movement fighting for a multi-party state and a more plural society; one with a broader civic space and more public debate.
Q: Tell us a bit about your work around digital identity. How did you get involved with this topic?
I had the fortune of being an observer of two elections, called in specifically to look at the use of tech. This drew my attention to how political actors were leveraging tech more and more. At first it involved sending SMS messages to potential voters ahead of the election. Then resources started being allocated to collecting data in the periods between elections. Kenyans are typically quite vocal about politics, and public representatives started getting more interested in gathering data from social media. This made me curious about where this might be going.
Q: What would you say has changed since those days?
The government now requires you to get a digital ID with biometrics and all. This includes fingerprints, an iris scan, a facial photo, and even an ear-lobe scan (which is a curious component given that many tribes elongate their lobes). The assumption is that these features are unique to a person, allowing the system to record the relationship between a person and the government from birth to death. When you add biometrics to the fact that people carry phones that constantly track their location, that many people use mobile payments, and that we have to register our SIM cards, it means a massive amount of very granular data about individuals is being collected constantly, and that social relationships are set to change radically because of this.
What is more, Covid accelerated the process of roll-out and adoption. To get a vaccine you need a digital ID. Of course there are advantages to this type of system, but it makes you wonder about the cost, and about how these things were managed in the past, before the intensive use of these tracking techniques.
There seems to be a drive to adopt these systems without a proper assessment of needs. It often seems that because these systems exist, it somehow follows that they should be adopted. The conversation about how such needs were conceptualized happened in a separate forum, inaccessible and secret, and now it's just about adopting.
Q: What can we learn from the past?
Looking into the past, it becomes evident that, as in other battles, there are issues that get silenced. People working on ecosystems and environmental rights are concerned with the energy required for this whole system to run: all the steps that are needed prior to any digital interaction, and then all the places where subsequent interactions get recorded, copied, and replicated. It seems like an immense amount of energy. People raising these questions are being silenced, as are the Africanists who question the coloniality of this process. And they are being silenced by the same digital means. The decoloniality movement is asking why we have to be consumers in every era; why we never get a seat at the table. There is the risk of not getting to shape the tech and the system it creates, but also the harm of not getting a fair share of the benefits of the system currently in place. We have so many young people who have a stake in the future, but they seem to be afforded only a place at the margins of that future.
Q: In what ways do you think the process of datafication is currently affecting human rights practitioners? What are some ways in which this process might evolve in the next ten years?
One particular concern is that human rights practice has not elevated itself to a par with the processes of datafication. We are on the outskirts of the debate. It's also a cause for fatigue, because there is no single space for these debates. Even at the UN we are outsiders. It seems like in the global south we get access to a somewhat shortened version of human rights: what others have somehow defined as core human rights is what we get. Social and economic rights, for example, are presented as something that will be discussed later on. I have not seen a holistic approach to human rights in this sense. Not even at the UN, and even after the adoption of the Guiding Principles on Business and Human Rights.
I want to be optimistic and think that with all the mainstream exposure that issues such as inclusion in the tech sector are getting, we might move forward. Another hope comes from the fight of the generation of so-called millennials. They seem to approach life in a more conscious way. They seem to make certain connections between the environment and the lifestyles they lead, which makes me feel there might be a glimmer of hope. But it also depends on the human rights movement keeping up the pressure so things keep changing.
Q: What do you think the work of a human rights practitioner will look like in 2030?
Honestly, I think it will be a lot of the same issues, with a tweak that incorporates some tech aspects. I think labor rights are and will continue to be a big issue, especially given the way capitalism continues to operate. Perhaps the set of key actors will have changed or expanded, but at the end of the day, regardless of whether you give your labor as a gig worker or in a field, the way labor relationships are shaped will remain a big issue.
I do think we will see people demanding that traditional institutions like the World Bank be more accountable. A lot of the reform processes in Africa, and those involving tech in particular, are being pushed by institutions like the World Bank. And people are increasingly aware that if such institutions impact their lives, they should be accountable to them. I also feel that the concept of a human rights practitioner might become more circumstantial. We increasingly see people who play a huge role in enacting change by relying on information and communication technologies (ICTs) to expose human rights violations, even though they are not specifically trained as human rights practitioners.
Q: It’s Wednesday, November 30th, 2030. You're sending an email discussing…?
I fear that by 2030 everything, even our minds, will be read, so you might be connected in ways that will not require emailing (laughs). I mean, I compare it to the time when I had my first phone, which required an external keyboard. Now you touch whatever you want on the screen, and more of your senses are embedded in the process of engaging with this communication.
But what will be discussed…? (She pauses to think.) I hope it's data trusts. Given that it's November 2030, perhaps it's about the reporting moment. I hope we will be discussing how our data has been used over the course of the year.
This post is a revised extract from an upcoming report by JustLabs and OpenGlobalRights on the impact that the process of datafication has had on public and intimate spaces over the past decades, and on how the risks and opportunities that played out in the past can inform our foresight into how this process might affect our rights and interests in the near future. Here are parts I and II. You can access the full report here.