Olena Boytsun, founder of the Tech & Society Communication Group, talks to Nani Jansen Reventlow, an award-winning human rights lawyer specialised in strategic litigation at the intersection of human rights, social justice and technology, about the concept of digital rights, regulations within the field and vigilant individuals.

— Nani, thank you for taking the time to talk today about such an important subject as digital rights. You are one of the professionals who have been building the field over the years, having founded and developed the Digital Freedom Fund. What definition of digital rights do you consider the most accurate nowadays?

— The traditional, old-fashioned way of looking at digital rights is to focus on privacy, data protection and freedom of expression online. The way that I, as well as we at the Digital Freedom Fund, have looked at them has a slightly broader definition. We've always looked at digital rights as all human rights engaged in the digital context: not just civil and political rights, but also economic, social and cultural rights. In a strange way, the pandemic has been helpful in explaining the definition a little bit more. I think we all noticed how much we had to engage with technology to keep so many areas of our lives running. That illustrates how we engage with technology in some form in almost every aspect of our lives: at school, at work, in our social interactions with each other, in our banking and healthcare systems… basically everything. So, in the broader sense of the word, digital rights are all human rights, and whenever technology is involved, we are essentially talking about digital rights.
— It is quite common to think that there is a whole spectrum of human rights, and within this field there is a sector of digital rights. Do you mean that it makes more sense to treat digital rights as basic human rights that are simply “touched” by or “delivered” through technology?

— It makes sense to look at it in a more holistic way. In the same way, it is sometimes helpful to emphasize the context in which rights are being engaged. There is the same set of human rights for everyone, but it can be practical to focus on the human rights of children or women, and then we talk about children's and women's rights. It's about highlighting particular, specific needs. I don't only mean protection needs, but also what people need to be able to enjoy and properly fulfill their human rights. Technology presents very specific challenges and potential harms when it comes to our human rights. So it's good to be aware of the fact that, when you are deploying technology, you have to look at the full scale of human rights and how they might be affected by it.
— Does that mean that technology is only bad for rights?

— As someone famous said: “Technology is neither good nor bad”.
(Note: it was Melvin Kranzberg, known for his laws of technology.) Technology itself isn't the issue. It's about how we, humans, engage with it: where and how we choose to use it, and what kind of technology we create and want to use. Imagine any tool that you have in your kitchen, for example. You can use it for good or bad. You could make a delicious meal with it, or you could accidentally hurt yourself, another person, or your furniture. The same objects or pieces of technology can lead to really good things and to really bad things. So it's about thinking through what we understand about the context in which we're deploying something, and what we expect from how it works.
We have safety measures for all the different things that we use in daily life. If we need transportation to get somewhere, there are risks entailed, and we try to mitigate them. In a similar way, we have to think about technology. How do we make something that is safe, that doesn't create harm for people and perhaps creates benefits for our society, and how do we mitigate the risks when it doesn't? So it's not the tech in and of itself; it's the whole context, it's us and the choices that we make.
— Are there any specific groups that are the most vulnerable to digital rights violations?

— In the end, anyone could be affected. But I do think that those who are marginalized by our societies run the most immediate risk of harm, in particular because of the two following points.
The first is the fact that the kind of technology we build and the way we deploy it reflect the power structures that we have in our societies. Those who have power are often also the most privileged. They are the ones making the policies and setting the rules, and they are focused on their own interests. There is often not enough attention for people who are differently placed in our societies, for groups marginalized for all sorts of different reasons: ability, gender, religion, economic status and so on. We are not focusing enough on making sure that any harms inflicted on those groups are sufficiently mitigated, because they are not part of the leadership's field of vision.
The other thing is that quite often those communities and groups are the testing ground for a lot of the technology we want to roll out. I'm not the one who framed that: Philip Alston, the international law scholar and human rights practitioner, looked a lot at the impact of technology on the human rights of the poor and said that people who are economically disadvantaged are often the testing ground for new technology. You see it in other contexts as well, for example, with Europe using all sorts of really invasive technology to identify people, and so on. That is all coming for all of us. It's there, it's being tested, it's being refined, and it's going to arrive at all of our doorsteps really soon. This is why we have to keep in mind that what affects the most marginalized groups in our communities will directly affect all of us. It's just at an early stage. It's not here yet, but it's coming.
— Could you give an example of a digital rights violation that would be clear and understandable to anyone?

— My point is that it all affects all of us. We can take different things. For example, facial recognition technology works best on white male faces, meaning that it doesn't work well on the rest of our society. Based on that technology, assumptions are made as to whether or not someone has committed a crime, or whether or not someone was in a certain area at a certain time. It can also relate to innocent things, like using facial recognition software to open a door or to access certain software. We saw it a lot during the pandemic, when software was used to check whether or not students might be cheating during the exams they had to take from home. So, yes, that affects all of us, except perhaps a small group of people: the white, able-bodied, cisgender developers. The “standard user”, who is not standard at all.
I can give one more example: predictive software that has been incorporated into law enforcement, such as predictive policing, but that you also see more and more in the private sphere. Whether or not you can get a bank loan increasingly depends on all sorts of factors that you do not know in advance. There was an interesting case recently that nicely illustrates how this sort of thing harms all of us. A Finnish-speaking man in a Nordic country, living in a rural area, wanted to get a loan to do something with his house. His application was rejected because the credit scoring system rated Swedish speakers, a language minority in that country, higher than Finnish speakers, and rated women higher than men. He was a Finnish-speaking man, so he was downgraded on both counts. Had he been, say, a Swedish-speaking woman asking for the same loan, he would have gotten it. So there are all these ways in which the systems at play are not transparent to us, yet they make impactful decisions about our lives. So, yes, it affects us all, both in the public and the private sphere.
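(Note: the decision logic of such a scoring system can be sketched in a few lines of code. The Python snippet below is a hypothetical illustration, not the actual system from the case; the point values and threshold are invented for the example. It only shows how group-based points, awarded from aggregate statistics rather than an applicant's own record, can flip a loan decision between two otherwise identical people.)

```python
# Hypothetical point-based credit scoring, loosely modelled on the case above.
# All weights are invented for illustration; real systems are opaque to applicants.

APPROVAL_THRESHOLD = 110

# Points awarded per group membership, derived from aggregate statistics
# about groups, not from the individual applicant's repayment history.
POINTS = {
    "gender": {"female": 60, "male": 40},
    "language": {"swedish": 60, "finnish": 40},
}

def score(gender: str, language: str, base_points: int = 20) -> int:
    """Sum group-based points on top of a base score."""
    return base_points + POINTS["gender"][gender] + POINTS["language"][language]

def decide(gender: str, language: str) -> str:
    """Approve if the summed points clear the threshold."""
    return "approved" if score(gender, language) >= APPROVAL_THRESHOLD else "rejected"

# Two applicants with identical finances, differing only in group membership:
print(decide("male", "finnish"))    # -> rejected  (20 + 40 + 40 = 100 < 110)
print(decide("female", "swedish"))  # -> approved  (20 + 60 + 60 = 140)
print(decide("male", "swedish"))    # -> approved  (20 + 40 + 60 = 120)
```

The applicant never sees the weights or the threshold, which is exactly the transparency problem described above.)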
— Are there any global regulations in the digital rights sphere?

— Global regulation – no, it doesn't exist. But I would like to emphasize that we do have the human rights framework to regulate these things.