“Netzpolitik. A Feminist Introduction”: A conversation with author Francesca Schmidt about queer feminist perspectives on the governance of hate speech in Germany

Interview by Todd Sekuler

Francesca Schmidt is a founding member and board member of Netzforma* e.V. – Verein für feministische Netzpolitik, an association for feminist Netzpolitik. She is also Programme Director for intersectional knowledge related to memory and transformation at the Federal Agency for Civic Education in Germany, and a former Senior Programme Officer at the Gunda Werner Institute for Feminism and Gender Democracy. Several months ago, CrimScapes team member Todd Sekuler, who researches the criminalisation of online hate speech, spoke with her about her 2020 book “Netzpolitik. Eine feministische Einführung” (published by Barbara Budrich). Here is an edited transcript of their conversation (translated from German into English):

Thanks so much for talking with me today about your ground-breaking book, Netzpolitik. A Feminist Introduction, which I see as resonating closely with the approach of our research project – it investigates the implications and impacts of policies beyond their initial intentions. To begin, could you please explain to readers of our blog what Netzpolitik means?

Netzpolitik is not yet a clearly defined term, but several issues have been used to outline what it should include: digital publicity, access to the Internet as a structure, access to content on the Internet, and copyright and data protection. Of course, these all overlap to some extent; they’re not as clear-cut as they seem. But they are what is most commonly understood as Netzpolitik – at least from an activist perspective, and also from a political one.

In the book, you distinguish between different feminist forms of activism on, or with, the Internet: feminist Netzpolitik, cyber feminism, and Netzfeminismus. Can you briefly explain these distinctions?

Yes, so cyber feminism, when considered as part of a historical investigation of digitization and technology, can be viewed as an artistic engagement with everything that took place digitally and technologically in the ’80s, ’90s, and beginning of the 2000s – and it was really not all that regulative. By that I mean it was not much concerned with questions like: How do we want to shape the Internet politically, in the sense of regulative politics? So it was not politics in the sense of feminist activism, but really, let’s call it, “a job description” (Berufsbild) – an artistic engagement. Netzfeminismus would be what connects to that and reads it through an activist lens, grasping the Internet as a space in which one can become active as an activist and negotiate social processes. It was similar in cyber feminism, but I think that was more of an artistic engagement. I distinguish feminist Netzpolitik from those two because it’s really about the processes – regulative political processes. Why is which server located where? Who decides how URLs are distributed? Who decides what standards are used? And so on… I think that’s an approach for feminist Netzpolitik and less for Netzfeminismus, which sometimes takes this up implicitly, sometimes also explicitly – but, generally, they are quite disparate fields.

Regarding the German Network Enforcement Act, or NetzDG, you write in the book that it only very vaguely includes feminist concerns. Can you elaborate on that? What’s missing?

The book is a child of its time, and especially that chapter, because the Network Enforcement Act has progressed a bit in the meantime. But, on the whole, I think the criticism remains that, first of all, law enforcement is being privatised – or, let’s say, the first step of law enforcement is being privatised – in the sense that the platforms decide what is legally legitimate and what is not; what you have to report and what you don’t. From a feminist perspective, this is difficult, because we live in a society that is patriarchally structured, where things like rape jokes are made – frequently, actually – and other problematic things are said freely, not just those that are relevant to criminal law. Or, in other words, the legal system doesn’t necessarily protect women first, or people who are read as women, or protect Black persons first. Instead, it protects white masculinity, the subject from which everything else is defined. That’s why it’s a bit difficult, or let’s say a balancing act, to argue for legal regulation from a feminist perspective. And that’s what I still find difficult about the NetzDG – in the end, the large corporations decide what is relevant for criminal prosecution and what isn’t. And I also have a problem with the structural displacement it entails: The state has a certain responsibility to care for its citizens, and this policy simply shifts that responsibility to the outside and says, “You do this. You also provide the care, and then, down the line, we’ll just check in to see whether you did it right or wrong. And if there’s any doubt, we’ll help.” This issue hasn’t changed, but some things have improved a bit since the book.

Recently, I was on a panel with someone from the field of media protection, and she was enthusiastic about everything. She said, “The NetzDG helped a lot! There is much less hate!” And I thought, I don’t know – most things are deleted according to community standards and not according to the NetzDG at all, because the hurdles are just insanely high if you want to have something on Facebook deleted under the NetzDG. And you’re also just a bit afraid that you’ll somehow be half arrested or something if you’ve given the wrong information. And we’re not all criminal lawyers, so we don’t know the details of the law for determining the legality of postings.

Regarding the law itself – and this concerns the penal code – many issues that relate to women, or to other people who experience sexual violence, including sexualised violence in communication, are simply not covered, because gender is just not a protected category, and race even less so.

In the book, you show how financially driven actors, and not just activists, have played a role in promoting political responses to potential Internet abuses and violence, and also that the current governance structures in Germany assign responsibility for surveillance and content regulation to Internet platforms. You describe this as a “privatisation” of law enforcement. The platforms’ interests, you point out, are above all financial in nature. Increasingly, however, the platforms are compelled to work in tandem with other structures that bring different logics to their work, such as the German Federal Criminal Police Office. Within this complex field of interrelated actors, how might we better recognise and de-centre or de-prioritise the financial interests of these companies?

Well, it’s of course not possible to remove their economic motivations. We would have to change the capitalist system, and we are not going to do that now; we have to be realistic. But to work in that direction, I think it’s important to first establish that this is the case, and to give users the possibility to choose – to give them that knowledge, firstly, and to say, “Listen. With your data – especially with your interactions with other users, so not only your login data, but everything you do online – other people can earn money.” That’s the first thing. And the other thing, the much bigger issue I think, is that we’re not clear as a society about how we want to communicate, about what is and should be permissible, and what is not or should not be permissible. This process of negotiation has been going on for some time now. I think we have to be clear, again, that this negotiation should not be driven or influenced by financial interests – by the interests of companies that actually have only financial motivations, or that want to do everything in a particularly conservative way.

You focus on hate speech in the sixth chapter of the book, which is about feminist Netzpolitik and digital violence, and you criticize the fact that sexual, homophobic and transphobic violence is not explicitly named in the Council of Europe definition of hate speech. This is a very important point and it had me thinking. Other forms of hate are also not explicitly mentioned, such as hatred towards homeless people or persons living with particular illnesses, such as HIV or COVID-19. Do you have thoughts about how to respond to these inevitable failures of the politics of naming? This has long been a topic in feminist and queer studies, and one we surely won’t resolve so easily, but are there alternatives here?

Maybe definitions must remain agile – because levels of knowledge change, because perspectives change. I actually like the Council of Europe definition, despite the fact that these points are missing, because other definitions of hate speech don’t include everything either. Some have something about human dignity and human rights, and I find that difficult, because in the end you still don’t name whose human rights are at stake. So that’s why I like this one so much: because it’s quite clear, despite everything. And it’s relatively old now, from ’97 – for the Internet age, it’s ancient. (laughs) But despite everything, it’s a relatively clear definition, and that’s why I still like to use it.

One major concern that you present in the book is about data surveillance, or dataveillance, and the oscillation of its functioning between care and control. You refer to it as a desire for “safety” that is also a desire for “inflexibility”. You also offer the following question as a helpful one to bring along when mobilising an intersectional analysis: “Which women are affected, with what other discriminatory features, and how?” Building on this question, how has surveillance changed in recent years as part of the Internet, and how are groups differently implicated in this history?

Surveillance no longer starts from individual questions; it no longer means looking at individual persons or targets. Instead, whole contexts can now be monitored, almost casually, and I think that’s changed through technologization, and also the Internet, but probably mostly through the technology that’s driving it. Despite everything, our social coexistence in terms of power and domination has not changed. That means that white cis hetero masculinity is still at the centre and is also the positioning that is least – well, not least affected by surveillance, because we are all affected by surveillance in some way, but least impacted by its consequences. The impact of racial profiling, for example, is more likely to be felt by Black people; or the impact of the fact that we have body scanners at airports everywhere, or increasingly so, is more likely to be felt by trans people, who can have a different gender in their passport than what is shown on the scanner. And that’s why it’s usually the case that people who are already affected by discrimination – not only by discrimination, but by the structures of discrimination by the state, including these surveillance structures – are more likely to be affected and restricted, and to feel fear. So for them it’s not a security issue, but always an issue of insecurity.

In the book, you also make a call for more quantitative research on digital violence among BiPOC (Black, Indigenous, People of Colour) and LGBTIQA (lesbian, gay, bisexual, trans, inter, queer, asexual) persons, and persons with migration backgrounds or disabilities. Are there warnings that we should take with us with regard to quantitative research projects? Or how do we engage in that type of research without it becoming another form of surveillance, or a normalising force?

I understand such questions of science or research in this context more as a tool to make policy, not so much as a form of surveillance. But I honestly have to think a bit more about it. In the area of digital violence, I’ve been hoping for this research because there simply is not yet intersectional data in this form – to provide empirical input about what is happening. That is always a point from which you can start to argue, to advocate. And I think that’s very important. The things that exist in the German context are not very helpful from an intersectional perspective: a) because they have a wishy-washy definition of hate speech – not very precise, something like “xenophobic” and so on, and then everything can be hate speech; and b) because of the categories that are used – and this is the case in Germany for good reasons, which is why it’s going to be difficult: only gender, age, maybe a self-assigned label in terms of positioning, but more likely not even that, and migration background is there, but, again, this is another category that can mean everything. It’s difficult.

For some of the largest platforms, algorithms already play a big role in filtering out content that might be hateful. In your book, you present some of the effects of algorithms on the Internet – especially how they come to take on the sexist, racist, homophobic and transphobic logics of the societies within which they are created. With regard to hate speech, the decisions of so-called cleaners – persons who work for companies hired by platforms to respond to flagged content – are being fed back into these algorithms to refine them further. Some hope that algorithms could drastically reduce, if not entirely eliminate, human involvement in filtering out hate on the Internet, which is significant given that the humans who take on these jobs are faced with overwhelming amounts of hate on a daily basis. What do you see as the potentials, and limits, of using algorithms to respond to hate online?

First of all, I don’t think that algorithms can filter out hate speech so finely that in the end we will only need algorithms. By now, they can probably filter out certain images, videos and so on without any problems, but everything that is somehow unclear – I don’t think they can pick those things up. So I don’t know, but I would not think it’s possible for this not to be biased in the end somehow, because we still only have a feeling for what hate speech, or digital violence, exactly is – at least beyond the things that are relevant under criminal law. There are things one person might feel are not good, and are violent, and then someone else could have a completely different feeling. But these experiences and perceptions are often based on structures. So just because we don’t recognise something as violence that someone else experiences as violence, that doesn’t mean it isn’t violence. And I think you’re not going to solve that through technology, for one thing. The other thing is, for me, algorithms won’t solve our social problem of hate. They moderate something away, but we still have a society that hates, and communicates that hate, and will simply find ways to get around it. And I think that’s why, for me, it’s first about social processes, and only second about technology – not using technology to solve social processes. We have to manage that ourselves somehow. I mean, I am sceptical about using technology to solve problems that we, as humans, have to solve with each other. If we can’t do that, then technology won’t do it either, because technology simply takes over our problems, continues them somehow, and maybe pushes them into a new place where we can no longer really intervene. So if technology can support us, then great. But I think if it’s used to replace us and the social and political work we have to do, it won’t work.

In the book, you suggest that the collectivisation of rights, rather than the current individualisation of rights claims, might be a possible future direction to take in the area of Netzpolitik. Can you explain how that might look in relation to online hate?

The collectivisation of rights, or the enforceability of women’s rights as a collective claim, has been sought from a feminist perspective for a relatively long time – for the gender pay gap, for example, and for other things that particularly affect women as a group. But this hasn’t really been put into action in Germany yet. Collective action already exists for many things, but not yet for women’s rights. If multiple women are affected by digital violence, then one could try not to drag every single one of them before the court – where each either has to file a complaint and deal with the public prosecutor, or go through a civil court. There is already HateAid, a victim-support NGO based in Berlin, but they can only take on so many cases, so they focus instead on strategic litigation, which is great. But it would be clever to design the whole thing in such a way that you pull things together and create something like a collective class action. This would mean that not every individual would have to do it, and it could also build up pressure to act – both on the political question of regulation and on the platforms, so that they finally do something significant, and not just sharpen their community guidelines a little bit, always in their own interests, around what is considered sexist or not, or racist or not.

As you said, so much has happened in the year since this book was published. Among other things, new legislation has been passed in Germany, the Act to Combat Right-Wing Extremism and Hate Crime, and also in Europe, the Digital Services Act (DSA). In your book you state that, even if calls for certain regulations triggered by digital patriarchal violence are understandable, the question remains as to whether patriarchal violence can be combated with equally violent structures, for example through sanctioning. To what extent do you view these new laws as embodiments of the violent structures you question, and can they nevertheless be valuable resources for responding to violence online?

To a full extent. (laughs) We haven’t changed anything in the basic structure, so of course these patriarchal, violent structures remain in place. That doesn’t mean that I reject them in their entirety – definitely not. But despite everything, yes, it’s just like technology. In fact, these kinds of laws are just another form of technology. They will stimulate social change at best, but we have to make the changes ourselves.

And do you see hope in these new developments? 

Well, in Germany, some things will change with the European law, above all the NetzDG. I am fundamentally an optimistic person, which means, of course, I have hope. So… (laughs) Above all, I have hope if we critically question the existing systems of administration and then look again at how we actually want to deal with this. The DSA does this. It does a lot, but among other things, it also asks how money is being earned with hate, and tries to prevent that too. And I am also hopeful that our consciousness of this becomes more focused, and that we perhaps find another way to respond. But that’s just one building block. We also see how dependent we are on information from these networks, and how quickly that can simply be turned off, as has happened in various authoritarian countries – and how much media competence is necessary to get around the whole thing.

Why has online hate speech become such an important issue lately? There are various explanations I’ve heard, like particular terrorist attacks or individual murders motivated by hate, and I am curious what you see as the reason.

Yes, well, from a feminist perspective, digital violence against women has always been an issue – ever since the Internet has existed – but it hasn’t been an issue of popular interest. And I think, exactly, it got more intense on a broad level in 2015 with the refugee movement. And that has just not abated. I find this interesting, and I still wonder how we might use these dynamics in a different way – in a way that doesn’t promote social antagonisms.

That leads me to my last question: What made you decide to write this book? 

Well, I found the topic interesting and wanted to think more about it, and to write about how we might negotiate the issue together as a society – or how it could be negotiated from a feminist perspective – in the areas of digitalisation and Netzpolitik, especially in the political sphere. When I started working on this, the discussion was mostly just about Netzfeminismus and how we can push forward feminist concerns on the Internet. Those concerns didn’t necessarily have anything to do with the topic of digitization itself. I’ve always found that to be a bit of a shame, because, from a feminist perspective, we continue to use the structures without question. It’s good that we use them, but we don’t question the design of the structures in which we want to participate. And I found that quite interesting to look at – it actually interested me more than any activist questions. And then I also thought that these linkages between violence and surveillance, which I place next to each other in the book, have not been adequately discussed from a feminist perspective in the area of surveillance today, although that combined lens of course has a long feminist tradition.

Your current work is as Programme Director for intersectional knowledge related to memory and transformation at the Federal Agency for Civic Education. I understand you are currently working on efforts to decolonise the Internet. Can you tell us a bit more about that before you go?

My current work is about trying to centre other bodies of knowledge in discussions of what technology can be, what technology should be, and where technology should change society. One of my main focuses now, with regard to knowledge about transformation, is to apply that to the development of technology. For what transformations do communities need technologies, and for what transformations don’t they need them? And how can you build technology so that it makes sense for those communities? So not just some dude in San Francisco thinking, “That’s mega cool, let’s use it somehow.” Do we all have to use it? Do we all need it?

Thank you so much.

To read further in German, the book is available for purchase here.

The book’s introduction and conclusion have been translated into English and can be found here.