Sarah T. Roberts’ book, “Behind the Screen: Content Moderation in the Shadows of Social Media,” was released a year ago, bringing to light the hidden workforce that is hired by social media corporations to “sanitize” the internet by alerting their employers to objectionable – sometimes violent or criminal – material that is posted by users. Often, this content is seen by millions before it is removed from social media outlets. While many viewers worldwide will be unscathed by this content once it is removed, the labor force that works for hours each day viewing and removing the worst aspects of human behavior cannot unsee what they are paid to watch for.
This fall, “Behind the Screen” was translated into French and published by La Découverte, to positive reviews in Le Monde. Roberts, an associate professor in the UCLA Department of Information Studies, says that the book’s appeal goes beyond an academic perspective of commercial content moderation (CCM), a phrase she coined when beginning her research more than a decade ago.
“The way that we as a society are thinking about the outsized impact of social media, for example, on the political process [is] just something that comes to mind,” notes Roberts, who is the co-director, along with Safiya U. Noble, UCLA associate professor of information studies, of the Center for Critical Internet Inquiry at UCLA.
“People are seeking more information about how these platforms operate and how the social media we consume, a large portion of which may include news – or pseudo-news in some cases – is created and circulated, and on what grounds.
“Although my book is focused on [CCM] workers and their experiences, my argument is that their experiences of their work life and what they do on the job are both inextricable from those larger issues,” she says. “So, in knowing something about how the work is conducted and what its impact is we can then know more about this bigger environment that we all find ourselves in, and that many of us are questioning.”
Since the release of “Behind the Screen,” the world has been plunged into a global pandemic, acquired a greater awareness of the racial, economic, and educational inequalities that exist, and watched the United States undergo a historic and polarizing presidential election. Roberts is planning to revise her introduction in light of this new environment for the paperback edition, to be released by Yale University Press in spring of 2021.
Ampersand spoke with Professor Roberts on the relevance of “Behind the Screen” to the French public, whose experiences with social media and its ability to divide society mirror the U.S. experience, and how the COVID-19 pandemic has compromised the reliability of CCM and further challenged its workers, who are a unique type of “first-responder” to the internet that society has come to depend on – even more so in the pandemic – for almost everything in everyday life.
Ampersand: How has “Behind the Screen” proven relevant to France and Europe in general?
Sarah T. Roberts: In France, as in many countries in the EU, they have been struggling collectively, and for some time, with values and collective norms. France is a pluralistic society grappling with those issues, and it’s very polarizing there, as it is elsewhere.
Of course, social media has a role to play in not just reflecting values and norms back to us as users, but I would argue that it’s more than a mirror. In some cases, it’s a magnifying glass and we know that the magnifying glass can both make larger a particular small issue, and we know that it can draw attention and help us focus.
But we also know that you could take a magnifying glass and use it to light a fire. I think that’s probably a better metaphor than even just holding up a mirror. There’s an amplification and there’s a crystallization that happens in our social media platforms with regard to values and principles; fundamental social issues are played out and argued and debated.
I think France is very much in that moment as well. In fact, there was a very disturbing incident recently where a school teacher was murdered by a religious extremist, and so that ignites all kinds of very polarizing debates in France about religion in a society that has long held a principle of religion being separate from the state and of society being secular. But in France, as in other places in the West, extra scrutiny is given to people publicly expressing Muslim faith more so than any other. It has caused deep lines of fracture and profound social debate, not to mention the passing of laws against, for example, the wearing of a full veil by Muslim women.
It’s reignited or further ignited, I should say, issues around immigration and one’s right to one’s own sovereign cultural expression versus the need or the demand to somehow blend in to the social fabric as it has been traditionally thought of in France. These are arguments that happen against the backdrop of many, many decades of colonialism, so it’s not as if a wave of, for example, North African immigration shows up to France for no reason. It’s because of these longstanding bi-directional movements between France and its erstwhile colonies (although it was not a North African immigrant responsible for the case I just mentioned). The bottom line is that these are really hot-button issues there. I think that in many ways, my book talks about this place called “social media,” where so many of those issues are getting aired and battled out and exacerbated.
Many European countries, particularly within the EU — which itself has this kind of umbrella mechanism of creating policy and law that covers a number of member states – both the individual states and the EU itself are struggling with how to rein in and account for social media legally and from a policy perspective in a way that represents their mores and values – and, by the way, what are those and how do we agree on those. All of these are very thorny, very timely issues.
There are a number of measures before the French National Assembly dealing with social media expressly, but as I was speaking to some French reporters, it seems that, just as in the United States, the legislators are some of the least informed people, unfortunately, about what goes into things like adjudicating social media content. They want to, very reasonably, I think in some cases, put some parameters on the companies and make them do a variety of things to adhere to local laws and standards, but they have no idea what that would really take or what that would really look like in terms of implementation. And, there isn’t even an agreement across the board that those moves constitute a good thing. I mean, at the same time, we have plenty of people and groups who advocate for more openness on those channels, even though openness means open for all kinds of voices and opinions, including ones that might be quite disturbing and racially, culturally and religiously polarizing. So that’s sort of the context in which my book has dropped in France.
This is a really acute issue in France as it is here and reporters, journalists, and others see that connection between what I described in the book, about the working conditions and the demands on the workers who do this moderation of social media content and its connection to what the output is on the user side: what we see and what we experience as we navigate those platforms and how that ecosystem is created.
&: What has changed – or not – for CCM workers in the wake of the COVID-19 pandemic?
Roberts: I haven’t directly spoken with people in the book recently, many of whom have moved on from this work, but I’ve certainly been monitoring the industry. And back in April, when things were first really becoming a fully global [pandemic] and we knew that we were in for a long haul, it became clear that quarantining and no longer going to a workplace was absolutely going to affect the social media industry with regard to this major workforce around the world in the form of its moderators, many of whom were in very different places from Silicon Valley.
Quarantine orders were coming down and, in places like the Philippines, entire sectors of a business were closing because those employees who did service sector jobs for North American corporations just couldn’t safely go to work. For content moderation specifically, it’s especially complicated because there’s so much import put on privacy. So, the idea of having workers at home doing the work from home really didn’t seem appropriate.
And, you have the added issue of in many places around the world, including places in the United States [that cannot] count on reliable internet service. Even if the social media firms thought, “Okay, well we can kind of relax some of our restrictions on privacy … and we let people do the work at home,” what’s to say they really can even reliably do it?
An interesting thing was happening, which was that on the social media platforms – Twitter and Facebook in particular – if a user wanted to report some [troubling or prohibited] content … they began receiving messages in response that said things like, “Due to the COVID-19 crisis, our response times are much slower,” [or] that offered other mechanisms to try. Content moderators were dispersed and displaced [and] … on the user side, we were being made aware that there were people missing, and that that absence was going to affect what we saw on the platform.
&: Were there any measures to automate the work?
Roberts: They were using more computational and automated systems without as much human oversight, which was actually yielding some false positives, because those systems tend to be more conservative and err on the side of taking [suspicious material] down because they’re not as good at nuance. So, users were seeing the outcomes and effects of the absence of the content moderators from around the world who were quarantining where they were, and that included sites in the United States.
There’s an interplay between automation and the humans who kind of do oversight on some of that, or who could overrule, let’s say, a decision where [it looks like content] should come down, but in fact, [humans] can tell there’s a nuance here that the computer can’t really read. Well, those content moderators were not at work and able to make those more complex decisions, and so when automation was used, it wasn’t as good as it could be or as it is when it’s used in conjunction, or in a hybrid fashion with human workers.
I actually wrote a piece back in April for Slate, and I called it, “The Great A.I. Beta Test,” because in a way, this was an opportunity for the platforms to … see what does it look like if they were to rely almost entirely on automation. I think the fact that they are bringing all the content moderators back to work just as quickly as they can is a testament to the fact that we are not in a place where a move to computational or automated content moderation across the board is a desirable choice.
&: Have there been any improvements to CCM’s working conditions or any increase in psychological supports for workers?
Roberts: There really is not an across-the-board union of workers in this industry. They’re all so globally dispersed. Over the past couple years, Facebook for one, was doing things like just raising the base pay for everyone who was doing this work. In a lot of places, that base pay was raised to $15 an hour, which is not a lot. In some of the more expensive metro areas, where just the cost of living is high, they raised that minimum wage to $18 an hour, but that’s just not a lot of money and it doesn’t seem commensurate with either the important role these people play, which we’re now seeing even more so, or with the psychological challenge that the work poses and its risk.
They were also putting some counselors in place in these call centers, and I’m not cynical enough to think the gesture was without value. I think they were legitimately trying to do something with that. But many of the workers I’ve spoken to over the years who’ve had counselors on site have said that it’s really uncomfortable to go see the counselor, who’s sitting in the same place where you work. They would prefer a private [consultation], or one outside of work. And it’s very difficult for them to feel comfortable speaking to a counselor who’s employed, in fact, by their employer, and to feel assured that there’s an appropriate siloing of their issues from management. In some cases, there have been leaks from the counseling side, back to the firm.
So, there are these improvements that the major social media firms are seemingly trying to make, but I think we’re a long way off from making this job something that I could recommend to anyone at this time. It’s just very, very difficult, and I’m not sure that the benefits outweigh the costs.
&: CCM workers are yet another group of first-responders.
Roberts: Exactly. We’re seeing that necessity, and we have been reminded throughout this crisis that there are people in our communities and in society who are playing fundamentally important roles that make it possible for the rest of us to live ours. Every day, people who are essential workers … are not treated as essential. [Their] work is devalued economically and socially all the time — people who are working in grocery stores, for example, people who are doing retail sales, and people who are doing service calls to homes for a variety of reasons.
The medical profession tends to get more respect, but everyone who’s involved in the running of clinics and hospitals from janitors, sanitation workers, chiefs of the ER, community health organization workers – all of those people are needed to make it happen. Yet, many workers kind of fall into a place where we don’t miss them until they’re no longer there. For example, it’s when the sanitation workers go on strike and the garbage starts to pile up, that’s when the city will take notice.
I think it’s a little bit like that with social media moderation, when those folks who do that work are gone. Suddenly, we feel that absence, but when they’re there, because their efforts have been designed to be seamless and friction-free, they don’t draw attention to what they’re doing under normal circumstances. It’s therefore easy for most users to not even know that they’re working to make these spaces bearable for the rest of us. In that way, they share characteristics with the kinds of people in jobs I just mentioned.
Going to [work at] a call center in the middle of a pandemic and being in the space is a risk right there. I know that industry is taking steps just like every other industry to protect the workers – not having people right up on each other in the physical layout of the call center, using masks, disinfection, using other kinds of mechanisms. But some of these call centers exist in states where COVID-19 is running rampant and the state government is not following CDC recommendations. Those poor public policy decisions will invariably trickle down to affect the well-being – or lack thereof – of content moderation workers in those environments.
Hopefully, one thing that can come out of this crisis, once we’re finally out of it, is that we can remember some of these lessons about placing value on workers and on people who are really putting themselves on the line for us in so many ways. That’s about the only silver lining I can still cling to.
Photo by Stella Kalinina