Whitney Phillips is a professor at Syracuse University who specialises in internet ethics, online controversies, and digital culture. She is an expert on online antagonists and hoaxing, and authored a report to help prevent journalists from falling prey to social media manipulators.
How do you define an internet troll? How has your definition changed over the years?
I don’t! I never use that term anymore, unless I’m speaking historically about early subcultural trolling. Since 2013, I have been making the argument that we really should not be using the word ‘troll’, especially to describe any kind of identity-based antagonism. The term is so imprecise that when people use it these days, they could be referring to anything from mildly annoying behaviour to outright violent white supremacy.
Because the term tends to have a kind of playful connotation, applying it to violence makes that violence seem less real and less impactful on somebody’s life. It provides abusers a rhetorical out: if they get called out for doing something harmful to another person, they can say ‘Oh, I was just trolling’, and the violence becomes something that can be cordoned off as not really real. It’s the abusers’ preferred term because it minimises their culpability, and reframes the abuse in terms of the other person’s failure to not get trolled.
The term meant something very specific to a very specific group of people in the early-to-mid 2000s. Things started to shift in 2014, when a lot of the early subcultural trolling spaces, specifically 4chan (an anonymous online bulletin board with very few rules), started to undergo their far-right extremist shift, beginning with Gamergate (a harassment campaign against women in the video games industry) in 2014. It continued into 2015; that’s when The Daily Stormer (a neo-Nazi, white supremacist website) and other white supremacist groups were recruiting on 4chan. By that point, the term ‘trolling’ was still being used in many of the same contexts, but there really had been a significant shift, an ugly and violent shift.
The term trolling is so attractive to journalists because it’s easy shorthand. When it comes to behaviours on the internet, I refer to them in terms of the impact they have. I call abuse ‘abuse’.
Why do online antagonists manipulate the media and how has this changed?
The thing that was initially interesting to me as a researcher was the ways that content that started in the troll space would bubble up into more mainstream channels. I started chronicling that in 2009, with the Obama Joker socialism image (a manipulated image showing then-President Barack Obama as the comic book villain the Joker); that was a troll-made image, and yet it filtered up into more mainstream channels and really set the narrative.
That relationship has remained fairly consistent; it’s just that the people operating under those rhetorical practices have really shifted in terms of their violence and bigotries.
The same media manipulation strategies that were employed in the early days of trolling for ostensibly less obviously ‘harmful’ outcomes were later used to essentially compel journalists to amplify far-right messages of hate.
People who are looking to manipulate the media are few in number, and they require the signal-boosting power of mainstream news outlets, particularly centre-left ones. And that’s what the early subcultural trolls were banking on from the very beginning.
How much faith do you have that the media have learned their lessons and won’t be manipulated in the future?
After Charlottesville (an August 2017 white supremacist rally that resulted in the murder of counter-protester Heather Heyer), I noticed an increasing number of journalists becoming more conscious of that cycle of amplification and their role within it.
Lots of people were still willing to use trolling as clickbait, because it performs very well, and those stories tend to be pretty easy to write; they don’t require a ton of investigative effort, often it just means going to 4chan. So there was a lot of incentive to continue publishing these kinds of pieces, and there was a huge audience appetite for alt-right stories framed in terms of trolling.
Charlottesville marked a point where more people were willing to have meta-critical reflections about their reporting and the ways that reporting factored into amplification issues.
The next big moment was after the Christchurch shootings; many reporters are now really inclined to have these kinds of conversations. My Oxygen of Amplification report focuses specifically on how, even when journalists are trying to critique abusers’ and manipulators’ behaviours online, simply calling attention to those behaviours gives them a greater platform than they could ever have enjoyed otherwise, and in the process normalises extremist expression. It also provides blueprints for subsequent attacks.
‘The media’ comprises so many different kinds of publications and so many different kinds of journalists. I have encountered more people within journalism than ever before who are concerned about these issues. That doesn’t mean all journalists are. But I have definitely seen more of a meta-reflective shift after these tragedies, a recognition of the dangers of calling attention to certain things online.
What concrete changes can be made to stop manipulators negatively impacting society? Who should deliver these changes: governments, social media platforms, users?
On the journalism side, it’s critical for journalists to understand that they are not just part of the amplification chain; they’re also often the trophy. When you see stories like QAnon (a far-right conspiracy theory about an American “deep state” that spread on 4chan) and other conspiracies, the goal is to get journalists to report on something. It’s a part of their game.
I would like to say that governments should have a role too, but coming from the United States perspective, I don’t have a lot of faith that anything big will be done.
With users themselves, I don’t buy into the ‘don’t feed the trolls’ logic, because abusers and manipulators should not abuse and manipulate people, period. They’re the problem, not the people they target. But I do think people need to reflect more critically on the ways that all of us play a role in amplification chains. It’s not just journalists. Even if your platform is smaller, by engaging with any content online you feed into black-box algorithms; we don’t really know how they work or what information goes in, we just see the output. So even if you’re commenting on something to critique it, even if you are responding in order to denounce someone’s behaviour, that can still help trigger algorithms en masse.
In the book I’m currently working on with Ryan Milner, whom I wrote my last book with, we talk about this in terms of hurricanes; we use a lot of ecological metaphors to talk about these issues. In the case of hurricanes, you would never point to a single gust of wind and say, ‘Oh, that’s a hurricane’. A hurricane is all of this stuff all at once. And in the case of disinformation online, that includes the things that journalists do, it includes the technological structures that social media platforms provide or enforce, and it includes the everyday actions of everyday people. Just by being in the world online, you can have an impact on it, positively or negatively.
Should more people be taught to consider online ethics?
I think online ethics are critical, but it’s not so much about teaching ethics; it’s a question of helping people understand the ways that technological affordances obscure what the ethical issues are.
Some people actively try to harm others online, but most people don’t set out to be terrible and harmful; their behaviour can still result in harm, though, because of the way social media is structured. Like I said, if you’re responding to something to criticise it, you’re still amplifying it.
Technological affordances lend themselves to a flattening of context. Instead of thinking about the full embodied context, people are often only engaging with individual texts, individual images, individual gifs. The seamlessness with which content travels discourages people from saying, ‘Wait a second. Here’s a funny picture of someone making a silly face. Who is this person? Did they consent to having their photo taken? And do they consent to the idea of random strangers using their image to capture why they hate Mondays?’
It’s not that people are malicious, or that they necessarily want to dehumanise or flatten another person into a meme; it’s that we’re not encouraged to step back and reflect: what are the embodied repercussions of this? What am I not seeing? What do I not know about this story? How are the digital tools I’m using getting in the way of thinking ethically?
Thinking about things in terms of the hurricane, and trying to situate yourself within it, is helpful, and is again a place to start.
On reflection, would you say the web, and the internet more broadly, has had a net negative or net positive effect on society?
I think that’s an unanswerable question.
We can’t speculate about what the world would be like without the internet. I do think that many of the problems we’re facing in terms of information flows are unique and specific to digital landscapes. There has always been mis- and disinformation, but it functions in a different way and has different kinds of effects when you’re talking about hyper-networked communication. That doesn’t mean the world would have been a demonstrably better or worse place, because the flip side is that all of those networking technologies have allowed a lot of goodness to really flourish.
On these kinds of questions, I end up kind of being agnostic, not because I don’t care, but because this is the world that we’re in. Maybe it would be better if it were different, but it’s not different. And so how do we navigate this landscape in the best way that we can, given the challenges that we face?
This interview has been edited and condensed for clarity.