
UK's worst riots in years were incited by online disinformation about asylum seekers

SCOTT SIMON, HOST:

The worst riots in years have occurred in Britain in recent days, and the role of social media is under new scrutiny. The unrest began after three young girls were stabbed to death at a dance class in the English town of Southport. A false rumor that the attacker was a Muslim asylum-seeker quickly spread online, and days of violence followed. Marc Owen Jones is a disinformation expert and associate professor at Northwestern University in Qatar. He joins us from there. Thanks so much for being with us.

MARC OWEN JONES: Oh, thanks for having me.

SIMON: In your judgment, professor, how big a factor was social media in helping to foment the unrest?

JONES: It's a huge factor. The disinformation about the nature of the attacker started on social media, or at least appeared to start on social media, and it went viral. And the riots that happened in Southport on the night of the stabbings were premised on the belief that the attacker was a Muslim. This was not true.

SIMON: Who began these reports? Do we know?

JONES: Well, in some ways, we do. All of the speculation about the identity of the attacker concerned his ethnicity. So there were rumors that he was a Muslim. There were rumors that he was just off the boat and seeking asylum. There were rumors that he was a Syrian refugee.

But one of the most widespread rumors came from someone who gave him a fake name. They said, Oh, we hear that the attacker is called Ali Al-Shakati. This person never existed. I mean, Ali Al-Shakati never existed. But the first person who was documented to use this name was a middle-aged woman living in Cheshire in the U.K.

SIMON: Yeah. What about the role of Elon Musk? Mr. Musk said in a post on his own platform, quote, "Civil War is inevitable." What sort of responsibility do you feel he ought to accept for his own posts?

JONES: I would argue that Elon Musk is the disinfluencer-in-chief and someone who routinely spreads fake information. And Elon Musk, interestingly, took it upon himself to fan the flames of a lot of the disinformation. And by replying to a post, you are amplifying it. He replied to a number of these accounts that were spreading false information about the identity of the attacker. He also jumped on a hashtag attacking Keir Starmer because Starmer had been quite firm in the immediate aftermath of the riots that social media companies need to be held more accountable for their role in spreading disinformation.

And obviously, Elon Musk is someone who took it upon himself, when he took over Twitter, to disband the trust and safety team responsible for making sure that the platform is safe and free from hate speech and disinformation. He also changed the identity verification system from an identity-based system to a credit-based one. So now anyone can set up an account, pay for a blue tick, and by doing so have their content boosted algorithmically. So what you're doing is creating a pay-for-play propaganda system.

SIMON: You understand, professor, one person's hate speech is another person's free speech.

JONES: I mean, sure. At the same time, in a civilized society, we also need to be mindful of the fact that speech that can lead to someone else's harm is dangerous. I mean, liberty in its truest form is individualism, and liberty in that form can also be used to justify me attacking someone else just because I feel like it. So I think in any civilized society, we have a duty of care to people within our community.

SIMON: Are you suggesting there ought to be a lot more government oversight?

JONES: Well, I think so. I mean, I don't think government oversight means censorship. I think when we saw the, you know, Russian interference in the U.S. elections in 2016, that did create a lot of pressure on social media companies. You know, Mark Zuckerberg, if you remember, was called before Congress. This caused those companies to implement more trust and safety measures and to build teams that would at least try to get rid of state-backed disinformation, hate speech and fake news, all right? That pressure was useful. It didn't mean that the social media companies then started censoring things. They just took more responsibility for the content on their platforms.

But I do think, also, social media companies need to stop instrumentalizing free speech by claiming its mantle for themselves. Free speech existed before social media. And these are private companies that are profiting off engagement and our data. If they start to profit off the fact that hateful speech and propaganda and disinformation get more engagement, then I think we've got a problem, because this is no longer about free speech; this is about the free market of speech. And that's a different thing entirely.

SIMON: Marc Owen Jones is a disinformation expert at Northwestern University in Qatar. Thanks so much for being with us.

JONES: Thank you very much.

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.
