A look at Google's hands-off content moderation approach for Google Podcasts, where hate speech from white supremacists and pro-Nazi groups is easy to find (Reggie Ugwu/New York Times)

The platform’s tolerance of white supremacist, pro-Nazi and conspiracy theory content pushes the boundaries of the medium.

He had already been banned from Twitter, but on his podcast he could give full voice to his hateful conspiracy theories.

The podcaster argued that the man in Atlanta who had confessed to killing eight people at massage parlors last week, including six women of Asian descent, was the one who had truly been victimized — the casualty of a supposed Jewish plot.

“Your heart goes out to the guy,” he said.

The remarks, emblematic of a longstanding online network of white supremacists and pro-Nazi groups, weren’t hidden in some dark corner of the internet, but could be found on Google Podcasts, the search giant’s official podcast app that was released for Android in 2018 and expanded to Apple devices last year.

As leading social networks like Facebook and Twitter have taken some steps to limit hate speech, misinformation and incitements to violence in recent months, podcasts — historically fueled by a spirit of good-natured anarchy — stand as one of the last remaining platforms for the de-platformed.

After Twitter last November suspended the account of Steve Bannon, the onetime adviser to former President Donald J. Trump, for suggesting that several officials be beheaded, he continued to enjoy large audiences with his podcast, available on both Apple and Google’s services.

But even in the world of podcasting, Google Podcasts — whose app has been downloaded more than 19 million times, according to Apptopia — stands alone among major platforms in its tolerance of hate speech and other extremist content. A recent nonexhaustive search turned up more than two dozen podcasts from white supremacists and pro-Nazi groups, offering a buffet of slurs and conspiracy theories. None of the podcasts appeared on Apple Podcasts, Spotify or Stitcher.

Google Podcasts is also one of the few remaining homes of Alex Jones, the “Infowars” broadcaster, who was banned in 2018 from Apple, Spotify and Stitcher for repeated violations of their policies on hate speech and harassment. Google, citing its own policies, terminated Mr. Jones’s YouTube account. Last year, it removed the Infowars app from the Google Play store for spreading misinformation about the coronavirus.

But Mr. Jones’s programs are still available on Google Podcasts.

Told of the white supremacist and pro-Nazi content on its platform and asked about its policy, a Google spokeswoman, Charity Mhende, compared Google Podcasts to Google Search. She said that the company did not want to “limit what people are able to find,” and that it only blocks content “in rare circumstances, largely guided by local law.”

That hands-off approach to moderation recalls the original position of social networks like Facebook and Twitter, which have become more vigilant in recent years in their attempts to rein in the spread of harmful content.

Both Facebook and Twitter barred Mr. Trump from posting after the Jan. 6 attack at the Capitol that left five dead and more than 100 injured.

On Thursday, Sundar Pichai, the chief executive of Alphabet, Google’s parent company, joined the chief executives of Facebook and Twitter in testifying before Congress about the spread of extremism and disinformation. He said in written testimony that Google was committed to “providing trustworthy content and opportunities for free expression across our platforms, while limiting the reach of harmful misinformation,” but did not mention podcasts.

Jessica Fjeld, the assistant director of the Cyberlaw Clinic at Harvard’s Berkman Klein Center for Internet and Society, said she was surprised that Google had taken such a “hard-line” posture against regulating its platform.

She compared Google Podcasts’ positioning to that of Parler, the largely unregulated social network that was a hotbed for disinformation and extremist groups before the largest tech companies turned away from it.

“Google is perfectly well aware of how to moderate content if it cares to,” said Ms. Fjeld. “It seems like they’ve made a decision to embrace an audience that wants more offensive content rather than constrain that content for the sake of safety and respect.”

Google Podcasts, like most other podcast players, including Apple’s, doesn’t host content on its own servers. (Spotify and Audible are prominent exceptions to this rule). Instead, it aggregates RSS feeds — a standardized web format that allows users to receive regularly updated content — that are hosted by third parties.
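The aggregation model described above can be sketched in a few lines. The feed XML and URLs here are hypothetical, but the structure follows the standard RSS 2.0 podcast format: a directory app stores only the feed, while the audio files referenced in each `<enclosure>` tag remain on the publisher's own servers.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical RSS 2.0 podcast feed of the kind a directory
# app aggregates. Only the feed itself is indexed; the audio lives at
# the third-party URL in the <enclosure> tag.
SAMPLE_FEED = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Show</title>
    <item>
      <title>Episode 1</title>
      <enclosure url="https://example.com/audio/ep1.mp3"
                 type="audio/mpeg" length="12345678"/>
    </item>
  </channel>
</rss>"""

def list_episodes(feed_xml: str):
    """Return (episode title, audio URL) pairs from an RSS feed."""
    root = ET.fromstring(feed_xml)
    episodes = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        enclosure = item.find("enclosure")
        url = enclosure.get("url") if enclosure is not None else None
        episodes.append((title, url))
    return episodes

print(list_episodes(SAMPLE_FEED))
# [('Episode 1', 'https://example.com/audio/ep1.mp3')]
```

This is why removing a show from a directory means delisting its feed, not deleting any files: the platform never held the audio in the first place.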

When Apple bans a podcast, such as Mr. Jones’s, it removes the RSS feed from its directory. Google Podcasts says it’s unwilling to take that step except in a narrow set of cases.

The services’s content policy compares its function to Google Search’s aggregation of links. Unless a podcast violates the rules that apply to links appearing in Google Search, most of which are based on legal restrictions, it will remain on Google Podcasts. Google does draw distinctions when it comes to which podcasts it will promote, and thus make easier to find. It says it does not recommend content that is “inappropriate, insensitive, or offensive in nature,” and goes on to cite specific areas of concern, including content that is harassing, hateful, deceptive or dangerous.

Though the company likens its podcast platform to search, Google Podcasts’ own product description notes significant experiential differences, including the ability to manipulate playback speed, create playlists, and download and store content.

But there is at least one connection between Google Podcasts and Google Search. In 2019, Google began integrating podcasts into search results using its own platform, making it possible to play content directly from the results page. All podcasts, including those featuring hate speech, currently benefit from this feature under Google’s policy.

In the early days, content moderation in podcasting was virtually nonexistent. Apple, the industry’s largest and most influential player, which added support for podcasts to iTunes in 2005, at first paid little attention to the nascent ecosystem, opting to serve largely as a delivery vehicle.

The company’s benign neglect was a boon for many creators. Popular comedians (Adam Carolla, Marc Maron) and public radio exiles (Kaitlin Prest, “The Heart”; Nick van der Kolk, “Love + Radio”) took advantage of the medium’s low barrier to entry to find audiences. Many prized freedom of expression, unburdened by the Federal Communications Commission, which regulates radio and television broadcasters.

“Everyone who had an idea had a place to explore it,” said Johanna Zorn, a longtime public radio producer and the co-founder of the Third Coast International Audio Festival. “There was suddenly a variety of voices and experiences that would never have existed otherwise.”

But not all of those voices, part of a chorus of more than one million podcasts in existence today, were virtuous. And with more popularity — 80 million Americans listened to a podcast at least once a week in 2020, according to Edison Research — came more scrutiny.

Many provocative podcasts, including several hosted by fringe and far-right figures, exist on nearly all the platforms. But the decision to ban Mr. Jones signaled a new willingness among leading services to take action against content they consider beyond the pale.

Noah Shanok, the co-founder and former chief executive of Stitcher, which was the first podcast platform to ban Mr. Jones, said he believed that podcasts provide a unique form of passive, “lean back” entertainment that benefits from curation.

“The more you move in that direction, the more onus there is to police content,” he said.

Experts debate whether dangerous speech poses the same threat on podcasts that it does on social media. Keri Hoffman, the chief executive of PRX, a nonprofit network that distributes “This American Life” and other popular programs, noted that while social media allows users to send near-real-time messages to potentially billions of users, podcasts, which take more work to discover, have more limited reach.

“If radio was like talking into a great big pipe, podcasts are like talking into a cocktail straw,” she said.

But the insular, self-selecting nature of podcast communities presents its own challenges.

Michael Edison Hayden, a senior investigative reporter with the Southern Poverty Law Center, said that podcasts were a “fundamental building block” of the deadly “Unite the Right” rally in Charlottesville, Va., in 2017, helping organizers who had been banned from more mainstream platforms cultivate a national following.

“Podcasts were instrumental in radicalizing and galvanizing that movement,” he said.

For Ms. Hoffman, the health of the ecosystem depends on finding some balance between free expression and safety.

“I really believe the openness of podcasting has been key to its success story so far, and will be in the future,” she said. “But there have to be some guardrails.”
