YouTube announces plan 'to fight online terror,' including making incendiary videos difficult to find

Google is intensifying its campaign to fight online extremism, saying it will put more resources toward identifying and removing videos related to terrorism and hate groups. In a blog post Sunday, June 18, 2017, Google said that it will train more workers, called "content classifiers," to identify and remove extremist and terrorism-related content faster. [Associated Press]

YouTube has long struggled with the conundrum of how to police videos that advocate hateful ideologies but don't specifically encourage acts of violence.

The basic problem is such videos don't break any of the platform's specific guidelines. Banning some videos based on ideology and not others could lead to a slippery slope that could damage the primary appeal of YouTube: that users can upload their own content, so long as it's not illegal, without fear of being censored.

On Sunday, Google, which owns YouTube, announced new policies to help police such content in a blog post by Kent Walker, Google's general counsel and senior vice president, titled, "Four steps we're taking today to fight online terror." It also appeared as an op-ed in the Financial Times.

The first two steps focus on identifying and removing videos that specifically encourage terrorism. But, as Walker wrote, that isn't always as simple as it sounds, particularly given the platform's scale: as of 2012, an hour of content was uploaded to YouTube every second, according to AdWeek, which noted that amounts to a century of video every 10 days.

"This can be challenging: a video of a terrorist attack may be informative news reporting by the BBC, or glorification of violence if uploaded in a different context by a different user," Walker wrote.

Currently, YouTube uses a combination of video analysis software and human content flaggers to find and delete videos that break its community guidelines.

The first step, Walker wrote, is to devote more resources "to apply our most advanced machine learning research" to the software, which means applying artificial intelligence to the software that will be able to learn over time what content breaks these guidelines.

The second step is to increase the number of "independent experts in YouTube's Trusted Flagger Program," which is composed of users who report inappropriate content directly to the company. Specifically, Google plans to add to the program 50 experts from nongovernmental organizations whom it will support with operational grants to review content.

"Machines can help identify problematic videos, but human experts still play a role in nuanced decisions about the line between violent propaganda and religious or newsworthy speech," Walker wrote.

The third step, meanwhile, focuses on content that doesn't actually break the site's guidelines but nonetheless pushes hateful agendas, "for example, videos that contain inflammatory religious or supremacist content."

Take Ahmad Musa Jibril, a Palestinian American preacher who espouses radical Islamic views in line with the beliefs of the Islamic State, for example. A 2014 report by the International Centre for the Study of Radicalisation and Political Violence (ICSR) found that more than half of recruits to the militant group, also known as ISIS, follow Jibril on Facebook or Twitter.

One of the London Bridge attackers reportedly became a follower of Jibril through social networks such as YouTube, the BBC reported.

But while these videos may help radicalize certain individuals, the ICSR report found that Jibril "does not explicitly call to violent jihad, but supports individual foreign fighters and justifies the Syrian conflict in highly emotive terms."

Therefore, he doesn't violate YouTube's content guidelines.

Since YouTube cannot delete these videos and others of their kind, the company's basic plan is simply to hide them as best it can.

"These will appear behind an interstitial warning and they will not be monetized, recommended or eligible for comments or user endorsements," Walker wrote. "That means these videos will have less engagement and be harder to find."

"We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints," Walker added.

The final step is to use "targeted online advertising to reach potential ISIS recruits" and then redirect them "towards anti-terrorism videos that can change their minds about joining."

These reforms come at a time when social media companies struggle with the fact that they're often a breeding ground for radicalism. Most, by their very nature, act as global free speech platforms — which often makes them attractive as recruiting hotspots.

During the final six months of 2016, Twitter suspended almost 377,000 accounts for promoting terrorism. The company first announced it would police extremism on its network in 2015. Facebook, meanwhile, announced last week that it, like YouTube, uses a combination of artificial intelligence and human content flaggers in attempts to rid itself of extremist content.

YouTube's need for some reform was arguably the most pressing, though, as companies such as AT&T and Verizon pulled advertising from the site in March because their ads would sometimes appear on videos that promoted hateful and extremist ideologies.

Published 06/19/17 [Last modified: Monday, June 19, 2017 9:16am]

Copyright: For copyright information, please check with the distributor of this item, Washington Post.
    
