Facebook to use Met Police videos to recognise shooters

Image source, CaptureARC Photography
Image caption,
Facebook will be given access to footage from bodycams worn by Specialist Firearms Command officers during their training

Facebook is to use footage from police body cameras to train its algorithms to recognise videos of real-life shootings.

The technology giant will give the cameras to Metropolitan Police Specialist Firearms Command officers.

It will then capture footage as they carry out regular training.

Facebook was criticised for failing to prevent copies of videos of the Christchurch mosque shootings from being shared on its platform.

The announcement came the day before Facebook and other tech firms were due to face questions from US senators about their efforts to identify and remove violent content from their platforms.

"We did not have enough content depicting first-person footage of violent events to effectively train our machine learning technology," it said in a blog.

The Metropolitan Police said it was "happy to help" develop the technology.

"Technology that automatically stops livestreaming of attacks once identified, would also significantly help prevent the glorification of such acts and the promotion of the toxic ideologies that drive them," assistant commissioner for specialist operations Neil Basu said.

Image source, Getty Images
Image caption,
Firearms instructors have dressed up as terrorists during past training exercises for armed officers in the Metropolitan Police

But one artificial intelligence researcher said it could take many years for the project to come to fruition.

"This is definitely a difficult problem and I don't see Facebook solving it soon," explained Christopher Tegho, a software engineer who specialises in video understanding.

"I guess it could be helpful for Facebook to be able to gather more data from something that is close to a real first-person shooter video to help train its models.

"But we are still not close to the same accuracy levels at being able to recognise what is going on in the scene of a video as we are to recognising what is in a still image."

Facebook also said it had:

  • removed more than 26 million posts related to groups such as Islamic State and al-Qaeda
  • banned more than 200 groups dedicated to white supremacy

"In March, we started connecting people who search for terms associated with white supremacy on Facebook Search to resources focused on helping people leave behind hate groups," it said.