Published: Thu, November 09, 2017
Sci-tech | By Javier West

Facebook seeks nude pics from users to tackle 'revenge porn'

Back in April, Facebook took steps aimed at combating revenge porn in Canada and the U.S., allowing users to flag an image they suspect was posted without consent.

The user is then told to delete the image from Facebook Messenger.

According to the e-Safety Commissioner, one in five women aged between 18 and 45 and one in four Indigenous Australians are victims of revenge porn.

Australian e-Safety Commissioner Julie Inman Grant was at pains to point out that the images would not end up on the social network's servers.

Once enrolled, users are asked to send the images to themselves on Messenger. The method of transmission, and the length of time the photos are held, raise fears that submitted images could be intercepted in transit or while stored; moreover, hashing technology can be fooled by users simply resizing or cropping images.

Worried you could become a victim of revenge porn? The process starts by contacting the e-Safety Commissioner or a regional equivalent (the e-Safety Commissioner is an Australian position, and the test is being carried out in Australia), after which you will be advised to send the photo to yourself. It remains to be seen, though, how confident users will be about handing their intimate images and videos to Facebook, given the company's poor reputation on privacy and consumer trust.

"They're not storing the image, they're storing the link and using artificial intelligence and other photo-matching technologies", Grant said. The US, UK and Canada will also participate in the pilot with Facebook.

Once you send the private image to yourself, Facebook's technology will "hash" it, which is a high-tech way of saying it will create a digital footprint, or link, for the image. "Of course, we always encourage people to be very careful about where they store intimate photos and preferably to not store them online in any form".
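Facebook has not published the details of its photo-matching technology, but the general idea of a perceptual hash can be illustrated with a simple "average hash": the image is shrunk to a tiny grayscale thumbnail and reduced to a short bit string that can be compared against reuploads. The sketch below, in Python with the Pillow library and hypothetical file names, is purely illustrative and not Facebook's actual system.

```python
# Minimal sketch of a perceptual "average hash" (illustrative only; not
# Facebook's photo-matching technology). It reduces an image to a 64-bit
# fingerprint and compares fingerprints by Hamming distance. Heavy cropping
# changes the thumbnail, which is why naive hashing can be defeated.
from PIL import Image

def average_hash(path, hash_size=8):
    """Return a 64-bit fingerprint: one bit per pixel of an 8x8 grayscale thumbnail."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same underlying image."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: a re-saved or mildly resized copy of the same photo
# should stay within a few bits of the original fingerprint.
# original = average_hash("photo.jpg")
# candidate = average_hash("reupload.jpg")
# print(hamming_distance(original, candidate) <= 5)  # likely the same image
```

In a scheme like this, only the fingerprint needs to be retained to block future uploads, which is consistent with Grant's claim that the image itself is not stored.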

It may seem an odd way to combat explicit content, but Facebook's AI may be better placed to protect people if it already has a fingerprint of the images in question. "They make you sign on to the service, and then they make you report one of three things". Once you report an image, "specially trained representatives" from Facebook's Community Operations review it and, if it is found to violate the social network's Community Standards, take it down.
