Facebook is collaborating with four countries to test a pre-emptive method for detecting and stopping revenge porn. But the strategy used in the pilot scheme has raised many eyebrows, as it requires users to upload their own nude photos or videos to the Messenger app.
“We see many scenarios where maybe photos or videos were taken consensually at one point, but there was not any sort of consent to send the images or videos more broadly,” e-Safety Commissioner Julie Inman Grant told ABC, adding that one in five Australian women, aged between 18 and 45, are victims of revenge porn.
“So if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded,” Grant said.
Although Facebook says it uses a hashing system so that the photos or videos themselves are never stored on its servers, the approach still raises privacy concerns, not least because researchers have repeatedly demonstrated ways to trick machine-vision systems.
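The idea behind hash-based blocking can be illustrated with a minimal sketch. Facebook has not published the details of its system, which reportedly uses perceptual hashing (so that re-encoded or lightly edited copies still match); the stand-in below uses a plain cryptographic hash purely for illustration, and the function names (`fingerprint`, `report_image`, `allow_upload`) are hypothetical.

```python
import hashlib

# Only hashes are retained in this set -- never the images themselves.
blocked_hashes = set()

def fingerprint(image_bytes: bytes) -> str:
    # Hypothetical stand-in: a cryptographic hash of the raw bytes.
    # A real system would use a perceptual hash, which also matches
    # slightly altered copies of the same image.
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    # Record the reported image's hash; discard the image itself.
    blocked_hashes.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    # Any later upload with the same hash value is rejected.
    return fingerprint(image_bytes) not in blocked_hashes
```

Note the limitation this sketch makes visible: with a cryptographic hash, changing even a single byte of the file yields a completely different hash value, which is exactly why production systems rely on perceptual hashing instead.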
“Yes, they’re not storing a copy, but the image is still being transmitted and processed. Leaving forensic evidence in memory and potentially on disk,” Lesley Carhart, a digital forensics expert, told Motherboard.
“My speciality is digital forensics and I literally recover deleted images from computer systems all day—off disk and out of system memory. It’s not trivial to destroy all trace of files, including metadata and thumbnails,” Carhart added.
According to Facebook’s Head of Global Safety Antigone Davis, Australia is one of four countries participating in the “industry-first” pilot project, which uses “cutting-edge technology” to prevent re-sharing of intimate images and videos.
The pilot comes seven months after Facebook rolled out photo-matching technology in April to ensure users cannot re-share images that had previously been reported and tagged as revenge porn.