In Australia, Facebook (NASDAQ:FB) has begun testing a new system that will allow users to upload revealing photos or videos directly to Messenger without the fear of those photos being used against them later as revenge porn. Partnering with the Australian e-Safety Commissioner, the new system would help prevent the online abuse of minors and the sharing of nonconsensual explicit media online.
Many users who share such images or videos online favor this kind of content protection. When uploading content to Facebook or any other site, there is always a fear that it could later be used to harass or bully the original uploader. The problem has grown rapidly as content sharing has spread across social sites.
The social network’s new approach would derive a digital fingerprint, or “hash,” from uploaded media, using matching technology similar to the face-recognition algorithms Facebook already employs. The hash would let the platform block anyone other than the content owner from reposting the images or videos later. The updated system would also analyze metadata to detect a third party tampering with the original content’s tagging, similar to measures Facebook implemented back in April in its last attempt to curb the spread of revenge porn.
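Facebook hasn't published its implementation, but the basic idea of hash-based blocking can be sketched in a few lines. Everything below is hypothetical: the `media_fingerprint` helper, the sample bytes, and the use of a plain cryptographic digest, where a production system would use a robust perceptual hash rather than an exact-match digest.

```python
import hashlib

def media_fingerprint(data: bytes) -> str:
    """Return a hex digest serving as an exact-match fingerprint of the media."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical image payload a user submits for protection.
original = b"\x89PNG...example image payload..."

# The platform stores only the fingerprint, not the image itself.
blocklist = {media_fingerprint(original)}

def is_blocked(upload: bytes) -> bool:
    """Reject any upload whose fingerprint matches a protected image."""
    return media_fingerprint(upload) in blocklist

print(is_blocked(original))        # an exact re-upload is caught
print(is_blocked(b"other bytes"))  # unrelated content passes through
```

One design point this sketch makes concrete: storing hashes instead of images means the platform can match future uploads without retaining the explicit media itself.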
Facebook’s reputation regarding privacy and consumer trust hasn’t always been stellar, leaving a lot of users leery about posting explicit photos and videos directly to Messenger for private use. Many users will still fear revenge porn postings because of well-documented manipulation of AI algorithms in the past.
Facebook may have more work to do to refine the new system and protect its more risqué users. Hackers and digital security researchers believe the system could still be bypassed by embedding enough hidden code in the image metadata. Past systems have been outsmarted by simply altering images to look slightly different or applying filters to distort them.
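The weakness described above is easy to demonstrate: an exact cryptographic hash changes completely when an image is altered even slightly, which is why matching systems lean on perceptual hashes that tolerate small edits. The toy `average_hash` below is a simplified stand-in for such an algorithm, and the 4x4 pixel grid is invented for illustration; real systems operate on downscaled full images.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set where the pixel
    exceeds the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return [1 if p > avg else 0 for p in flat]

def hamming(a, b):
    """Count differing bits between two hashes; small distance = likely match."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical 4x4 grayscale image and a uniformly brightened copy,
# standing in for a filter a reposter might apply to dodge detection.
original = [[10, 200, 30, 180], [90, 40, 220, 15],
            [60, 170, 25, 140], [200, 35, 110, 80]]
brightened = [[p + 5 for p in row] for row in original]

# An exact cryptographic hash no longer matches after the tiny edit...
print(hashlib.sha256(bytes(sum(original, []))).hexdigest() ==
      hashlib.sha256(bytes(sum(brightened, []))).hexdigest())  # False

# ...but the perceptual hash is identical, so the match survives the filter.
print(hamming(average_hash(original), average_hash(brightened)))  # 0
```

Uniform brightening shifts every pixel and the mean by the same amount, so each above/below-average bit is unchanged; more aggressive distortions would flip some bits, which is why matchers accept any hash within a small Hamming distance rather than requiring equality.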