Online sexual harassment is a persistent problem on social media, and Instagram is developing a tool to combat it. According to a screenshot tweeted by app researcher Alessandro Paluzzi, Instagram is working on a “nudity protection” feature for DMs, in which “technology on your device covers photos that may contain nudity in chats.” Paluzzi further points out that users would retain control over their privacy and that Instagram would not be able to access these photos. Instagram’s parent company, Meta, confirmed to The Verge that the tool is indeed in development.
“We’re working closely with experts to ensure these new features preserve people’s privacy, while giving them control over the messages they receive,” a Meta spokesperson said. According to Meta, users can turn the feature on or off; it is designed to shield them from unwanted nude photos and messages. Meta says it will not be able to view the covered photos and cannot share them with third parties. The company plans to share more details when the feature is ready for testing in the coming weeks.
The new feature is similar to the “Hidden Words” tool launched last year, which filters DM requests containing offensive keywords into a hidden folder that users can choose not to open.
Online sexual harassment is on the rise: a report published by the Pew Research Center last year found that around 33 percent of women under 35 had experienced some form of online sexual harassment. Sending unsolicited nude photos or messages online, termed “cyberflashing,” could soon become a criminal offence in the UK if Parliament passes the Online Safety Bill.