Instagram plans to protect users from unsolicited nude photos

Instagram is developing a feature that will enable users to block unsolicited nude photos in their direct messages.

“Nudity protection” will be an optional privacy setting, much like the Hidden Words feature launched last year, which filters out messages containing abusive language and emojis. Using machine learning, Instagram will prevent nude images from being delivered. The company also will not be able to view or store any of the images.

The feature, first reported by The Verge, addresses persistent issues of abuse on the social media app, owned by Meta Platforms Inc. Some 41% of Americans reported experiencing online harassment, and 79% believe social media companies are doing a fair or poor job of addressing such problems, according to a report last year from the Pew Research Center, which surveyed more than 10,000 adults in September 2020. One-third of women under 35 experienced sexual harassment online, compared with 11% of men in the same age range, according to the report.

Nudity protection on Instagram is still in the early stages of development. “We’re working closely with experts to ensure these new features protect people’s privacy, while giving them control over the messages they receive,” a Meta spokesperson said. – Bloomberg