Instagram, which is owned by Meta, is developing a feature to shield users from unsolicited nude images in direct messages (DMs) from strangers. Alessandro Paluzzi, an app developer, shared the first screenshots of the feature.
“Instagram is working on nudity protection for chats. Technology on your device covers photos that may contain nudity in chats. Instagram can’t access photos,” he wrote. The Verge received confirmation from Meta that a tool to safeguard Instagram users’ privacy is under development.
“We’re working closely with experts to ensure these new features preserve people’s privacy while giving them control over the messages they receive,” a company spokesperson said.
According to Meta, the technology prevents the company from viewing the messages themselves or sharing them with third parties. The move follows research by the UK-based Center for Countering Digital Hate, which found that Instagram failed to act on 90% of abusive direct messages “sent to high-profile women” on the platform.
Instagram made the profiles of users under 16 private by default last year in an effort to give young users a safer, more private experience on the platform. The change made it harder for potentially suspicious accounts to find young people and limited the ways businesses can target teenagers.
The company has also developed technology that identifies accounts exhibiting potentially suspicious behavior and blocks them from contacting young people’s accounts.