Instagram finally kickstarts work on ‘nudity protection’ update

Instagram urges people who send nudes through direct messages to remove them immediately
An undated image of Instagram logo. — Pixabay

Instagram, the popular picture-sharing platform, is finally working on a "nudity protection" update to shield its users from unsolicited nude photos.

According to a Daily Mail report, the Meta-owned app has urged people who send nudes through direct messages to remove them immediately.

Similar to technology found in dating apps, the feature uses artificial intelligence (AI) to automatically detect genitalia in photographs sent over DMs and hide the images from view.

However, the recipient can still tap to view the photo in full, even if they are as young as 13, a safeguard one children's charity has called "entirely insufficient".
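Meta has not published implementation details, but the flow described above amounts to a detect-then-blur pipeline with an opt-in reveal. The sketch below is a minimal illustration of that idea in Python, assuming Pillow for image handling; `classify_nudity` and the 0.8 threshold are hypothetical placeholders, not Meta's actual model or settings.

```python
from PIL import Image, ImageFilter

NUDITY_THRESHOLD = 0.8  # assumed confidence cutoff, not Meta's actual value


def classify_nudity(image: Image.Image) -> float:
    """Hypothetical stand-in for the nudity classifier.

    A real system would run a trained vision model here; this
    placeholder always returns 0.0 so the sketch stays runnable
    without model weights.
    """
    return 0.0


def prepare_incoming_photo(path: str) -> tuple[Image.Image, Image.Image | None]:
    """Return (image_to_show, hidden_original).

    If the classifier flags likely nudity, the recipient is first
    shown a heavily blurred copy; the original is kept alongside so
    a deliberate tap can still reveal it, mirroring the behaviour
    the article describes.
    """
    original = Image.open(path)
    if classify_nudity(original) >= NUDITY_THRESHOLD:
        # Heavy Gaussian blur hides detail until the user opts in.
        blurred = original.filter(ImageFilter.GaussianBlur(radius=40))
        return blurred, original
    return original, None
```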

Instagram requires users to be at least 13 years old to create an account, a minimum age that has already drawn criticism from experts and the public.

In a statement, Meta said Instagram DMs are "overwhelmingly" used harmlessly to send messages and photos to friends and family.

However, sending an unsolicited photo of genitalia constitutes “intimate image abuse”.

Furthermore, when "sextortion scammers" obtain private photos, they can threaten to post them publicly unless the victim pays a ransom.