Instagram Defends New Features to Protect Teens from Sextortion
Instagram, owned by Meta, has introduced new features aimed at protecting teenagers from sextortion attempts on the platform. These include blocking screenshots and screen recordings of disappearing images and videos, a measure intended to stop scammers from coercing teens into sending intimate images.
While the NSPCC has praised these moves as a “step in the right direction,” some critics, including former Meta employee turned whistleblower Arturo Béjar, believe there are simpler ways Instagram could protect young people from unwanted contact. Béjar suggested that allowing teens to flag accounts that pretend to be teenagers would do more to prevent sextortion.
Meta, however, has stated that its tools, developed with user feedback, give teens clear and straightforward ways to report inappropriate behaviour or harassment. The company also offers mechanisms for flagging unwanted nude images and says it prioritises such reports. Despite this, questions remain as to why similar protections are not being rolled out across all Meta products, including WhatsApp, where grooming and sextortion also occur.
Sextortion, a form of blackmail where scammers trick individuals into sending sexually explicit material before threatening to expose them, has become a prevalent issue on social media platforms. Law enforcement agencies worldwide have reported a rise in sextortion scams, with a significant number targeting teenage boys.
The introduction of these new safety features on Instagram is part of Meta’s ongoing efforts to protect teens online. Antigone Davis, Meta’s head of global safety, emphasised that built-in protections are in place to safeguard teens and their parents against sextortion attempts. Critics such as Béjar, however, argue that the measures are not foolproof, as scammers can find ways to circumvent them.
As the debate continues over how best to protect young users from sextortion, responsibility for online safety remains a pressing question for both social media platforms and regulators. With the Online Safety Act coming into force, the pressure is on companies like Meta to prioritise the safety and well-being of their users, especially vulnerable teenagers.