Instagram's Meta to Alert Parents If Teens Search Self-Harm Content, Drawing Criticism

Kathmandu. Meta, the parent company of Instagram, has announced a new feature aimed at protecting teenagers: if a teenager repeatedly searches for terms related to suicide or self-harm on Instagram, their parents will immediately receive an alert on their mobile phones.

This is the first time Meta will directly inform parents based on a user's search history. The feature rolls out next week in the UK, the US, Australia, and Canada, and will later expand worldwide.

Previously, Instagram blocked results for such searches and displayed only links to support organizations. Meta says it now wants to involve parents actively: alongside the alerts, parents will receive expert advice and resources on how to have sensitive conversations with their children.

Suicide prevention organizations have strongly criticized Meta's announcement. The Molly Rose Foundation, established in memory of 14-year-old Molly Russell, who died in 2017 after viewing harmful content on Instagram, warned that the step could do more harm than good. According to Andy Burrows, the foundation's CEO, suddenly sending such a serious message to unprepared parents could cause them to panic and risk further damaging their relationship with their children.

Molly Russell's father, Ian Russell, also expressed his disagreement. He said, "Imagine a parent working at the office receiving a message saying, 'Your child is thinking of ending their life.' Even if Meta provides support materials, no one can use them correctly in that moment of panic."

Attempt to Shift Blame onto Parents

Various charities have accused Meta of shifting responsibility onto parents instead of fixing the harmful algorithms on its platform. Ged Flynn, head of the suicide prevention charity Papyrus, said the core problem is children getting trapped in dangerous corners of the internet, and that Meta is not addressing it: parents would rather such harmful content never reached their children in the first place than receive an alert after their children have searched for it.

According to experts, Instagram's algorithm still recommends content related to depression and self-harm to teenagers. They are therefore demanding that Meta redesign its system to be safe for children according to their age, rather than relying on after-the-fact alerts.

Meta, however, rejected these criticisms, saying it is trying to empower parents and keep teenagers safe.
