WhatsApp criticised for lowering age limit after children added to harmful groups

Children as young as 13 can now access the app.

April 12th 2024.

Meta is a big player in the tech world, owning popular platforms such as Facebook, WhatsApp, Instagram, and Threads. Recently, however, WhatsApp has come under scrutiny for lowering its minimum age requirement. The change drew attention after an investigation revealed that children as young as nine years old were being added to malicious groups on the app. These groups promoted self-harm and contained explicit content, including sexual violence and racism.

The BBC found that even children in primary school had been exposed to this harmful content on WhatsApp. In response, Northumbria Police sent a warning to thousands of parents in Tyneside about the dangers of these groups. One concerned parent, Mandy (whose name has been changed to protect her family's privacy), shared her experience of her 12-year-old daughter being exposed to sexual images, racism, and swearing on the app. Mandy was shocked and disturbed by what her child had seen, stating that "no child should be seeing" such things. She immediately removed her daughter from the group, but she worries that the damage may have already been done.

In response to these alarming findings, Meta, the parent company of WhatsApp, stated that all users have the option to control who can add them to groups. However, this statement did little to ease the concerns of parents like Mandy. The dangers of online content have been highlighted before, with the tragic case of 14-year-old Molly Russell, who took her own life while struggling with depression. An investigation found that Molly had followed over 40 Instagram accounts related to mental health, including those promoting self-harm and suicide. Campaigners have accused Meta of prioritizing profits over the safety of children by lowering the minimum age requirement for WhatsApp use from 16 to 13 in the UK and Europe.

Daisy Greenwell, co-founder of the campaign group Smartphone Free Childhood, expressed her disappointment with Meta's decision, calling it "tone deaf." She believes that WhatsApp is often seen as a "gateway drug" to other social media platforms and that lowering the minimum age will only make it easier for children to access harmful content. Greenwell also pointed out that WhatsApp is not a risk-free platform: it is often the first place where children are exposed to extreme content, and bullying is prevalent on it.

Conservative MP Vicky Ford, a member of the education select committee, also criticized Meta's decision, stating that it was "highly irresponsible" to reduce the minimum age without consulting parents. Prime Minister Rishi Sunak echoed these concerns, stating that the Online Safety Act would give regulators the power to ensure that social media companies are protecting children from harmful material. He emphasized the importance of keeping children safe online, just as we would want them to be safe when playing outside.

In response to the backlash, WhatsApp defended its decision to lower the minimum age, stating that it brings the platform in line with the majority of countries and that protections are in place. Meta also announced new safety features aimed at protecting young people from "sextortion" and intimate image abuse. One of these is a filter in Instagram's direct messages, called Nudity Protection, which will automatically blur images that contain nudity. In addition, users will receive a message urging them not to feel pressured to respond to explicit images, and they will have the option to block and report the sender. These measures are a step in the right direction, but only time will tell whether they are enough to keep young people safe on social media platforms.

[This article has been generated with AI.]
