New measures will come into force to protect users of “teenage accounts” on Instagram and increase parental control, according to a statement from American technology company Meta.
If young people on the platform are found to repeatedly search for content related to suicide or self-harm within a certain period, their parents will be alerted.
Parents using Instagram's parental control program will receive an alert via email, SMS, WhatsApp or in-app notification stating that “their children may need support.”
Parents will also have access to expert resources to help them talk to their children about sensitive topics.
The new feature, launching in the US, UK, Australia and Canada in the coming weeks, will roll out to other countries later this year.
The move comes as Meta faces multiple lawsuits in the US alleging that its platforms cause harm to children.