Meta states that it cares about the privacy of its younger users and, as of yesterday, has been applying more restrictive default settings to new minors who sign up for Facebook, while encouraging them to pay closer attention to the growing range of abusive behavior they may encounter.
In a post on the official blog (in SOURCE), the company recalls that protections were already introduced last year to keep teens from interacting with potentially suspicious adults. For example, adults were barred from messaging unrelated minors, and minors no longer appear among the suggested contacts shown to such adults.
But Meta does not intend to stop at these measures: it is also testing the removal of the message button on teen Instagram accounts when they are viewed by adults deemed suspicious, i.e. adults recently blocked by another minor. Facebook, meanwhile, is making extensive use of notifications to encourage greater vigilance and use of the available tools, for example prompting teens to report an account after blocking it.
In addition to applying stricter protection defaults to their profiles, Facebook reminds registered minors to pay closer attention to settings such as:
- Who can see their friends list
- Who can see the people, Pages, and lists they follow
- Who can see posts they are tagged in on their profile
- Reviewing posts they are tagged in before those posts appear on their profile
- Who is allowed to comment on their public posts
Finally, Facebook explained the direction it is taking on stopping the publication of sexual images of minors, especially when such images are used to exploit or blackmail them, a practice known in the United States as “sextortion”.
The non-consensual sharing of intimate images can be deeply painful, and Meta wants to do everything possible to discourage teens from sharing such content on its social networks. To that end, it has partnered with the National Center for Missing & Exploited Children (NCMEC) to build a global platform for minors worried that intimate images of them may be shared without their consent, with the aim of preventing the further spread of the phenomenon. Facebook worked closely with NCMEC, experts, academics, parents, and victim advocates around the world to develop the platform and ensure it meets teens' needs, so they can regain control of their content in these terrible situations.
Meta is also collaborating with Thorn and its NoFiltr brand to create educational materials that reduce the shame surrounding these experiences, encouraging teens to seek help rather than endure traumatic blackmail in silence.