On Nov 21, Meta announced new updates designed to protect young people from the dangers they may face on Facebook and Instagram.

Meta Updates

According to the announcement, the updates also “seek to create safe, age-appropriate experiences for teens” on both platforms. The first update limits unwanted interactions. Initially introduced on Instagram, it gives people under the age of sixteen private accounts by default as soon as they join the platforms, as a safety measure.

Those who already had accounts before the update receive notifications encouraging them to set their accounts to private, along with step-by-step directions for doing so.

According to Meta, this update came about as a result of a poll it conducted, which indicated that an overwhelming majority of young people felt safer with, and preferred, private accounts over public ones.

Meta also stated that it has built technology to detect suspicious accounts (in this case, accounts owned by adults that may have been blocked or reported by a young person) and restrict such accounts from interacting with young people.

The update was also set to restrict advertisers from targeting young people based on their interests or activities on other apps.

Meta has now announced that these measures are being introduced on Facebook as well.

In addition, Meta announced that it is introducing new measures to prevent teens from messaging adults it has deemed suspicious, and that such suspicious accounts will no longer appear in the “People You May Know” suggestions shown to teens.

Meta also announced that it is testing removing the message button on teen accounts when those accounts are viewed by suspicious adult accounts.

The second update encourages teens to use the safety tools available to them. For example, teens who block an account are prompted to report it, and the platform then sends them “safety notices with information on how to navigate inappropriate messages from adults”.

In addition, Meta is introducing new tools to stop the spread of teens’ “intimate” images, which could otherwise be used to exploit them (a practice known as sextortion).

Meta has also stated that it is working with the National Center for Missing and Exploited Children (NCMEC) to protect teens who are worried that their intimate images might be shared publicly, and with Thorn (through its NoFiltr brand) to provide educational materials to those who have been subjected to sextortion, aiming to “empower them to take back control”.

Check out more Tech News here.
