" "

Meta introduces stricter content controls for teens on Instagram, Facebook amid mounting concerns

Meta has announced stricter content restrictions for teenagers on Instagram and Facebook, amid growing concern about the potential harm its platforms can cause young users.

The move follows allegations from numerous US states accusing Meta of jeopardizing the mental well-being of children and teenagers and of deceptive practices regarding platform safety.

Mark Zuckerberg’s company said the stricter measures will prevent teens from seeing certain types of content, even when it is shared by friends or accounts they follow. The restricted content covers discussions of suicide or self-harm, nudity, and references to restricted goods; on Instagram, restricted goods include tobacco, weapons, alcohol, contraception, cosmetic procedures, and weight-loss programs.

Furthermore, Meta has switched teen accounts on Instagram and Facebook to the most restrictive default content settings. The policy, which initially applied only to new users, now extends to existing ones, making it harder to come across potentially sensitive content or accounts, particularly in Search and Explore.

To address concerns around suicide and self-harm, Meta has also expanded the range of search terms for which results are hidden. The company’s actions come in the wake of leaked internal research, including reporting by the Wall Street Journal and disclosures from whistle-blower Frances Haugen, showing that Meta has long been aware of the adverse impact its platforms have on the mental health of young people.

For the purposes of these measures, teenagers on the platforms are defined as individuals under the age of eighteen, based on the date of birth provided during the signup process.
