Facebook and Instagram will tighten restrictions around the data available for companies to target ads to teen users, the platforms’ parent company Meta said.
Starting in February, advertisers will no longer be able to see a user’s gender or the type of posts they have engaged with as a way to target them. Under the enhanced restrictions, only the user’s age and location will be used to serve ads, Meta said.
The social media company also confirmed that new controls would be introduced in March that would allow teens to access settings on both apps and choose to “see less” of certain types of ads.
Many online safety advocates say social media platforms need to do more to control the types of advertising shown to younger users, arguing that inappropriate ads can cause just as much harm as offensive or abusive content posted by others.
Meta previously added restrictions that prevent advertisers from targeting teens with ads based on their interests and activities, and the company said the latest updates came in response to research on the topic, direct feedback from experts and global regulation.
“As part of our ongoing work to keep our apps age-appropriate for teens, we’re making more changes to their ad experiences,” Meta said in a blog post.
“We recognize that teens are not necessarily as equipped as adults to make decisions about how their online data is used for advertising, particularly when it comes to showing them products that are available to buy.
“For that reason, we are further restricting the choices advertisers have to reach teens, as well as the information we use to serve ads to teens.”
This is not the first time that Meta has been forced to examine its impact on its adolescent users. In 2020, Irish regulators launched a two-year investigation into whether Instagram exposed the contact information of its underage users by allowing them to post their phone numbers and email addresses when they switched to a business account. In September 2022, Meta was fined 405 million euros ($492m) for violating the General Data Protection Regulation.
Facebook whistleblower Frances Haugen first revealed to the Wall Street Journal in September 2021 that the company had conducted internal research showing that its photo-sharing app, Instagram, had a damaging impact on the mental health of adolescent girls.
Responding to the article in a blog post, Instagram’s head of public policy, Karina Newton, said the company took the findings seriously, but argued that social media isn’t “inherently good or bad for people.”
“Many find it useful one day and problematic the next. What seems more important is how people use social networks and their state of mind when they use them.”