
Meta has announced a significant global rollout of its Teen Accounts for Facebook and Messenger, extending beyond its initial launch in the U.S., U.K., Australia, and Canada. This initiative aims to provide enhanced safety features and parental controls for younger users, building on the framework first introduced on Instagram last fall. The introduction of Teen Accounts comes in the wake of intense scrutiny from U.S. lawmakers regarding the protection of minors on social media platforms.

With this expansion, teenagers will automatically enter an experience tailored to minimize exposure to inappropriate content and unsolicited interactions. Notably, users under 16 will require parental approval to modify any settings. To further safeguard their experience, teens can only receive messages from accounts they follow or have previously engaged with. Their friends are the only ones able to view and respond to their stories, while interactions like tags, mentions, and comments will be restricted to their connections. Additionally, Meta will prompt teens to log off after one hour of usage daily, and an automatic “Quiet mode” feature will be activated overnight.

Despite these measures, recent research from a former Meta employee revealed ongoing risks for children and teens on Instagram, indicating that harmful content, including posts related to self-harm and sexual exploitation, remains accessible. Meta countered these findings, asserting that its protective measures have effectively reduced harmful content visibility for young users.

In conjunction with the Teen Accounts rollout, Meta introduced its School Partnership Program, aimed at enabling educators to report safety issues, such as bullying, directly to Instagram for expedited review and action. The program, which received positive feedback during its pilot phase, is now available to all middle and high schools across the U.S., providing them with enhanced reporting capabilities and educational resources.
Participating institutions will display a banner on their Instagram profiles to inform parents and students of their official partnership with the platform. This latest development underscores Meta’s commitment to addressing mental health concerns associated with social media use among adolescents, a topic that has garnered attention from the U.S. Surgeon General and various state governments, with some proposing restrictions on teen social media access without parental consent.