A handbook for parental controls on social media
In a series of congressional hearings, executives from Facebook (FB), TikTok, Snapchat, and Instagram were grilled by lawmakers over how their platforms could expose young users to harmful content, damage mental health and body image (especially among teenage girls), and how they lacked adequate parental controls and safeguards to protect teenagers.
The companies promised to make changes after the hearings, which followed whistleblower Frances Haugen's revelations about Instagram's effect on teenagers in what became known as the 'Facebook Papers.' Since then, the four social networks have added more tools and parental control options in an effort to better safeguard younger users.
Some have also modified their algorithms, such as defaulting younger users to less sensitive content, and stepped up their moderation efforts. But according to some legislators, social media specialists, and psychologists, the new remedies are still insufficient, and more needs to be done.
Sen. Richard Blumenthal, who chairs the Senate's subcommittee on consumer protection, told CNN Business that social media giants have taken only 'little, incremental moves' to clean up their acts more than a year after the Facebook Papers publicly exposed Big Tech's wrongdoing.
Social media companies are 'providing very little of substance to offset the ills their networks incur,' according to Michela Menting, director of digital security at market research firm ABI Research. She said the companies' solutions put the onus on parents to turn on various parental controls, including those designed to filter, block, and restrict access, as well as more passive options, such as monitoring and surveillance programmes that run in the background.
Following the backlash over the leaked documents, Instagram, which is owned by Meta, put on hold its much-criticised plan to launch a version of the app for children under 13 and focused instead on making its main service safer for young users.
On Instagram, parents can also see and be notified when their teen changes their privacy or account settings, and receive updates on the accounts their teens follow and the accounts that follow them. Parents can also view the accounts that their teens have blocked.
The Facebook Safety Center offers supervision resources and tools, including articles and advice from trusted experts. Liza Crenshaw, a Meta spokesperson, told CNN Business that the goal of Family Center is to eventually allow parents and guardians to help their teens manage experiences across Meta technologies from a single place.
The hub also provides a guide to Meta (FBVR)'s parental control tools from ConnectSafely, a nonprofit that works to keep children safe online, to help parents talk to their teenagers about virtual reality. Guardians can view the accounts their children have blocked, use monitoring tools, and either approve the download or purchase of an app that is blocked by default because of its rating or block specific apps that may be problematic.
In August, Snapchat (SNAP) released a parent portal and guide to give parents more insight into how their teens use the app, including who they have been chatting with recently (without revealing the content of those conversations). To use the feature, parents must sign up for Snapchat themselves, and teens must consent and opt in.
Although this was Snapchat's first formal entry into parental controls, the app already had a few safeguards in place for its younger users, such as requiring teens to be mutual friends before they can start communicating and prohibiting them from having public accounts.
In July, TikTok introduced additional filters to screen out mature or 'possibly harmful' videos. The new safeguards assign a 'maturity score' to videos suspected of containing mature or complex themes.
It also released a tool to help users decide how much time they want to spend watching TikToks. The tool lets users schedule regular screen breaks and includes a dashboard showing how often they opened the app, a breakdown of daytime and night-time usage, and more.
The popular short-form video app also offers a Family Pairing hub, where parents and teens can personalise their safety settings. A parent can link their TikTok account to their teen's and set parental controls, including how much time can be spent on the app each day, what content can be viewed, whether teens can search for videos, hashtags, or Live content, and whether their account is private or public. TikTok also offers a Guardian's Guide that explains how parents can best protect their children on the platform.