There’s been a huge movement towards making social media safer for young people. Most of the major platforms took steps to refocus on safety in 2021, and that matters if your brand’s social media marketing targets younger audiences.
I’ve always been interested in this area – I’ve advised the UK government on child safety in the past and am still involved with various safety bodies – so I follow closely what the platforms are doing, and why.
As any parent will tell you, it’s hard to stop under-13s accessing social media, and the platforms know this is a problem. TikTok removed more than seven million accounts belonging to users under 13 in the first quarter of 2021 alone. As well as restricting access for younger users, the platforms have focused on creating safer, more private spaces for teens under 18.
Here’s what brands should know.
Privacy is becoming the default for teens’ accounts
Social media platforms such as TikTok and Instagram led the charge here, making changes that mean when a teen sets up an account, it’s private by default. The platforms still allow teens to make their accounts public after setup, but they have to make an active choice to change the setting.
Instagram has made this change for anyone under 16 (or 18 in some countries). TikTok has not only made accounts private by default for users under 16 – it’s turned direct messages off by default too.
Previous research by Instagram found that eight out of ten “young people” opted for a private account when prompted during setup. So, it will be interesting to see how many teens now proactively decide to go public.
Teens have more control over who can view their content
Platforms like Google and TikTok are making changes to give users more control over who sees their content.
Content from YouTube creators aged 13 to 17 is now set to the most private option by default, visible only to the people they choose – again, unless the creator changes the setting to make it public. Google also announced changes that allow users under 18 (or their parents) to request the removal of their images from Google search results. And TikTok now lets under-16s choose who can view each video (friends, followers or just themselves) before posting it.
These changes are fantastic for teens who want to use social apps to chat and share content with friends, rather than chasing trending posts and dealing with trolls or bullies. It’s not a silver bullet, but it’s a good start.
Social media platforms are focusing on digital wellness by prompting teens to use their apps less
It seems strange that social apps – which, let’s be honest, do everything they can to keep people engaged – are now actively looking for ways to reduce the time spent on their platforms, but they are (at least for teen users).
In September, Instagram announced its ‘take a break’ feature, which reminds teens to take some time away from the app. YouTube made ‘break and bedtime’ reminders a default feature for users aged 13 to 17, and turned autoplay off by default for teens (I’m sure many of us have been sucked into a YouTube autoplay wormhole!).
The impact of social media on mental health was well documented in reports revealing that Instagram knew about the negative effects its app could have on teenagers, so these changes have to be a move in the right direction.
Restricting advertising to children and teens on social media platforms
Back in July, Facebook (now Meta) announced changes to how it handles advertising to under-18s. Advertisers can still target ads to teens based on age, gender and location (across Facebook, Messenger and Instagram), but they can no longer use teens’ activity on other apps and websites to target them.
Google is taking things a step further, blocking ad targeting based on ‘age, gender or interests’ for anyone under 18. It also said that it would remove ‘overly commercial content’ from YouTube Kids.
Creating a safer space by removing unwanted adult content and contacts
Early in 2021, Instagram started rolling out measures to protect under-18s from adults on the app. These prevent adults from sending direct messages to teens who don’t follow them, and alert teens when an adult they’ve been chatting with has shown a pattern of contacting other under-18s (allowing them to restrict that person’s access to their account).
Google also introduced a range of additional safety measures, including turning SafeSearch on by default for under-18s setting up new accounts and enabling it for existing users under 18 (under-13s already had SafeSearch enabled).
None of this will solve all the problems that young people can face on social media. But – after many years of campaigning for better safety – I’m delighted that safety is part of the conversation and that social media platforms are taking it seriously.
And for brands, it’s important to know that the platforms you use for social media marketing or advertising are behaving responsibly and ethically.