TikTok to alert parents when teens post public videos

TikTok will now alert parents when their teenager posts a public video, as part of new safety tools launched this week.

Just days after the UK introduced sweeping new internet safety rules, the social media company released more protections for young users.

One new feature alerts parents, via the platform’s ‘family pairing’ setting, when their teen makes a public post.

Parents can also get notified when their child reports a video, although they won’t be able to see the reported content.

Other new tools include giving creators more control over the comments they see on their posts and how their fans can interact with them.

TikTok has had parental control options for its teen accounts for more than five years, and other social media apps like Instagram also have the option for parents and children to link their accounts.

How to turn on ‘family pairing’ on TikTok

To turn on ‘family pairing’ on TikTok, whether as a teenager or a parent, both people need their own TikTok accounts, and the account you are pairing with has to agree to the link.

In the TikTok app, tap Profile at the bottom – the icon looks like a little person.

Tap the Menu button at the top, which looks like three stacked horizontal lines.

Tap the Settings and Privacy option, which will take you to a longer menu.

Scroll down and tap Family Pairing, then tap Continue.

Select either Parent or Teen, depending on who you are.

If you’re a parent, your teen will need to scan the QR code with their app to link the accounts.

If you’re a teen, you will need to scan a code generated by your parent’s account after they’ve followed these steps.

Instagram offers similar parental supervision settings.

Ali Law, TikTok’s UK director of public policy, told Sky News the platform views online safety as “a race that is never won”.

Speaking exclusively to Sky News, Mr Law added that the company will continue to update protections for teen accounts.

“We know from research we conducted with Internet Matters that when teens feel in control of their online behaviours and habits, it plays a positive role in their well-being,” he said, “but they also value their parents helping them to check and understand privacy and safety settings.”

The new features are being introduced as discussions around internet safety heat up in the UK.

Since Ofcom began enforcing new age checks and platform settings last Friday, more than 400,000 people have called on the government to repeal its Online Safety Act.

Asked whether he thought the Online Safety Act had gone too far in its restrictions, Mr Law said TikTok “already prohibit[s] the vast majority of content classified as harmful to children under the Online Safety Act for all users.

“Unlike most other large platforms, TikTok has been regulated by Ofcom since 2020, and we have therefore approached Online Safety Act compliance with a robust set of safeguards for our community, and we welcome that the legislation sets a level playing field for all platforms.”

Image: Young people gathered at the Warrington Youth Zone to talk to Sky News about online safety

When Sky News interviewed a group of teenagers last week, they said they regularly saw harmful content on platforms like TikTok.

“There’s some things that are so accessible on social media like TikTok that really shouldn’t be easy to find, in my opinion,” said 15-year-old Freya.

Robin, 15, talking about adult sexual content, said: “I obviously don’t go actively searching for it, but it is easy to stumble across, especially if it’s on a platform with short-form content like TikTok where you just flick, flick, flick.”

Sky News put these teenagers’ comments to TikTok, and Mr Law responded: “The content you describe has no place on our platform.”

“We have tens of thousands of safety professionals who work alongside technology to remove this content at scale and at speed, and 98% of violative content found is removed before it is reported to us and we restrict access to age-inappropriate material.”
