How changes to children’s privacy online will affect websites

Mark Tomkins

Concern for children’s privacy online is driving sweeping changes across the internet – and your website or social media presence could be affected.

The UK Information Commissioner’s Office (ICO) has published an Age Appropriate Design Code that could affect the design of online services aimed at children from autumn 2021. This follows on the heels of new US rules with similar aims.

The issue centres on growing concerns about data privacy for children (mostly the under-13s). Elizabeth Denham, the UK Information Commissioner, said: “One in five internet users in the UK is a child, but they are using an internet that was not designed for them. There are laws to protect children in the real world – film ratings, car seats, age restrictions on drinking and smoking. We need our laws to protect children in the digital world too.”

The code has its basis in GDPR, but goes further, setting out standards for apps, connected toys, social media platforms, online games, educational websites and streaming services. It will require digital services “likely to be accessed by children” to automatically provide children with a baseline of data protection whenever they visit a website, download an app or play a game. It first needs parliamentary approval, but once granted it will be rolled out with a short implementation period.

In practical terms, this means privacy settings will need to be high by default and location settings and data collection/sharing will need to be minimised. Profiling that could allow children to be targeted by advertisers will also need to be switched off by default.

It’s interesting to see how the new regulations might play out on social media. YouTube, for example, has recently made policy changes to comply with COPPA, the US children’s online privacy law enforced by the Federal Trade Commission. As part of this, videos aimed at children under 13 will no longer show personalised advertising. But YouTubers who aren’t intentionally targeting children could also be affected – and there doesn’t seem to be a way to appeal.

The main issue seems to be that YouTube’s guidelines on what counts as a ‘child’s video’ are a bit vague. Even if you’re not based in the US and your videos contain nothing particularly ‘adult’ (ours, for example, are instructional guides to WordPress), you are obliged to edit each of your channel’s videos and mark them as ‘made for kids’, ‘not made for kids’ or restricted to older viewers. YouTube Help says: “Regardless of your location, we require you to tell us whether or not your videos are made for kids.” Confusingly, family-friendly content doesn’t necessarily fit into the ‘made for kids’ category. If you don’t do this, YouTube warns “there may be consequences on YouTube. Additionally there may be legal consequences under COPPA or other applicable local laws.”
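For channels with a large back catalogue, ticking the audience box on every video by hand is tedious. As a rough sketch (not an official workflow), the same setting is exposed in the YouTube Data API v3 as status.selfDeclaredMadeForKids, so it can be changed programmatically. The Python snippet below assumes the google-api-python-client library and an already-authorised OAuth credentials object for the channel; the video IDs are placeholders.

```python
# Sketch: bulk-mark a channel's videos as 'not made for kids' via the
# YouTube Data API v3. Assumes `credentials` is an authorised OAuth2
# object with a YouTube scope (obtaining it is omitted here).
from googleapiclient.discovery import build

def mark_not_made_for_kids(credentials, video_ids):
    youtube = build("youtube", "v3", credentials=credentials)
    for video_id in video_ids:
        # Fetch the current status first: videos.update replaces every
        # mutable field in the parts you send, so we must not drop
        # privacyStatus and friends by accident.
        response = youtube.videos().list(part="status", id=video_id).execute()
        status = response["items"][0]["status"]
        status["selfDeclaredMadeForKids"] = False  # the audience flag
        youtube.videos().update(
            part="status",
            body={"id": video_id, "status": status},
        ).execute()

# Hypothetical usage:
# mark_not_made_for_kids(credentials, ["VIDEO_ID_1", "VIDEO_ID_2"])
```

Whether a flag set this way satisfies the policy in every case is worth checking against YouTube’s own documentation – the classification itself still has to be accurate.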

Alarmingly, “consequences on YouTube” could mean channel deletion – if the recent treatment of publishers of cryptocurrency-related content is anything to go by. Just before Christmas, many crypto YouTubers found their accounts shut down or videos deleted, with reasons given such as ‘harmful or dangerous goods’ or ‘sale of regulated goods’. Even some of the most basic videos explaining the topic were caught up in this.

Twitter was alight with backlash from this large community, especially since publishers and channel owners couldn’t even log in to download their videos and move them elsewhere. It took YouTube several days to respond, when it blamed an algorithm change that had mistakenly struck the channels and promised to reinstate them. The crypto community reported that more than 50% of channels were still offline over a week later. This could severely affect revenue for some, because livestreaming can be affected for up to 90 days. It’s not the first time that Google (which owns YouTube) has restricted the reach of crypto-related posts, but it has never been clear how such content contravenes its rules.

On the one hand, it’s great to see that safeguards for children are being introduced. On the other hand, there’s a danger that these new rules will be enforced in a lazy way, resulting in problems for people with content in grey areas (gaming broadcasters, for example). At worst, they could be used as an excuse to censor content that isn’t to the liking of social media platforms. And who is policing that?