‘Instagram for kids’ – Will it work, or is this something we need to avoid?
Plans for an ‘Instagram for under-13s’ have been criticised as ‘dangerous for children’s health and privacy’ in news reports this week. Online privacy expert Dr Garfield Benjamin responds…
A children’s version of Instagram is unlikely to be a good idea – Facebook has shown time and time again that it is not capable of, or willing to, truly address the problems of operating a social media platform. Allowing them to define our children’s digital lives as well is something we need to avoid.
But we still need to counter the problems caused by children accessing existing platforms before the right age. To my mind, this is best addressed in the same way as broader issues with social media, though that will require more redesigning and regulating than Facebook would like.
The letter from the Campaign for a Commercial-Free Childhood rightly identifies that making an ‘Instagram for kids’ will just end up shifting the target age of social media downwards. Children already on the main Instagram platform aren’t going to downgrade to a kids’ version, so the target will inevitably be even younger children. Magazines work the same way: they are often read by children younger than the stated target audience – it’s part of the appeal (how many 17-year-olds actually read Just 17? It was clearly aimed younger). So a kids’ Instagram, even if aimed at children aged 10+, is unlikely to solve the problem of age-inappropriate access.
But the letter’s rather wholesale rejection of social media as inherently bad for children ignores research showing that the problem lies in the type of platforms and interactions, rather than simply in how much they are used.
Responses to the issue of children on social media are tricky, though. People often reach for measures like ID verification. While this helps with some issues enabled by anonymity – cyber-bullying being an important one – removing anonymity is far from a blanket fix, and it is a dangerous path to go down.
There are many groups for whom linking their social media to their real identity can be dangerous – LGBTQ+ people, for example. This includes children and young people using anonymity to figure out who they are in safe spaces and supportive communities, without having to reveal this facet of their identity to the rest of their family, friends, school or other contacts.
LEGO have been running a ‘kid-safe by design’ platform for a few years now. It’s aimed at ages 6–12 and requires ID verification from the parents (notably, not the children themselves). This is an interesting compromise, as it places the burden of consent on responsible adults, hopefully inspiring greater parental engagement and support for children online.
The platform itself removes many of the avenues for abuse found on most social media: you aren’t allowed to post pictures of your face, people are encouraged to comment in positive ways with a curated set of LEGO faces, it’s heavily moderated, a ‘Captain Safety’ character gives guidance along the way, and there are no ads or in-app purchases.
Unfortunately, the LEGO platform doesn’t seem to have taken off. This could be because the LEGO framing limits its appeal, because the age band is too wide, or because the desire to join “real” social media is too strong.
But it is important to develop safe ways for children to establish themselves online. We can’t just wait until they are 13 and then cast them out into the lawless wastelands of the unregulated internet. It requires parents and schools to help children develop a wide range of social, digital and critical skills, as well as the tools and platforms to nurture this development.
LEGO’s platform, along with initiatives like the ICO’s Age Appropriate Design Code, offers models for better ways of running social media. We would probably be better off introducing many of these measures across all social media. This may mean upending current advertising and data-collection models – personalised advertising has been shown to be less effective than contextual advertising alongside good-quality content (in other words, a good ad placed next to something relevant is more effective than the same ad following you around the internet).
Regulation of online platforms continues to fall through the gaps between different regulators focused on different problems. We need a more comprehensive set of regulations, including wider access to critical digital skills and enforceable design practices, in order to redesign social media so that it is safer not just for children, but for everyone.