Here is why Instagram is making age verification mandatory
It must have seemed like a great idea at the time: Instagram Kids – a special version of Instagram for children. Surely this would be the perfect way for parent company Meta to silence critics concerned about under-13s accessing age-inappropriate content on the social media platform.
Of course, it wasn’t. Once the world heard about the development of ‘Instagram Kids’, Meta faced a barrage of criticism for encouraging children to use social media. And the Wall Street Journal discovered documents demonstrating that Meta knows Instagram can be harmful to teenage girls’ mental health. Instagram Kids was then ‘paused’, according to Adam Mosseri, head of Instagram. Instead, the platform has refocused on what many campaigners said was the main issue all along – restricting under-13s from using Instagram at all.
Although Instagram always made it clear that you must be 13 or over to have an account, everyone knew that didn't stop under-13s from accessing the platform. Before 2019, it didn't even ask for a birthdate. From 2019 it started asking, but the dates were never verified, making it easy to lie.
In fact, a 2021 report found that 40% of under-13s surveyed used Instagram every day. And 26% of children surveyed said they had encountered potentially harmful experiences on Instagram – which, along with Snapchat, was also where the report found the most sexually explicit interactions between children and adults.
Following the pause of Instagram Kids, Instagram announced that it would make age verification mandatory and started asking new signups for their birthday. It also rolled out AI that looks for age signals elsewhere, such as friends wishing you a happy birthday, or the age you have shared on Facebook and other linked apps.
Now, if you haven’t provided that information, you’ll be locked out of your account until you do. If the AI makes a mistake and thinks you’re not old enough when you are, you’ll need to submit valid photo ID that clearly shows your face and date of birth to get your account back.
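To picture how that kind of check might fit together, here is a minimal sketch in Python. It is purely illustrative: the signal names, the three-year tolerance and the check_account function are assumptions for the example, not a description of Instagram's actual system.

```python
# Hypothetical sketch of an age-check flow like the one described above.
# Signal names and thresholds are illustrative assumptions only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AccountSignals:
    stated_birth_year: Optional[int] = None        # birthday the user entered at signup, if any
    linked_app_age: Optional[int] = None           # age shared on Facebook or another linked app
    inferred_age_from_posts: Optional[int] = None  # e.g. friends' "happy 21st!" messages

def check_account(signals: AccountSignals, current_year: int = 2022) -> str:
    """Return an action: 'allow', 'lock_until_birthday_provided', or 'request_photo_id'."""
    if signals.stated_birth_year is None:
        # No birthday on file: the account stays locked until one is provided.
        return "lock_until_birthday_provided"

    stated_age = current_year - signals.stated_birth_year
    inferred = [a for a in (signals.linked_app_age, signals.inferred_age_from_posts)
                if a is not None]

    # If other signals clearly conflict with the stated age, or the stated age
    # is under 13, fall back to asking for photo ID.
    if any(abs(a - stated_age) > 3 for a in inferred) or stated_age < 13:
        return "request_photo_id"
    return "allow"

# Example: a user who claims to be 20 but whose linked app says 12.
print(check_account(AccountSignals(stated_birth_year=2002, linked_app_age=12)))  # request_photo_id
```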
In the UK, Instagram didn't have a choice, and neither does any other online service that operates in the UK or processes the data of children in the UK. The Age Appropriate Design Code (AADC), which came into force in September 2021, contains 15 standards that online services need to follow. Companies found to be flouting the code can be fined up to 4% of their global annual turnover.
And the UK’s not alone: regulation in this area is getting tighter around the world. TikTok and YouTube have also recently introduced new privacy and wellbeing restrictions for kids, such as stopping push notifications at night.
So, what will this mean for brands operating on Instagram? It’s certainly likely to make over-18 age information more accurate. Having more and better age information will make it easier for brands to target groups by age. That could mean an increased click-through rate for retailers and a better ROI for paid Instagram advertising. That could make Instagram a better opportunity for brands which have previously preferred Facebook for its bigger reach and lower prices.
However, that advantage only applies if the groups you're targeting are over 18. If you're looking to advertise to those under 18, it could get a lot more difficult, as the AADC has also forced Instagram to reduce the amount of information it gives advertisers about under-18s.
But the move could also enable retailers who deal in age-restricted goods to feel more confident about compliance and expand their social reach onto Instagram. Nobody wants to be the brand caught selling to under-age customers, with all the legal and reputational damage that causes. For brands serious about getting their products and services in front of an age-appropriate audience, social media platforms getting equally serious about age verification could prove to be a good thing.