
Instagram is expanding its age verification program to several more regions, as it looks to improve how it confirms user ages and limits younger users' exposure to potentially harmful content in the app.

Instagram age verification

As per Meta’s Andy Stone:

“Starting today, we’re beginning to expand our Instagram age verification test to Mexico, Canada, South Korea, Australia, Japan, and more countries in Europe. This builds on an expansion to India and Brazil we announced in October, with more countries coming in the next few months.”

Initially launched in the US last June, Instagram’s age verification process requires users to verify their stated age by using one of three options:

  • Upload their government ID
  • Record a video selfie
  • Ask mutual friends to verify their age

Instagram’s video selfie process uses video analysis from Yoti to estimate a person’s age in the clip.

Yoti’s process has proven highly accurate in testing, and that enhanced level of verification, combined with these alternative options, could be a big step toward ensuring that young users aren’t accessing potentially harmful elements of the app, or being targeted by advertisers with inappropriate promotions.

This is an important focus because, as various investigations have found, social media platforms, including Instagram, can be harmful for young users in a range of ways, while underage usage can also expose kids to predators and inappropriate content.

Such issues have been exacerbated over the last two years, with pandemic lockdowns forcing more kids online for entertainment and social connection. And with parents also working from home, it’s almost impossible for them to monitor what their kids are doing online all of the time.

Additional measures like this are a significant step, and while they won’t stop every youngster from cheating the system, the combined effort will limit kids’ capacity to lie their way into Meta’s apps.

The expansion, then, is another notable development, which could go a long way towards improving safety in the app.

It’s not a solution, but it’s another step – and it could be a significant one at that.