A few years ago, Chinese gaming giant Tencent began using facial recognition to prevent children from playing video games excessively. Now, the Entertainment Software Rating Board (ESRB), North America’s video game rating agency, appears to be pursuing a similar approach.
The ESRB, in collaboration with digital identity company Yoti and Epic Games-owned “youth digital media” company SuperAwesome, has submitted a proposal to the FTC seeking approval for a new “verifiable parental consent mechanism” called Privacy-Protective Facial Age Estimation.
The system involves the parent taking a selfie, assisted by an “auto face capture module.” The system then analyzes the image to confirm that it is indeed the face of an adult, who can subsequently grant the necessary permissions.
The entire verification process takes less than a second “on average,” and the images are permanently deleted once verification is complete.
The ESRB has issued a statement clarifying that the technology does not “confirm the identity of users” and is not designed to scan minors’ faces to determine their eligibility to purchase certain games. Instead, it uses images solely to estimate the subject’s age, ensuring compliance with COPPA privacy requirements.
“To be perfectly clear: Any images and data used for this process are never stored, used for AI training, used for marketing, or shared with anyone,” the ESRB emphasized in its statement.
“The only information communicated to the company requesting VPC is a simple ‘Yes’ or ‘No’ regarding whether the person is over the age of 25.”
“The upload of still images is not accepted, and photos that do not meet the required quality standards for age estimation are rejected,” the filing states. “These measures reduce the risk of circumvention and prevent children from using images of unaware adults.”
However, the risk of children outsmarting the system isn’t the only concern. Accuracy is a significant issue, particularly given that facial recognition technology has a history of racial bias.
A study in the US, for example, found that Asian and African American individuals were up to 100 times more likely to be incorrectly identified by facial recognition systems than white individuals.
Additionally, the idea of determining whether someone is 16 or 18 based on a single selfie seems like a gamble. The ESRB downplayed concerns about the system’s “fairness,” stating that “the difference in rejection rates between gender and skin tone is very small.”
The ESRB provided data suggesting that, among people aged 25 to 35, 15 out of 1,000 females and 7 out of 1,000 males might be incorrectly classified as under 25; anyone misclassified would have the option to verify their age through another method.
The variation by skin tone ranges from 8 out of 1,000 to 28 out of 1,000. While acknowledging that bias exists, the ESRB argued that it is not significant, especially when weighed against the benefits and the increased access the system offers certain groups of parents.
It’s important to note that this system is not intended to replace current methods. The ESRB presented its facial age verification plan as “an additional, optional verification method” particularly useful for those without photo ID.
In a statement to PC Gamer, Yoti also noted that the system does not recognize or identify individuals; it merely estimates the age based on the image it sees.
While this may seem reassuring, it doesn’t change the fact that this technology feels like a gross invasion of privacy. The idea of sharing my face with a digital system just so my hypothetical child can play a video game is unsettling.
Furthermore, relying on potentially flawed technology to enforce societal norms doesn’t seem like the best solution. And let’s be honest—does anyone really believe that a clever 16-year-old won’t figure out how to bypass this system within minutes?
The ESRB actually submitted its request to the FTC on June 2, but it has only come to public attention now (via GamesIndustry) as the FTC is seeking public comment on the proposal.