A new report published by child safety groups Heat Initiative and ParentsTogether Action details the alarming presence of inappropriate apps on Apple’s App Store that were deemed suitable for children as young as four.
The groups worked with a researcher to review as many apps as possible in a 24-hour span, and say they ultimately identified more than 200 apps containing “concerning content or features” inappropriate for their assigned age ratings – including stranger chat and AI girlfriend apps, gaming apps with sexual or violent innuendo and imagery, and AI-powered appearance rating apps.
The research focused on apps with age ratings of 4+, 9+ and 12+ in categories considered “risky”: chat (including AI and stranger chat apps), beauty, diet and weight loss, unfettered internet access (apps that let users reach sites restricted by schools) and gaming.
Among the findings were at least 24 sexual games and nine stranger chat apps rated as appropriate for children in these age groups, the report said. The research also identified 40 apps offering unfettered internet access and 75 apps related to beauty, body image and weight loss, as well as 28 shooter and crime games.
Collectively, the more than 200 objectionable apps identified during the 24-hour investigation have been downloaded more than 550 million times, according to the Heat Initiative. Nearly 800 apps were reviewed in total, and the research found that some categories were more likely than others to contain apps with inappropriately low age ratings. For stranger chat apps and games, “lower ratings were given for children,” the report said, though in most cases those apps were rated 17+. In the weight loss and unfettered internet access categories, by contrast, “almost all of the apps reviewed were approved for children 4+.”

The report calls on Apple to strengthen child safety measures on its App Store, urging the company to use third-party reviewers to verify apps’ age ratings before they become available for download and to make its age rating process transparent to consumers.
In a statement to Engadget, an Apple spokesperson said, “At Apple, we work hard to protect user privacy and security and provide a safe experience for children.
We do this by giving parents a variety of capabilities they can enable to restrict purchases, web searches, and app access on their children’s devices; block explicit content; flag problematic content through problem reports; and more.
Developers are required to provide clear age ratings in accordance with App Store policies, and apps designed for children are designated in a unique category and undergo a strict app review process. In cases where an app’s age rating does not match its content, we take immediate action to fix the issue.”