Frequently Asked Questions

We identified reviews using a basic machine learning technique to analyze review text, combined with manual human review; a rough sketch of this kind of pipeline appears after the citations below. More details can be found in our paper:
Indications of Child Sexual Abuse Revealed in App-Store Reviews. Brian Levine, Jagath Jai Kumar, Hany Farid, Eloghosa Ikponmwoba, and Ed Dixon. USENIX SOUPS 2022: Workshop on Kids’ Online Privacy and Safety, 4 pages. May 2022.
See also:
A Quantitative Analysis of Inappropriate Content, Age Rating Compliance, and Risks to Youth on the Whisper Platform. Jeng-Yu Chou and Brian Neil Levine. International Workshop on Child Online Safety and Harms (COSH), July 2024.
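As a rough, purely illustrative sketch of how reviews might be screened and prioritized for human reading (this is not the specific method, features, or data described in the paper), one could pair a simple supervised text classifier with manual review. The keywords, example reviews, and threshold below are hypothetical placeholders.

```python
# Illustrative sketch only: a generic review-flagging approach that pairs a
# simple supervised text classifier with manual human review. The labeled
# examples and threshold are hypothetical, not data from the paper.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled reviews: 1 = flag for human review, 0 = do not flag.
train_reviews = [
    "an adult stranger kept messaging my 12 year old on this app",
    "great app for editing photos, works as advertised",
]
train_labels = [1, 0]

# TF-IDF features (unigrams and bigrams) feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_reviews, train_labels)

def flag_for_human_review(review_text, threshold=0.5):
    """Return True if the classifier suggests a human should read this review."""
    probability = model.predict_proba([review_text])[0][1]
    return probability >= threshold

print(flag_for_human_review("someone asked my child for photos in the chat"))
```

A sketch like this would only prioritize which reviews a person should read; as noted above, the final determination relies on manual human review.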
We believe these reports of abuse should be displayed prominently to parents and users on Apple’s and Google’s app stores. In addition to options such as “Sort by Most Critical,” Apple and Google should offer an option to “Sort by Reports of Abuse.” Any report of child sexual abuse to Apple or Google should prompt an investigation into the app developer’s moderation practices, as well as into the possibility that the developer has allowed a population of dangerous users to thrive on the app. The results of such investigations should be posted publicly in the app stores. If you read some of these reviews, you might agree with us that some apps should be removed from the stores immediately.

Cars undergo safety recalls; planes are grounded for safety problems; bridges and roads are closed for repair when they are unsafe. Likewise, Google and Apple should suspend downloads of apps with reports of child sexual abuse until the developers can demonstrate that action has been taken.

The age ratings that Apple and Google advertise to consumers should reflect these reports of child sexual abuse, as well as accounts in courts and in the news, rather than the intentions of the developer. Age ratings should reflect the content that actually passes the developer’s moderation process (if any). Currently, Apple’s and Google’s age ratings appear to reflect a developer’s intentions and not what users actually experience.
If you spot a review that you believe is inaccurate, please report the review to us with this error form.
To suggest an app for us to analyze, please use this suggestion form.
If you have encountered or observed abuse on an app, please report it as a CyberTip to the National Center for Missing and Exploited Children. Consider involving law enforcement if you need help.
For more information about this project, please read our paper, cited above.
We know only that these are reviews that Apple and Google have deemed sufficiently real to appear on their own websites where users can download apps. Some may be fake; some may be accurate reflections of users’ experiences on these apps.
You can use the information in the reviews presented on this site alongside other resources. The following websites can help you and your children navigate these challenges:
https://www.icactaskforce.org/internetsafety
https://www.justice.gov/coronavirus/keeping-children-safe-online
https://www.protectchildren.ca/en/resources-research/trending-resources-timely-support
If you would like to donate to this project and other efforts by the UMass Rescue Lab to thwart internet-based child sexual abuse, please visit our donation page. We are a non-profit organization, and your help allows us to expand this effort and our other projects that help rescue children from internet-based sexual abuse.