- A recently-published study found that thousands of Android apps on the Play Store potentially violate COPPA.
- The study also found a number of Safe Harbor-approved apps that featured “potential violations.”
- The study raises the question of whether industry self-regulation is enough.
In a study published in the Proceedings on Privacy Enhancing Technologies journal, researchers at the International Computer Science Institute at the University of California, Berkeley found that over half of child-oriented Android apps might be in violation of the U.S. Children’s Online Privacy Protection Act (COPPA).
Titled “‘Won’t Somebody Think of the Children?’ Examining COPPA Compliance at Scale,” the study examined 5,855 child-directed apps from the Play Store. According to the researchers, the analysis revealed “several concerning violations and trends.”
The researchers found that 4.8 percent of these apps had “clear violations when apps share location or contact information without consent,” 40 percent shared personal data without proper security measures, 39 percent disregarded “contractual obligations aimed at protecting children’s privacy,” and 18 percent shared identifiers for ad targeting and other prohibited purposes.
Things don’t get better from here: 28 percent of the 5,855 examined apps accessed sensitive data protected by Android permissions, and 73 percent of those apps sent sensitive data over the internet. Apps flagged in the study include KidzInMind, TabTale’s Pop Girls-High School Band, and Fun Kid Racing.
This is where Google’s Designed for Families program is supposed to step in and provide developers with information on COPPA. Even though Google requires developers to certify compliance, the study found that enforcement was either lacking or non-existent.
Researchers believe that many of these privacy violations are “unintentional and caused by misunderstandings of third-party SDKs.” Even so, they stressed that Google should be more active in the vetting process for COPPA compliance.
Enacted in 1998, COPPA aims to protect children’s privacy online. The act compels companies that design apps for children under 13 years old to obtain consent from parents before they can collect personal information. The U.S. Federal Trade Commission (FTC) revised COPPA in 2012 to cover geolocation data and IP addresses and to require that third-party advertisers comply with the rules.
However, COPPA only applies to online services that are either “directly targeted” at children under 13 or that have “actual knowledge” that their users are under 13. That is why an app like Duolingo, which sends information to third parties and supposedly does not fall under COPPA, was nonetheless listed as a potential violator in the study.
A Duolingo spokesperson said the information the app shares with third parties is only used to fix bugs and provide data on crashes.
Interestingly, the study also examined whether apps with potential COPPA violations were part of the FTC’s Safe Harbor program, which allows developers to submit their apps for certification that they are COPPA-compliant.
The study found that a small number of the 5,855 examined apps were certified under Safe Harbor, and even these exhibited prevalent “potential violations.”
Overall, the study cites several significant issues with where things stand today. As it relates to COPPA, the aforementioned gray area and the daunting task of enforcement appear to be strong enough deterrents that potential violators go unprosecuted.
Arguably more significant, the study concluded that industry self-regulation is “ineffective,” noting that it is unclear whether “industry self-regulation has resulted in higher privacy standards; some of our data suggest the opposite.”
Where do we go from here?
There is hope, however. Keep in mind that the study’s researchers customized Android with their own automated observation tools. Anyone, including the FTC, can use these techniques to identify potential COPPA violators.
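To illustrate the general idea behind such automated observation (not the researchers’ actual tooling), here is a minimal Python sketch that scans captured network-request records for personal identifiers leaving a device. All field names, app package names, and sample data below are hypothetical assumptions for illustration.

```python
# Hypothetical sketch of traffic-based violation flagging.
# Each record represents one decoded outbound request captured
# from an instrumented device; field names are illustrative.

PII_FIELDS = {"aaid", "imei", "email", "latitude", "longitude", "phone"}

def flag_potential_violations(traffic_log):
    """Return {app_package: set of PII field names seen leaving the device}.

    traffic_log: list of dicts like
      {"app": "com.example.kids", "dest": "tracker.example.com",
       "payload": {"aaid": "...", "level": 3}}
    """
    findings = {}
    for request in traffic_log:
        leaked = PII_FIELDS & set(request["payload"])
        if leaked:
            findings.setdefault(request["app"], set()).update(leaked)
    return findings

# Illustrative captured traffic (entirely made up):
sample = [
    {"app": "com.example.kidsgame", "dest": "ads.example.net",
     "payload": {"aaid": "38400000-8cf0", "score": 12}},
    {"app": "com.example.kidsgame", "dest": "geo.example.net",
     "payload": {"latitude": 37.87, "longitude": -122.27}},
    {"app": "com.example.quiz", "dest": "crash.example.net",
     "payload": {"stacktrace": "..."}},
]

print(flag_potential_violations(sample))
```

In this toy run, only `com.example.kidsgame` would be flagged (for `aaid`, `latitude`, and `longitude`); the crash-report-only app would not. The real study additionally decrypted TLS traffic and checked destinations against known ad-network domains, which this sketch omits.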
Also, just because an app was identified as a potential violator does not mean its developer has nefarious intentions. As with Duolingo, which opted in to the Designed for Families section of the Play Store, the data sharing may simply reflect a developer’s effort to make sure the app runs well.
At the end of the day, mobile app developers have a responsibility to make sure that third-party services are protecting children’s information. Third-party services also have a responsibility to make sure they are not receiving children’s information from integrated apps.
There also remains the issue of industry self-regulation. Calls for government regulation have only grown louder following the Cambridge Analytica fiasco, while a bipartisan bill aiming to protect online privacy could be introduced any day now.
The FTC seems intent on investigating Facebook, but what happens after could be the turning point in how online privacy is maintained.