After Google’s Landmark Settlement, How Ad Networks Should Tackle Child Privacy
Congress enacted the Children’s Online Privacy Protection Act (COPPA) in 1998 to address the rapid growth of online marketing techniques targeting children. At the time, online advertising was just taking off with the introduction of banner ads and targeted ad placement.
Websites were collecting personal data from children without parental knowledge or consent, and research showed that children did not understand the risks of revealing personal information online.
Congress charged the Federal Trade Commission (FTC) with enforcing COPPA and issuing regulations implementing it. The FTC’s original COPPA Rule came into effect in 2000, and the FTC began enforcing it against operators of websites and video games.
Fast Forward to the 2010s
As technology evolved, the FTC brought cases against makers of mobile apps, connected toys, and IoT devices. But it wasn’t until 2013, when the COPPA Rule was amended, that ad tech companies started to pay attention to it. The amendment expanded the definition of personal information to include “[a] persistent identifier that can be used to recognize a user over time and across different websites or online services.”
These identifiers (including a cookie, an IP address, a processor or device serial number, or a unique device identifier) are what online advertisers use to target consumers with relevant ads. This expansion of the COPPA Rule dramatically increased ad tech companies’ potential liability, and such companies should be aware of the pitfalls inherent in collecting information from children via other websites and apps.
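To make concrete why this expansion reached ad tech, here is a minimal sketch of the kind of ad-request payload at issue. The request shape and the "uid" cookie name are hypothetical, but each optional field corresponds to an example of a persistent identifier named in the amended Rule.

```typescript
// Hypothetical illustration: the request shape and the "uid" cookie
// name are invented, but each identifier mirrors an example in the Rule.
interface AdRequest {
  placementId: string; // where the ad will render
  cookieId?: string;   // persistent browser cookie
  deviceId?: string;   // unique device identifier (set by a native SDK on mobile)
  ip?: string;         // IP address, typically captured server-side
}

// Any one of these optional fields can recognize a user over time and
// across services, so collecting it through a child-directed site or app
// triggers the Rule's notice-and-consent requirements.
function buildAdRequest(placementId: string): AdRequest {
  return {
    placementId,
    cookieId: document.cookie
      .split("; ")
      .find((c) => c.startsWith("uid="))
      ?.split("=")[1],
  };
}
```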
The COPPA Rule applies to operators of commercial websites and online services directed to children under 13 that collect, use, or disclose personal information from children. It also applies to operators of general-audience websites or online services with actual knowledge that they are collecting, using, or disclosing personal information from children under 13.
Here’s the critical part for ad networks, platforms, and other third parties: The Rule also applies to operators of websites or online services that have actual knowledge that they are collecting personal information directly from users of another website or online service directed to children.
InMobi and Google’s FTC Settlements
The FTC has since brought more than thirty enforcement actions alleging violations of the COPPA Rule. Two of those actions involved ad networks, the first being a 2016 enforcement action against InMobi, an advertising platform for app developers and advertisers. The FTC alleged that InMobi tracked users’ locations, including through apps directed to children, by collecting wireless network identifiers to infer location even when users had denied location permissions.
The second — and arguably higher-profile — COPPA case that the FTC brought against an ad network/platform was the 2019 enforcement action against Google and YouTube. In that case, the FTC alleged that YouTube violated the Rule by collecting persistent identifiers from viewers of child-directed channels to deliver targeted ads without first notifying parents and getting their consent.
During the investigation, the FTC uncovered evidence that YouTube had actual knowledge that children under 13 were viewing certain channels on its platform. For example, YouTube told toy-making companies that “YouTube is today’s leader in reaching children age 6-11 against top TV channels” and that YouTube is the “No. 1 website regularly visited by kids.” Even while making these representations, YouTube claimed to be a general-audience platform with no content directed to children under 13.
By default, YouTube enabled targeted advertising on its monetized channels. This meant that YouTube was collecting cookies from users of child-directed channels without giving parents notice or getting their consent to collect that data. The settlement required YouTube to pay a $170 million penalty. YouTube was also required to implement a system that lets channel owners identify their content as child-directed so that YouTube can comply with the Rule going forward.
Five Things Today’s Ad Networks Should Be Doing
With that backdrop, here are five important takeaways from the InMobi and Google/YouTube settlements:
- Improve transparency. Ad networks and platforms should consider implementing a system that lets online services (like websites, apps, or channels) identify to the ad network/platform that their content is child-directed (see the sketch after this list).
- Stop collecting children’s data. Once an ad network or platform sets up a system where developers can signal that their app is child-directed, that ad network needs to take steps not to collect personal information through those websites, apps, or channels.
- Involve parents when required. Even if an ad network does not collect precise geolocation information from children directly, collecting wireless network identifiers to infer precise location — as InMobi did — still requires it to provide notice to and obtain consent from parents.
- Protect sensitive data. If an ad network decides to collect children’s data, it must maintain the confidentiality, security, and integrity of the information. It should only retain the data as long as necessary to fulfill the purpose for which it was collected. The ad network should delete the data in a way that protects against its unauthorized access or use.
- Remain vigilant about protecting children. If a platform or ad network has actual knowledge that a channel or app is child-directed, it cannot collect personal information like persistent identifiers to serve targeted advertising without giving notice and getting parental consent.
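Together, the first, second, and fifth takeaways describe a signal-and-gate flow: the developer flags the property as child-directed, and the ad network honors that flag by dropping persistent identifiers. Below is a minimal sketch of what that might look like on the SDK side; the AdNetworkSDK class, the childDirected flag, and the contextual-only fallback are assumptions for illustration, not any actual network’s API.

```typescript
// Hypothetical SDK surface -- every name here is invented for illustration.
interface SdkConfig {
  appId: string;
  childDirected: boolean; // developer's signal that the property is child-directed
}

interface AdRequest {
  placementId: string;
  cookieId?: string; // persistent identifiers: "personal information"
  deviceId?: string; // under the Rule since the 2013 amendment
  contextOnly: boolean;
}

// Stub helpers so the sketch is self-contained; a real SDK would read the
// browser's cookie jar or the platform's advertising identifier.
const readCookie = (name: string): string | undefined =>
  document.cookie
    .split("; ")
    .find((c) => c.startsWith(name + "="))
    ?.split("=")[1];
const getDeviceId = (): string | undefined => undefined;

class AdNetworkSDK {
  constructor(private config: SdkConfig) {}

  buildRequest(placementId: string): AdRequest {
    if (this.config.childDirected) {
      // Honor the signal: no persistent identifiers, contextual ads only
      // (targeting based on the page or app content, not user history).
      return { placementId, contextOnly: true };
    }
    return {
      placementId,
      cookieId: readCookie("uid"),
      deviceId: getDeviceId(),
      contextOnly: false,
    };
  }
}

// A developer flags a child-directed app once, at initialization:
const sdk = new AdNetworkSDK({ appId: "kids-app-123", childDirected: true });
sdk.buildRequest("banner-top"); // => { placementId: "banner-top", contextOnly: true }
```

One design point in this sketch: the gate sits inside the SDK’s request builder, so a mislabeled placement or downstream integration cannot quietly reintroduce identifiers once the property has been flagged.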
The bottom line: An ad network should avoid collecting information via apps it knows are directed to children, and the safest practice for an ad network or platform is not to serve targeted ads on child-directed websites, apps, or channels in the first place.