
Microsoft to Pay $20M to Settle Kids' Privacy Charges with FTC

Microsoft will pay the FTC $20M to settle charges that it violated privacy laws governing children's information. Microsoft must also institute new procedures, such as deleting a child's information within two weeks if the parents do not consent to account creation. This is the third FTC action involving children's privacy within the last few weeks.

State and federal regulators have exhibited a strong interest in protecting children online in 2022 and 2023. States are passing laws that purport to limit children's access to certain online services, and the FTC is aggressively enforcing the federal children's privacy law, COPPA. In this case, the problematic data fell into two buckets. First, the company required kids to provide personal information and consent to certain privacy practices even after learning that they were under 13. Second, the company held onto data from kids who started to create an account but whose parents never validated the account under the children's privacy rules; the company should have deleted that data when the parents failed to give consent.

Why It Matters

If your service has users under 13, it is a good time to check your practices to be sure they are in line with the new crop of rules and the old rules that are now being enforced vigorously. Once you learn a potential customer is under 13, you must get verifiable parental permission for them to create an account; you cannot require them to create the account and then seek the parents' permission. If the account creation process is abandoned because the parents never provide consent, you must delete the children's information. Furthermore, you should not collect more information than you reasonably need to provide your service and should not keep it longer than reasonably required. Finally, be aware that certain new laws may create or extend privacy and content protections for children as old as 17, and that you must keep up with those if minors are using your services.  

As part of a proposed order filed by the Department of Justice on behalf of the FTC, Microsoft will be required to take several steps to bolster privacy protections for child users of its Xbox system. For example, the order will extend COPPA protections to third-party gaming publishers with whom Microsoft shares children’s data. In addition, the order makes clear that avatars generated from a child’s image, and biometric and health information, are covered by the COPPA Rule when collected with other personal data. The order must be approved by a federal court before it can go into effect.

