
The FTC has finalized the biggest COPPA overhaul in over a decade. The shift from opt-out to opt-in consent changes how every AI tool in your elementary and middle school must operate. Full compliance is required by April 22, 2026.
The Children's Online Privacy Protection Act (COPPA) regulates how online services collect and use personal information from children under 13. In school settings, COPPA intersects with FERPA to create a dual compliance requirement for any technology — including AI tools — used with elementary and middle school students.
Until 2025, COPPA had not been substantially updated since 2013. The digital landscape has changed dramatically since then. Generative AI, persistent behavioral tracking, voice assistants, and data-driven personalization were not contemplated by the original rules. The FTC's 2025 revision addresses these gaps directly.
Under the old rules, operators could collect data and allow parents to opt out of certain uses. The new rules flip this: operators must now obtain separate, verifiable parental consent before using children's personal information for targeted advertising or any purpose beyond what is strictly necessary to provide the service.
What this means for schools: If an AI tool collects student data for an educational purpose (permitted under school consent) but also uses that data for product improvement, analytics beyond the service, or model training, the tool now needs separate parental consent for those additional uses. School consent only covers the educational purpose.
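The dual-consent logic above can be sketched in a few lines. This is a hypothetical illustration, not legal advice or any vendor's actual implementation; the purpose names and function are invented for clarity.

```python
# Hypothetical sketch of the dual-consent rule: school consent covers
# only the educational purpose, while any other use of a child's data
# requires separate verifiable parental consent. Purpose names are
# invented examples, not categories defined by the COPPA rule itself.

EDUCATIONAL_PURPOSES = {"instruction", "assessment", "tutoring"}

def consent_required(purpose: str, has_school_consent: bool,
                     has_parental_consent: bool) -> bool:
    """Return True if the proposed use still lacks the consent it needs."""
    if purpose in EDUCATIONAL_PURPOSES:
        # School consent is sufficient for the educational service itself.
        return not has_school_consent
    # Product improvement, analytics, model training, advertising, etc.
    # fall outside school consent and need parental consent.
    return not has_parental_consent
```

Under this sketch, a tool with school consent can deliver tutoring, but using the same data for model training without parental consent would be flagged.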
Operators must limit data collection to what is reasonably necessary to provide the service the child is using. They cannot condition a child's participation on providing more personal information than is needed.
What this means for schools: AI tools that collect extensive behavioral data, usage patterns, or metadata beyond what is needed for the educational function may be in violation. Districts should review what data each AI tool actually collects and whether it exceeds what is necessary.
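A district review of this kind amounts to comparing what each tool collects against what its educational function needs. The snippet below is a rough, hypothetical sketch of that comparison; the field names and tool names are invented examples.

```python
# Hypothetical data-minimization review: compare the fields a vendor
# reports collecting against the fields the educational function needs.
# All tool names and field names below are invented for illustration.

NECESSARY_FIELDS = {"student_id", "grade_level", "assignment_responses"}

tools = {
    "ReadingTutorAI": {"student_id", "grade_level", "assignment_responses"},
    "MathChatBot": {"student_id", "grade_level", "assignment_responses",
                    "device_fingerprint", "browsing_history"},
}

def excess_collection(collected: set) -> set:
    """Return fields collected beyond what the service reasonably needs."""
    return collected - NECESSARY_FIELDS

for name, fields in sorted(tools.items()):
    extra = excess_collection(fields)
    if extra:
        print(f"{name}: review needed, collects {sorted(extra)}")
    else:
        print(f"{name}: within necessary scope")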
Operators must retain children's personal information only for as long as reasonably necessary to fulfill the purpose for which it was collected. They must establish and follow written data retention policies.
What this means for schools: AI tools that retain student conversation histories, interaction logs, or generated content indefinitely are likely non-compliant. Districts should verify that vendor retention policies include automatic deletion after a defined period.
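The retention requirement implies a concrete mechanism: records older than a defined window get deleted automatically. The sketch below illustrates one way a vendor might express that check; the 180-day window and record structure are invented for illustration and are not a standard set by the rule.

```python
# Hypothetical retention check: flag records older than a defined
# retention window for deletion. The 180-day window is an invented
# example, not a period specified by COPPA.
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 180

def records_to_delete(records, now=None):
    """Return records whose age exceeds the retention window."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in records if r["created_at"] < cutoff]
```

A district reviewing a vendor would look for the written policy behind a check like this: a defined period, automatic deletion, and documentation that the schedule is actually followed.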
The revised rules clarify that biometric identifiers (including voiceprints and faceprints) are personal information under COPPA. This is directly relevant to AI tools that use voice recognition, facial recognition, or other biometric processing.
What this means for schools: AI tools with voice input, speech-to-text, or image analysis features that process student biometrics need COPPA-compliant consent. Districts using AI tools with voice interfaces for students under 13 should verify the tool's COPPA compliance status under the new rules.
Operators must implement and maintain a written information security program that is reasonably designed to protect the confidentiality, security, and integrity of children's personal information.
What this means for schools: Districts should request documentation of each AI vendor's security program and verify it meets the new standard. This goes beyond encryption — it requires a comprehensive, documented security program.
The FTC has consistently recognized that schools can consent on behalf of parents for the collection of personal information from students under 13. This authority remains under the revised rules, but with clearer boundaries. School consent covers data collection for educational purposes only; it does not, and never has, covered commercial uses. The 2025 revision makes this boundary sharper and the consequences of crossing it more severe. If your AI vendor uses student data for anything beyond the educational service you contracted for, that use requires separate verifiable parental consent.
Beni's platform is designed for COPPA compliance. Student data is never used for model training, advertising, or any non-educational purpose. Data minimization is built into the architecture — we collect only what is necessary to provide the educational service. Read our full privacy documentation or apply to become a Founding Partner.
Beni collects only what is necessary for the educational service. No model training on student data. No commercial use. No exceptions.
Apply to Learn More