
Student data privacy law was written before generative AI existed. This guide explains how FERPA, COPPA, and related regulations apply when your district uses AI tools — and what you need to do to stay compliant.
The Family Educational Rights and Privacy Act (FERPA) protects the privacy of student education records. It applies to all educational agencies and institutions that receive federal funding — which is virtually every public school district in the United States.
When any technology, including AI tools, accesses, processes, stores, or generates content based on student education records, FERPA applies. There is no AI exception.
An education record is any record that is directly related to a student and maintained by the educational agency. In the context of AI tools, this includes:

- Prompts, questions, and work a student submits to the tool (such as essays or drafts)
- Conversation histories the tool retains
- AI-generated outputs about a student, such as feedback, scores, or flags
- Usage logs and generated profiles tied to a student's identity
If an AI tool can identify a student — directly or indirectly — and the tool is operating in an educational context maintained by the school, the data it handles is likely an education record under FERPA.
FERPA allows districts to share education records with third parties (including AI vendors) without parental consent under the "school official" exception. For this exception to apply, the vendor must:

- Perform an institutional service or function for which the district would otherwise use its own employees
- Be under the district's direct control with respect to the use and maintenance of education records
- Use the records only for the authorized purpose and not redisclose them without permission
This exception is the legal basis for most AI tool deployments in schools. But it requires a proper agreement — and it has limits that matter specifically for AI.
Generative AI introduces FERPA risks that did not exist with traditional edtech. Districts need to understand these risks to govern AI effectively.
When students interact with AI tools, their inputs (prompts, essays, questions) may be used by the AI vendor to improve or train its models. Under FERPA, education records disclosed under the school official exception can only be used for the authorized educational purpose. Using student data to train a commercial AI model goes beyond that purpose.
What to do: Your data processing agreement must explicitly prohibit the vendor from using student data for model training, product improvement, or any purpose other than providing the contracted educational service.
When a teacher signs up for a consumer AI tool (like a free ChatGPT account) and has students use it, the district has likely created an unauthorized disclosure of education records. The teacher did not have authority to enter into a data agreement on the district's behalf, and the AI vendor has no obligation to protect student data under FERPA.
What to do: Establish a clear AI tool approval process. Only AI tools that have been vetted and that operate under a district-level data processing agreement should be used with students.
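As one illustration of what enforcing an approval process can look like, the sketch below checks tools against a district allowlist before students may use them. The tool names, record fields, and the `is_approved_for_students` helper are all hypothetical; a real process would also cover the human vetting steps that populate the list.

```python
from dataclasses import dataclass

@dataclass
class ApprovedTool:
    name: str
    dpa_signed: bool           # district-level data processing agreement on file
    training_prohibited: bool  # DPA bars use of student data for model training
    approved_grades: range     # grade levels the tool has been vetted for

# Hypothetical district allowlist; real entries come from your vetting process.
ALLOWLIST = {
    "example-writing-assistant": ApprovedTool(
        name="example-writing-assistant",
        dpa_signed=True,
        training_prohibited=True,
        approved_grades=range(6, 13),  # grades 6 through 12
    ),
}

def is_approved_for_students(tool_name: str, grade: int) -> bool:
    """Return True only if the tool was vetted, has a signed DPA that
    prohibits model training, and is approved for this grade level."""
    tool = ALLOWLIST.get(tool_name)
    return (
        tool is not None
        and tool.dpa_signed
        and tool.training_prohibited
        and grade in tool.approved_grades
    )

# A consumer tool a teacher signed up for individually is simply absent
# from the allowlist, so the check fails closed:
assert not is_approved_for_students("free-consumer-chatbot", 8)
assert is_approved_for_students("example-writing-assistant", 8)
```

The useful property is that the check fails closed: any tool not affirmatively vetted, including a teacher's personal consumer account, is denied by default.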
AI tools may retain conversation histories, student inputs, and generated outputs. Unlike traditional software where you can delete a database record, AI systems may embed student data in model weights, vector stores, or retrieval systems in ways that are difficult to isolate and delete.
What to do: Understand exactly how the vendor stores student data, whether data persists in AI memory or context systems, and what the vendor's data deletion process actually does at the technical level.
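The sketch below (with illustrative store names and a hypothetical `delete_student_data` helper, not any vendor's real API) shows why this matters: application code can delete conversation logs and metadata-tagged embeddings, but nothing can selectively remove data that was already absorbed into model weights.

```python
# Illustrative stores where one student's data might live after an AI session.
conversation_logs = {"student-123": ["essay draft...", "follow-up question..."]}
vector_store = [
    {"id": "v1", "student_id": "student-123", "embedding": [0.12, -0.08, 0.31]},
    {"id": "v2", "student_id": "student-456", "embedding": [0.02, 0.44, -0.19]},
]

def delete_student_data(student_id: str) -> None:
    """Delete a student's data everywhere the *application* can reach.
    Note what this cannot touch: if the vendor trained or fine-tuned a
    model on this data, it persists in model weights regardless."""
    # 1. Conversation history: straightforward, like deleting a database row.
    conversation_logs.pop(student_id, None)

    # 2. Vector store: embeddings must be findable by student metadata.
    #    If the vendor never tagged vectors per student, this step is impossible.
    vector_store[:] = [v for v in vector_store if v["student_id"] != student_id]

    # 3. Model weights: no code path exists to "un-train" one student's data,
    #    which is why the DPA must prohibit training on student data up front.

delete_student_data("student-123")
assert "student-123" not in conversation_logs
assert all(v["student_id"] != "student-123" for v in vector_store)
```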
AI tools can infer sensitive information about students that was never directly provided — learning disabilities, emotional states, family circumstances, political or religious views. These inferences, if stored, may themselves become education records.
What to do: Define in your governance framework what types of AI-generated inferences are permissible and ensure that sensitive inferences are not stored, disclosed, or used for automated decision-making about students.
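As a rough sketch of what such a control could look like in code, the example below applies a default-deny filter to AI-generated inferences before anything is persisted. The category names and the `filter_inferences` helper are assumptions for illustration, not a description of any specific product.

```python
# Inference categories the district's governance framework permits to persist.
PERMITTED_INFERENCE_TYPES = {"reading_level", "writing_level"}

# Sensitive categories that must never be stored or used for automated decisions.
BLOCKED_INFERENCE_TYPES = {
    "learning_disability",
    "emotional_state",
    "family_circumstances",
    "political_views",
    "religious_views",
}

def filter_inferences(inferences: list[dict]) -> list[dict]:
    """Keep only explicitly permitted inference types; fail closed on
    anything unknown rather than storing it by default."""
    kept = []
    for inference in inferences:
        category = inference.get("type")
        if category in BLOCKED_INFERENCE_TYPES:
            continue  # drop; in practice, log the event to an audit trail
        if category in PERMITTED_INFERENCE_TYPES:
            kept.append(inference)
        # Unknown categories are also dropped: default-deny.
    return kept

raw = [
    {"type": "writing_level", "value": "grade 7"},
    {"type": "emotional_state", "value": "flagged"},
]
assert filter_inferences(raw) == [{"type": "writing_level", "value": "grade 7"}]
```

Default-deny is the design choice that matters here: if a vendor later adds a new inference category, it stays excluded until the district reviews and explicitly permits it.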
Consider a concrete example: a student writes a personal essay using an AI writing assistant. The AI tool stores the essay, generates a profile of the student's writing level, and flags potential emotional concerns based on the content. All of this data is now likely an education record under FERPA, and the district is responsible for how it is handled.
A data processing agreement (DPA) is the legal document that governs how a vendor handles student data. For AI tools, standard DPA templates may not cover the unique risks. Here is what to include:

- An explicit prohibition on using student data for model training, product improvement, or any purpose beyond the contracted educational service
- Data retention limits and a deletion process that accounts for AI-specific storage, including conversation histories, vector stores, and memory or context systems
- Restrictions on storing or disclosing sensitive AI-generated inferences about students
- A bar on redisclosing student data to other parties without district authorization
The Children's Online Privacy Protection Act (COPPA) adds another layer of requirements when AI tools are used with students under 13.
The FTC allows schools to consent on behalf of parents for the collection of children's personal information, but only when:

- The information is collected solely for the use and benefit of the school, for an educational purpose
- The information is used for no other commercial purpose, such as advertising or building marketing profiles
If an AI tool collects personal information from students under 13 and uses it for advertising, profiling, or any non-educational purpose, separate verifiable parental consent is required — the school cannot consent to commercial use on parents' behalf.
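Expressed as a decision rule, the consent logic reduces to a few lines. This is a simplification for illustration; the `coppa_parental_consent_required` helper and its inputs are hypothetical, and real age and purpose determinations involve more nuance.

```python
def coppa_parental_consent_required(student_age: int,
                                    educational_use_only: bool) -> bool:
    """Under COPPA, the school can consent on parents' behalf only for
    under-13 data collection limited to educational use; any commercial
    use (advertising, profiling) needs verifiable parental consent."""
    if student_age >= 13:
        return False  # COPPA's under-13 rules do not apply
    return not educational_use_only

# School consent suffices for purely educational use with an 11-year-old...
assert coppa_parental_consent_required(11, educational_use_only=True) is False
# ...but any commercial use requires the parents themselves to consent.
assert coppa_parental_consent_required(11, educational_use_only=False) is True
```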
When your district evaluates an AI tool, these are the questions to ask, and the answers to look for:

- Is student data used to train or improve your models? The only acceptable answer is no, stated in the contract.
- Will you sign a district-level data processing agreement? Consumer terms of service are not a substitute.
- Where does student data persist: conversation histories, vector stores, memory or context systems? The vendor should be able to enumerate every location.
- What does your deletion process remove at the technical level? "We delete the account" says nothing about embeddings or retained logs.
- What inferences does the tool generate about students, and are any of them stored or used for automated decisions?
Beni is built for FERPA compliance from the ground up. Student data is never used for model training. All data processing happens under district-level DPAs. Policy Cards give districts granular control over what data AI can access in each context. Read our full Trust Center documentation or apply to become a Founding Partner.
Use this checklist when evaluating or deploying any AI tool in your district:

- The tool has passed the district's AI approval process; no unvetted consumer accounts are used with students
- A district-level DPA is in place that explicitly prohibits model training on student data
- You know where student data is stored and what the vendor's deletion process actually removes
- Sensitive AI-generated inferences are not stored, disclosed, or used for automated decision-making
- For students under 13, the COPPA conditions for school consent are met: educational use only, no commercial use
Beni's platform gives districts the technical controls to enforce data privacy policies at the classroom level. No student data is used for model training. Ever.
Apply to Learn More