Compliance Guide

FERPA & AI in K-12: What Districts Need to Know

Student data privacy law was written before generative AI existed. This guide explains how FERPA, COPPA, and related regulations apply when your district uses AI tools — and what you need to do to stay compliant.

Published April 2026 · Updated April 11, 2026 · Reading time: 10 min

FERPA Fundamentals for AI

The Family Educational Rights and Privacy Act (FERPA) protects the privacy of student education records. It applies to all educational agencies and institutions that receive federal funding — which is virtually every public school district in the United States.

When any technology, including AI tools, accesses, processes, stores, or generates content based on student education records, FERPA applies. There is no AI exception.

What counts as an education record?

An education record is any record that is directly related to a student and maintained by the educational agency or a party acting on its behalf. In the context of AI tools, this includes student prompts and submitted work, conversation histories tied to an identifiable student, and AI-generated outputs about a student, such as writing profiles, feedback, and flags.

Key principle

If an AI tool can identify a student — directly or indirectly — and the tool is operating in an educational context maintained by the school, the data it handles is likely an education record under FERPA.

The school official exception

FERPA allows districts to share education records with third parties (including AI vendors) without parental consent under the "school official" exception. For this exception to apply, the vendor must:

  1. Perform an institutional service or function for which the district would otherwise use employees
  2. Be under the direct control of the district with respect to the use and maintenance of education records
  3. Use education records only for the purposes for which the disclosure was made
  4. Meet the district's criteria for who constitutes a "school official" with a "legitimate educational interest"

This exception is the legal basis for most AI tool deployments in schools. But it requires a proper agreement — and it has limits that matter specifically for AI.

AI-Specific FERPA Risks

Generative AI introduces FERPA risks that did not exist with traditional edtech. Districts need to understand these risks to govern AI effectively.

1. Model training on student data

When students interact with AI tools, their inputs (prompts, essays, questions) may be used by the AI vendor to improve or train its models. Under FERPA, education records disclosed under the school official exception can be used only for the authorized educational purpose. Using student data to train a commercial AI model goes beyond that purpose.

What to do: Your data processing agreement must explicitly prohibit the vendor from using student data for model training, product improvement, or any purpose other than providing the contracted educational service.

2. Unauthorized teacher adoption

When a teacher signs up for a consumer AI tool (like a free ChatGPT account) and has students use it, the likely result is an unauthorized disclosure of education records. The teacher had no authority to enter into a data agreement on the district's behalf, and the consumer AI vendor has no FERPA obligation to protect student data.

What to do: Establish a clear AI tool approval process. Only AI tools that have been vetted and that operate under a district-level data processing agreement should be used with students.

3. Data persistence and retrieval

AI tools may retain conversation histories, student inputs, and generated outputs. Unlike traditional software where you can delete a database record, AI systems may embed student data in model weights, vector stores, or retrieval systems in ways that are difficult to isolate and delete.

What to do: Understand exactly how the vendor stores student data, whether data persists in AI memory or context systems, and what the vendor's data deletion process actually does at the technical level.
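What "deletable" means at the technical level can be made concrete. Below is a minimal sketch of the tagging approach implied above, using a toy in-memory store; the class and method names are illustrative and do not correspond to any specific vendor's API:

```python
from collections import defaultdict

class TaggedVectorStore:
    """Toy vector store that indexes every embedding by student ID,
    so a deletion request can actually isolate a student's data."""

    def __init__(self):
        self._vectors = {}                    # vector_id -> (embedding, metadata)
        self._by_student = defaultdict(set)   # student_id -> {vector_id, ...}

    def add(self, vector_id, embedding, student_id, text):
        # Tagging at write time is what makes later isolation possible.
        self._vectors[vector_id] = (embedding, {"student_id": student_id, "text": text})
        self._by_student[student_id].add(vector_id)

    def delete_student(self, student_id):
        """Honor a deletion request: remove every vector tied to the student."""
        removed = self._by_student.pop(student_id, set())
        for vid in removed:
            del self._vectors[vid]
        return len(removed)

store = TaggedVectorStore()
store.add("v1", [0.1, 0.2], student_id="s-123", text="essay draft")
store.add("v2", [0.3, 0.4], student_id="s-123", text="chat turn")
store.add("v3", [0.5, 0.6], student_id="s-456", text="quiz answer")
print(store.delete_student("s-123"))  # 2
print(len(store._vectors))            # 1
```

A vendor whose storage lacks this kind of per-student index cannot credibly fulfill a deletion request, which is exactly what the due-diligence question above is probing for.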

4. Inference and profiling

AI tools can infer sensitive information about students that was never directly provided — learning disabilities, emotional states, family circumstances, political or religious views. These inferences, if stored, may themselves become education records.

What to do: Define in your governance framework what types of AI-generated inferences are permissible and ensure that sensitive inferences are not stored, disclosed, or used for automated decision-making about students.
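One way such a governance rule can be enforced in software is a deny-by-default filter that runs before any inference is persisted. This is a sketch under stated assumptions; the category names and record shape are hypothetical, not a standard taxonomy:

```python
# Hypothetical governance rule: which AI-generated inference categories
# may be stored. Everything not explicitly permitted is dropped.
PERMITTED_INFERENCES = {"reading_level", "topic_mastery"}

def filter_inferences(inferences):
    """Drop sensitive or unknown inferences before anything is written
    to storage (deny-by-default)."""
    kept, dropped = [], []
    for inf in inferences:
        if inf["category"] in PERMITTED_INFERENCES:
            kept.append(inf)
        else:
            dropped.append(inf["category"])  # record the category, never the value
    return kept, dropped

kept, dropped = filter_inferences([
    {"category": "reading_level", "value": "grade 7"},
    {"category": "emotional_state", "value": "distressed"},
])
print([i["category"] for i in kept])  # ['reading_level']
print(dropped)                        # ['emotional_state']
```

The deny-by-default choice matters: new inference types an AI model starts producing are excluded until the district explicitly reviews and permits them.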

Real scenario

A student writes a personal essay using an AI writing assistant. The AI tool stores the essay, generates a profile of the student's writing level, and flags potential emotional concerns based on the content. All of this data is now likely an education record under FERPA — and the district is responsible for how it is handled.

Data Processing Agreements for AI Tools

A data processing agreement (DPA) is the legal document that governs how a vendor handles student data. For AI tools, standard DPA templates may not cover the unique risks. Here is what to include:

Essential DPA provisions for AI

  1. Scope of access: exactly what student data the tool will access and how it will be processed
  2. Purpose limitation: a prohibition on any non-educational use, including AI model training and product improvement
  3. Retention and deletion: timelines, automatic deletion, and what deletion means at the technical level
  4. Security: required safeguards and incident response procedures
  5. Re-disclosure: restrictions on sharing with third parties, including a complete list of sub-processors
  6. Audit rights: the district's ability to verify all of the above

COPPA Considerations for AI in K-12

The Children's Online Privacy Protection Act (COPPA) adds another layer of requirements when AI tools are used with students under 13.

School consent under COPPA

The FTC allows schools to consent on behalf of parents for the collection of children's personal information, but only when:

  1. The information is collected solely for the use and benefit of the school
  2. The data serves an educational purpose and no other commercial purpose
  3. The school has been given notice of the operator's collection, use, and disclosure practices

If an AI tool collects personal information from students under 13 and uses it for advertising, profiling, or any non-educational purpose, separate verifiable parental consent is required — the school cannot consent to commercial use on parents' behalf.

AI-specific COPPA issues

  1. Model training as commercial use. Using data from students under 13 to train a commercial model is a non-educational purpose that school consent cannot cover.
  2. Profiling. AI-generated behavioral or emotional profiles of young students go beyond the educational purpose for which schools can consent.
  3. Persistent conversation data. Chat histories contain personal information under COPPA and need the same retention and deletion discipline as any other collected data.

Evaluating AI Vendors for Compliance

When your district evaluates an AI tool, these are the questions to ask — and the answers to look for:

Questions for AI vendors

  1. Where is student data stored and processed? Look for: specific data center locations, confirmation that data stays within the US (if required by state law), and details about cloud infrastructure.
  2. Is student data used to train your AI models? Look for: explicit "no" with contractual commitment, not just a policy statement.
  3. What happens to student data when a conversation or session ends? Look for: clear retention timelines and automatic deletion processes.
  4. Who are your sub-processors? Look for: a complete list including model providers, cloud services, and any third-party services that touch student data.
  5. Can you provide a completed Student Data Privacy Consortium (SDPC) agreement? Look for: willingness to sign standardized state DPA templates.
  6. What security certifications do you hold? Look for: SOC 2 Type II at minimum, with specific attention to how AI-specific risks are addressed.
  7. How do you handle data subject access and deletion requests? Look for: documented processes with specific timelines.
  8. Can the district control what data the AI tool can access? Look for: granular data access controls, not all-or-nothing access.
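The "granular, not all-or-nothing" idea in question 8 can be sketched as a per-context allow-list over student record fields. The context and field names here are illustrative assumptions, not any vendor's actual schema:

```python
# Illustrative per-context data-access policy. "Granular" means each
# classroom context grants the AI only the fields it needs, rather than
# blanket access to the full student record.
ACCESS_POLICIES = {
    "writing_feedback": {"allow": {"assignment_text", "grade_level"}},
    "tutoring_chat":    {"allow": {"course_enrollment", "grade_level"}},
}

def allowed_fields(context, requested):
    """Return only the requested record fields the context permits;
    unknown contexts get nothing (deny-by-default)."""
    policy = ACCESS_POLICIES.get(context, {"allow": set()})
    return set(requested) & policy["allow"]

print(allowed_fields("writing_feedback", {"assignment_text", "iep_status", "home_address"}))
# {'assignment_text'}
```

A vendor with genuinely granular controls should be able to show the district an equivalent of this policy table and let the district edit it, rather than offering a single on/off switch for student data.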
How Beni handles this

Beni is built for FERPA compliance from the ground up. Student data is never used for model training. All data processing happens under district-level DPAs. Policy Cards give districts granular control over what data AI can access in each context. Read our full Trust Center documentation or apply to become a Founding Partner.

FERPA Compliance Checklist for AI Tools

Use this checklist when evaluating or deploying any AI tool in your district:

  1. A district-level DPA is signed before any student use
  2. The DPA explicitly prohibits use of student data for model training or product improvement
  3. Data retention timelines and deletion processes are documented and verified at the technical level
  4. All sub-processors that touch student data are identified
  5. Teachers know the approval process and which tools are approved for student use
  6. AI-generated inferences about students are governed, and sensitive inferences are not stored or used for automated decisions
  7. The district can fulfill parental access and deletion requests for data held by the vendor

Frequently Asked Questions

Does FERPA apply to AI tools used in schools?

Yes. When an AI tool accesses, processes, or stores student education records, FERPA applies. This includes AI tools that use student names, grades, assignments, behavioral data, or any other personally identifiable information from education records. The district remains responsible for FERPA compliance regardless of which vendor provides the AI tool.
Can teachers use free consumer AI tools with students?

Generally no, if student data will be involved. Individual teachers typically do not have the authority to enter into data sharing agreements on the district's behalf. When a teacher creates an account on a consumer AI tool and has students use it, the district may be creating an unauthorized disclosure of education records under FERPA. Districts should have a clear approval process for AI tools.
What should a data processing agreement for an AI tool include?

A DPA for AI tools should include: what student data the tool will access, how that data will be used and processed, data retention and deletion policies, security measures and incident response procedures, prohibition on using student data for non-educational purposes (including AI model training), restrictions on re-disclosure to third parties, and audit rights for the district.
Can AI vendors use student data to train their models?

Under FERPA, student data disclosed to a vendor under the school official exception can only be used for the purpose for which it was disclosed — providing an educational service. Using student data to train AI models for general commercial purposes would typically violate FERPA. Districts should explicitly prohibit this in their data processing agreements.
How does COPPA apply to AI tools in schools?

COPPA applies when online services collect personal information from children under 13. In school settings, teachers can provide consent on behalf of parents, but only for educational purposes. If an AI tool collects data from students under 13 for non-educational purposes (advertising, profiling, or commercial use), separate parental consent is required under COPPA regardless of the FERPA school official exception.

Built for FERPA Compliance From Day One

Beni's platform gives districts the technical controls to enforce data privacy policies at the classroom level. No student data is used for model training. Ever.

Apply to Learn More