
AI Acceptable Use Policy Template for K-12 Districts

A ready-to-customize AI acceptable use policy covering student use, staff use, data privacy, academic integrity, and governance. Built for district administrators who need a starting point that meets state requirements and federal compliance standards.

Free Resource
Download the Full Template

Get an editable version you can customize for your district. Includes board resolution language and parent communication templates.

Get the Template

Multiple states now require school districts to adopt formal AI acceptable use policies. Ohio's deadline is July 1, 2026. Idaho's comprehensive SB 1227 requires aligned district policies. Whether your state has mandated it or not, a clear AI AUP protects your district, your teachers, and your students. Below is a complete template you can read, customize, and adopt. [Bracketed text] indicates where you insert your district's specific information.

For context on the federal compliance requirements this template addresses, read our FERPA & AI Compliance Guide. For the full governance framework, see our AI Governance in K-12 Guide. To track which states require these policies, visit the State AI Policy Tracker.

[District Name] Artificial Intelligence Acceptable Use Policy
Template v1.0 | April 2026

Section 1: Purpose and Scope

1.1 Purpose

This policy establishes guidelines for the responsible use of artificial intelligence (AI) tools and systems within [District Name]. It applies to all AI-powered technologies used in educational settings, including but not limited to generative AI, machine learning systems, automated decision-making tools, and AI-enhanced educational software.

1.2 Scope

This policy applies to:

  • All students enrolled in [District Name]
  • All certified and classified staff, contractors, and volunteers
  • All district-owned devices and networks, as well as personal devices used for educational purposes on district networks
  • All AI tools used in connection with district educational activities, whether provided by the district or accessed independently

1.3 Definitions

Artificial Intelligence (AI): Technology systems that can perform tasks typically requiring human intelligence, including generating text, images, code, or other content; analyzing data; making recommendations; or engaging in natural language interaction.

Approved AI Tools: AI tools that have been vetted by the district's technology review process, have a signed data processing agreement, and are authorized for use with students.

Generative AI: AI systems that can create new content (text, images, audio, video, code) based on prompts or inputs from users.

Section 2: Approved AI Tools and Authorization

2.1 Authorization Required

Only AI tools that have been approved through the district's technology review process may be used with students or to process student data. Individual teachers, staff, or students may not independently adopt AI tools for educational use without prior authorization from [Technology Director / CTO / designated role].

2.2 Approval Process

Before any AI tool is authorized for district use, it must:

  1. Complete the district's technology review application
  2. Undergo a data privacy assessment, including review of the vendor's privacy policy, data practices, and security measures
  3. Have a signed Data Processing Agreement (DPA) that meets [state] requirements and includes provisions specific to AI (prohibition on model training using student data, sub-processor disclosure, data deletion procedures)
  4. Be added to the district's approved technology list with documentation of permitted use cases

2.3 Prohibited Tools

The following categories of AI tools are prohibited for use with students or student data:

  • Consumer AI tools that do not have a signed DPA with the district (e.g., free-tier consumer AI chatbots)
  • AI tools that use student data for model training, advertising, or profiling
  • AI tools that do not provide adequate content filtering for K-12 settings
  • AI tools that cannot comply with [state] student data privacy laws and FERPA requirements

Section 3: Student Use of AI

3.1 General Guidelines

Students may use approved AI tools for educational purposes under the direction of their teacher. Students should understand that:

  • AI tools are learning aids, not substitutes for their own thinking, analysis, and creativity
  • AI-generated content may contain errors, biases, or fabricated information and must be verified
  • AI tools should be used in ways that enhance learning, not circumvent the learning process
  • All district conduct policies apply when using AI tools

3.2 Academic Integrity

The use of AI tools in academic work must be transparent and consistent with the teacher's instructions for each assignment:

  • Students must follow their teacher's specific instructions regarding whether and how AI may be used for each assignment
  • When AI use is permitted, students must disclose how AI was used in their work (e.g., "I used [tool] to help brainstorm ideas" or "I used [tool] to check my grammar")
  • Submitting AI-generated content as one's own original work, without disclosure and without the teacher's authorization, is a violation of the district's academic integrity policy
  • Teachers will communicate AI use expectations clearly for each assignment

3.3 Age-Appropriate Use

AI tool access will be configured based on grade level and developmental appropriateness:

  • Grades K-5: AI tools are used only under direct teacher supervision with age-appropriate content filters and limited functionality
  • Grades 6-8: Guided AI use with teacher-configured parameters. Students receive instruction on AI literacy, including how AI works, its limitations, and responsible use
  • Grades 9-12: Broader AI access with appropriate guardrails. Students are expected to demonstrate critical evaluation of AI outputs and responsible use practices

Section 4: Staff Use of AI

4.1 Professional Use

Staff may use approved AI tools to support instruction, planning, and administrative tasks. Examples of permitted professional use include:

  • Lesson planning and instructional material development
  • Generating practice problems, rubrics, or differentiated materials
  • Administrative tasks such as drafting communications, summarizing meeting notes, or organizing data (using approved tools only)
  • Professional development and learning about AI capabilities and limitations

4.2 Prohibited Staff Use

Staff may not engage in the following:
  • Entering student personally identifiable information (PII) into non-approved AI tools
  • Using AI to make consequential decisions about students (grades, placement, disciplinary actions) without human review and professional judgment
  • Creating student-facing AI accounts using personal (non-district) credentials
  • Relying solely on AI-generated content for formal student evaluations or IEP documentation

Section 5: Data Privacy and Security

5.1 Student Data Protection

All AI tools used in the district must comply with:

  • Family Educational Rights and Privacy Act (FERPA)
  • Children's Online Privacy Protection Act (COPPA) for tools used with students under 13
  • [State student data privacy law, e.g., Ohio Student Data Privacy Act, California SOPIPA, etc.]
  • Section 504 of the Rehabilitation Act and IDEA accessibility requirements

5.2 Data Processing Requirements

All AI vendors must:

  • Process student data solely for the contracted educational purpose
  • Not use student data to train AI models, for product improvement, or for any commercial purpose
  • Provide a complete list of sub-processors and notify the district before changes
  • Delete student data upon contract termination or district request within [30/60/90] days
  • Encrypt student data in transit and at rest
  • Maintain security certifications as required by the district

5.3 Incident Response

In the event of a data breach or unauthorized access involving an AI tool, the district will follow the established Data Breach Response Plan. AI vendors are required to notify the district of any breach within [24/48/72] hours of discovery.

Section 6: AI Governance Structure

6.1 AI Governance Committee

The district will establish an AI Governance Committee consisting of:

  • The [Technology Director / CTO] (chair)
  • A curriculum and instruction representative
  • A building-level administrator
  • A classroom teacher representative
  • A special education representative
  • A parent or community representative
  • The district's data privacy officer (or designee)

6.2 Committee Responsibilities

The AI Governance Committee will:

  • Review and approve AI tools for district use
  • Review this policy at least annually and recommend updates
  • Monitor AI tool usage data and compliance reports
  • Address AI-related concerns raised by staff, students, or parents
  • Stay informed about changes in state and federal AI regulations
  • Report to the Board of Education on AI governance status at least annually

6.3 Review Cycle

This policy will be reviewed by the AI Governance Committee at least [annually / semi-annually / quarterly]. Given the pace of AI development, interim updates may be issued as needed. All updates require Board approval before taking effect.

Section 7: Training and Professional Development

The district will provide:

  • Annual training for all staff on this policy and approved AI tools
  • AI literacy instruction for students as part of the [digital citizenship / computer science / technology] curriculum
  • Ongoing professional development opportunities for teachers on effective and responsible AI integration
  • Communication to parents about the district's approach to AI, including which tools are used and how student data is protected

Section 8: Compliance and Enforcement

8.1 Student Violations

Violations of this policy by students will be addressed through the existing student code of conduct. Consequences will be proportional and may include loss of AI tool access, academic integrity consequences, or other disciplinary measures as outlined in the student handbook.

8.2 Staff Violations

Violations of this policy by staff will be addressed through existing employee conduct procedures. Staff who use non-approved AI tools with student data or who bypass established governance processes may face disciplinary action.

Section 9: Effective Date and Adoption

This policy was adopted by the [District Name] Board of Education on [Date] and is effective as of [Date].

Supersedes: [Previous policy number/name, if applicable, or "N/A - new policy"]

How to Use This Template

Step 01

Customize the Variables

Replace all [bracketed text] with your district's specific information: district name, state law references, roles, timelines, and review cycles.

Step 02

Check State Requirements

Use the State AI Policy Tracker to verify your state's specific requirements and ensure all mandated elements are included.

Step 03

Legal Review

Have your district's legal counsel review the customized policy before Board adoption. This template is a starting point, not legal advice.

Step 04

Board Adoption

Present the policy to your Board of Education for formal adoption. The policy should be published and communicated to all staff, students, and parents.

A Policy Is Just the Beginning

Beni turns your AI policy into enforceable classroom controls. Policy Cards, Teacher Controls, and compliance reporting make governance operational, not aspirational.

Apply to Learn More