Office for Civil Rights Issues Guidance to Ensure Artificial Intelligence is Used in Nondiscriminatory Manner
The Office for Civil Rights released guidance to ensure artificial intelligence (AI) is “used in a nondiscriminatory manner in the nation’s elementary and secondary schools and institutions of higher education consistent with federal civil rights laws.”
The guidance includes examples of conduct that could constitute discrimination. In the disability discrimination section, the primary examples involve actions taken by educational agencies that could constitute discrimination. The examples do not address students using AI as an accommodation, or students being penalized by schools that refuse to allow AI to be used as an accommodation.
Examples of Educational Agency AI-Related Conduct
The examples in the disability discrimination section of OCR's guidance document include the following:
"Example 13: An AI test proctoring software uses facial recognition technology and eye movement tracking to monitor students for behavior that indicates they might be cheating during exams. A student with a disability receives a failing grade for an exam after the software flags her behavior as suspicious and her professor accuses her of cheating. The student appeals the grade because her vision impairment causes eye movements that the software falsely flagged as suspicious. The student also requests that she receive an academic adjustment so that she does not have to take tests using this particular proctoring software. The university threatens to expel the student if she is flagged for the same behavior again and does not respond to the student’s request. OCR would have reason to open an investigation based on this complaint. Based on the facts, as alleged, the student may not have been provided with necessary academic adjustments."
"Example 14: A deaf student enrolls in a university’s engineering program. The university provides the student with an AI-aided closed circuit captioning transcription application for class lectures. The student repeatedly informs the university with specific detail that the AI service provided to transcribe the lectures does not accurately capture the advanced engineering terminology essential to her education program. The university does not provide the student alternative auxiliary aids and services to access the class lectures. OCR would have reason to open an investigation based on this complaint. Based on the facts, as alleged, the student may not have been provided with necessary academic adjustments."
"Example 15: A school staff member utilizes an AI-driven adaptive assessment to determine admission to the school’s gifted program. The assessment’s questions get more difficult if a student answers quickly and correctly, and students’ scores on the tests are determined, in part, by how quickly students answer difficult questions. During the assessment, a student with attention-deficit/hyperactivity disorder (ADHD) is distracted and takes longer to answer questions, causing the assessment to produce easier questions, and thus reflects his disability rather than his actual aptitude or achievement. Although the student has an Individualized Education Program (IEP) with test taking accommodations including increased time, the teacher does not believe the IEP applies to the gifted admissions assessment. The student is deemed ineligible for the gifted program. The student’s parent files a complaint alleging that the student’s final score is lower than it would have been if the teacher had not used the AI-driven adaptive assessment and if the student had been provided with his test taking accommodations. OCR would have reason to open this complaint for investigation. Based on the facts, as alleged, the student may have experienced disability discrimination in the admissions process and/or not have been provided with FAPE."
"Example 16: A school district allows schools in the district to use a generative AI tool to write Section 504 Plans for students with disabilities. The school district does not have any policies regarding how to use the tool or how to ensure that the group of knowledgeable people responsible for evaluating a student review what the AI produces to determine whether it meets the individual needs of each student. One school begins using the tool to create Section 504 Plans for all students with diabetes. School staff do not review or modify the generated Section 504 Plans and begin implementing them, and they inform parents that they believe AI tools make more effective choices than people. A local group of parents of students with diabetes at that school files a complaint with the school district stating that their students’ Section 504 Plans’ provisions look almost identical and, in some cases, do not match the specific needs of their children. The school district states that they defer to the school’s decision on how to utilize AI tools and does not investigate further. OCR would have reason to open this complaint for investigation. Based on the facts, as alleged, students may not have been provided with FAPE because their 504 plans may not have been designed to meet their individual educational needs."
"Example 17: A middle school uses content moderation software to alert the school if any language that violates the student code of conduct is used on school-issued devices. A student with obsessive-compulsive disorder has a compulsion to say certain swear words if certain inciting incidents occur, and using some of the words would otherwise violate the student code of conduct. The student’s Section 504 Plan addresses the techniques the student and staff are using to work on this compulsion as well as what happens when the student uses the identified words. The student uses his school-issued device to chat with a peer and uses some of the words identified in his Section 504 Plan that would otherwise violate the student code of conduct. The content moderation software flags this language and alerts school officials that problematic language has been detected on his school-issued device. Based only on the flag from the content moderation software, the principal immediately punishes the student without consulting or following the procedures in the 504 Plan. The parent of the student files a complaint against the school stating that the school’s disciplinary response denied the student FAPE. OCR would have reason to open this complaint for investigation. Based on the facts, as alleged, the school did not follow the student’s 504 Plan and, and as a result, the student may not have been provided with FAPE."
"Example 18: An elementary school teacher uses an AI enabled application, which monitors noise and provides feedback to assist her in managing classroom noise. The application uses the class computer’s built-in microphone to detect when students’ voices are raised, and it displays a color meter ranging from green (quiet classroom) to red (loud classroom). The application tracks patterns for class noise and predicts the times of day that the teacher is likely to have more difficulty managing classroom noise. During the times of day that the application predicts will be the loudest, the teacher keeps the color meter projected for all the students to see and offers a pizza party if they only have a few instances where the meter reaches red. A student who is hard of hearing reports that they are being bullied by a few classmates because they believe the student’s speaking voice consistently causes the meter to be red and the class does not receive a pizza party as a result. The school does not respond to the student’s report, and the teacher tells the student that they need to learn to speak more quietly. OCR would have reason to open this complaint for investigation. Based on the facts, as alleged, the student may have experienced prohibited harassment about which the school knew and failed to appropriately respond."
"Example 19: A school district purchases an AI-driven application to streamline the universal screening process for speech and language disorders. The school district decides to only utilize the AI-driven application and not employ or seek opinions from Speech Language Pathologists, or other appropriate professionals. The application falsely flags students who are ELs as students with a speech disorder. The district refuses to evaluate students for speech and language disabilities, if a student was not identified through the universal screening, unless the student’s parent obtains and submits a private diagnosis. The application also misses several students who have speech and language disorders. Although parents complain to the school about the inaccuracy of the AI-driven determinations, the school does not change its process. OCR would have reason to open this complaint for investigation. Based on the facts, as alleged, students who are ELs may not be able to equally and meaningfully participate in the standard instructional program. OCR would also investigate whether the school district’s use of the AI driven application is erroneously impacting its obligations to identify and locate qualified students with disabilities, to evaluate them, and provide FAPE. Example 20: A school district starts using an internal software program to generate student Individualized Education Programs (IEPs). The software is trained on past IEPs to recommend appropriate placements for current students. The software inputs include all available demographic information about students, including race. Historically, more Black students with disabilities in the district had IEPs that included more hours of special education instruction in a separate setting and educational placements that were more restrictive than other students. 
Most of the IEPs for Black students that are generated by the software recommend more special education services in separate settings and would result in placements in restrictive educational environments, but the IEPs that the software generates for white students with similar disability-related needs recommends more integrated instruction and would result placements in less restrictive educational environments. A special education teacher complains to the principal that the software is drafting inappropriate IEPs for many of the Black students with disabilities, but the principal says the software is just doing its job and they do not have the resources to review every IEP to ensure that it is appropriate for each student. OCR would have reason to open this complaint for investigation. Based on the facts, as alleged, the school may be treating Black students differently than similarly situated white students, and students with disabilities may not be receiving FAPE. Example 21: A high school purchases AI software that electronically tracks how often students sign out for hall passes to estimate students’ mental well-being. The software flags students who sign out an electronic hall pass more than three times a day as a factor related to mental well-being. If the algorithm determines that a student has a low mental well-being score, the student is pulled out of class to have mandatory meetings with the school counselor. During the school’s first week using the software, the algorithm gives low mental well-being scores to students who are menstruating, pregnant students, and students with gastrological disabilities, since those students needed to use the bathroom more than their peers. 
The students tell the principal that they would like to return to their classes and that they do not believe meetings with the counselor are needed or helpful, but the principal indicates that whether the meetings are needed or not, he cannot ignore a potential issue with a student’s mental well-being flagged by the software. As a result, the students miss substantial learning time because of the mandatory meetings with the school counselor. OCR would have reason to open this complaint for investigation. Based on the facts, as alleged, the school may be treating students who are menstruating or pregnant differently than students who are not menstruating or pregnant. The school may also be failing to implement provisions or discouraging students from taking advantage of necessary provisions, such as unlimited bathroom breaks, in the Section 504 Plan of a student with a gastrological disability."
Accommodations Using Artificial Intelligence
The use of AI is inseparable from the use of technology. If a student requires AI-related accommodations, are schools required to provide applications that would address the needs of the student? If so, are schools limited to companies with which they already have contracts, or must they make an application available whenever a student needs it?
Imagine a student with a visual impairment who also struggles with reading, recall, or comprehension. An application such as Seeing AI could help the student access instruction by reading printed work aloud, describing the student’s environment, reading signs around the school, or helping in other ways. Is the school required to provide access to such an app and teach the student how to use it?
What about students struggling with comprehension? Will the student be allowed to use an application like ChatGPT to rephrase directions or difficult sections of text? If so, will the school be required to provide the app and teach the student how to use it? If the school provides computers to students, will it have to change its security settings to allow ChatGPT on the student’s device?
For a student who struggles with writing, will the student be allowed to put sections of text into Grammarly to have it checked and to receive suggestions for different words, grammar, or phrasing? If the point of the assignment is to show comprehension, as with a book report, is presenting a paper that demonstrates comprehension enough, or is the teacher actually grading the use of commas, spelling, and vocabulary, too?
Another example is a student who struggles with anxiety and school refusal, processing, or organization. Imagine the student is assigned a science project. Now imagine the student freezes and cannot come up with a project or figure out how to complete one. If the student uses ChatGPT to generate ideas and steps for doing the project, will the student be charged with cheating? Ultimately, what is being graded: the student’s ability to be creative, or the student’s ability to actually work through the steps of a science project and learn from it?
For the student who struggles with organization, will he be allowed to use AI to create a schedule of the work he needs to do? If he has a goal in his IEP addressing organization, will the goal include the use of AI to help him organize himself? Is the point of the goal for him to become organized, or is it to learn strategies and use tools that will help him organize his work, schedule, and so on? If the school uses graphic organizers to help students organize writing assignments, does it need to provide access to AI, too?
In addition, who will teach students how to use AI? If a student is allowed to use ChatGPT, for example, who will be responsible for teaching the student to use it effectively and appropriately?