Artificial intelligence (AI) has been shown to be a useful tool in special education for both teachers and students. While concerns have emerged about ethics, student privacy, and how AI could affect critical thinking, it is commonly accepted that the benefits outweigh the concerns. Even so, it is important that everyone in education understand the possible disadvantages of using AI and plan ways to avoid them.
Some of the potential problems with AI in special education are well known, such as those related to student learning (data privacy, bias, hallucinations, etc.). Others, such as its effects on teacher agency, how people perceive students with disabilities, and legal issues, are not as well understood.
This article focuses on legal issues, but here is a quick look at the other two lesser-known impact areas.
Potential problem of AI: Decreased teacher agency
Description: As AI takes on tasks previously handled by special education teachers, such as data analysis and program planning, the teacher's role shifts from expert to collaborator. This potential loss of professional identity can lead to decreased agency, burnout, and attrition.
Solution: Schools need to support teachers as they navigate changing professional roles. AI should be used to help teachers, not to replace them.

Potential problem of AI: Negative perceptions of students with disabilities
Description: If AI is presented as the magic solution for overworked teachers struggling to support students with special needs in inclusive classrooms, it frames students as the problem and teachers as victims of the system (Rice & Dunn, 2023; Mintz et al., 2023).
Solution: Pay attention to the language used to frame the use of AI in special education. Use language that positions AI as a catalyst for systemic change to support all students in an inclusive environment.
Legal issues with AI in special education
These issues relate to teacher liability and transparency about how AI is used. Addressing these aspects of AI will help teachers know what to do and help schools know how to support them.
Teacher liability
Teachers must use their professional judgment when making decisions. Simply using AI-generated documents (such as IEPs or behavior plans) or AI recommendations without careful consideration means they are not exercising their professional discretion. This matters because professional discretion is a key factor in protecting teachers from liability (Jones, S., 2025, Legal pitfalls of AI in assistive technology: What administrators need to know [webinar]. Infinitec).
It is also important to remember that school districts are responsible for the accuracy of all student programming documents, even if AI was used to create them. This is not a new idea: best-practice guidelines have long emphasized that AI should be a tool that supports teachers' professional judgment, not a replacement for it (Mintz et al., 2023).
What teachers can do:
Follow best practices and use AI to support your judgment, not to replace it.
Do not use AI to create an IEP or behavior plan without making sure it fits the needs of each individual student.
Do not add an AI-suggested accommodation to an IEP without trying it first and collecting data to see whether it works for the student.
Be able to explain the specific parts of each student's programming and documentation. This demonstrates professional reasoning, which cannot be replaced by AI tools.
Be transparent
Many schools have general rules about the use of AI. Having specific rules about using AI for student programming creates transparency and can ease concerns that students are not receiving individualized support. Educational documents such as IEPs or behavior support plans must be customized for each student, and using AI to create these documents can make families feel that their children's education is not tailored to them.
Create statements about how the school uses AI in student programming, beyond the district's general guidelines.
These statements should explain which AI programs are approved and describe the expectation that staff use AI to enhance, rather than replace, professional judgment. This serves as notification to families about the use of AI and as a reminder to staff of their professional responsibilities when using AI. Sample language might include:
– The school/district has approved the following AI programs for potential use in IEP development, lesson plan development, etc.
– It is the expectation of the school/district that any staff member who uses AI to support student programming will not use it to replace their own decision-making or professional judgment; AI will be used as a complement to the professional skills of district employees.
Add these statements to student handbooks, registration information, and the school's general policies.
AI can be a powerful tool to support students and educators in special education. It is up to us as educators and administrators to be attentive and proactive in addressing the impacts, both positive and negative, that can arise as this tool is introduced into the educational field.
NOTE: Article written by a human in the loop, with AI used for partial edits.