For parents of children with disabilities, we all know, oh so painfully, that individualized education programs (IEPs) can have “good” or “bad” goals. There are so many ways a goal can be bad: It stays the same from year to year, it focuses on behavior rather than skill development, it isn’t aligned to grade-level learning, or it treats our children as robots that must comply. What is it with compliance-based goals?
And so, historically we asked ourselves two questions: 1. What makes a good IEP goal? 2. How can we write a good IEP goal?
Now, in the days and weeks before the singularity, we asked ourselves a different question: Can artificial intelligence write a good IEP goal?
Yes. And no.
AI can, in fact, write mediocre IEP goals with ease. However, just like IEP teams and special education teachers, the online oracle must first have data to generate an almost-acceptable individualized goal. It needs context. It also needs guidance, which means that the goal will be crafted through an iterative prompting strategy.
IEP goals must be based on a child’s present levels of academic and functional performance (PLAFP). To understand the PLAFP, AI needs to know a few things.
Let’s get to it!
This first prompt requires a bit of research, but trust me, it is worth the effort.
Act as a (grade of student) grade special education teacher with a master’s degree and 15 years of experience. Your task is to craft an IEP goal for a (grade) grade student for (standard). The student took several assessments and qualified for special education due to (disability).
The IEP evaluation noted the following scores on the (name) assessment: (insert scores). In (season), the student took the (state/district assessment) and scored (score). (Include any additional information, such as deficits identified by teachers/parents/assessments.) The student struggles with (challenge areas). As a good teacher, you know that IEP goals should be specific, measurable, achievable, relevant, and time-bound.
Additionally, good IEP goals are skills-based and so you refrain from writing compliance-based goals. You also want to make sure that progress can be monitored regularly, so the goal has to be flexible enough to create a custom data collection sheet. Do not complete your task. Instead, ask me for more information and clarifying questions, and list your assumptions. I will respond to each.
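For readers comfortable with a little Python, here is one way to keep this prompt reusable: store it as a template with named blanks and fill them in before pasting the result into your AI chat of choice. This is a minimal sketch, not part of the column's method, and every sample value below (the grade, standard, assessment, and scores) is purely illustrative, not a real student's data.

```python
# Sketch: fill the column's IEP-goal prompt template with student details.
# All field names and sample values are illustrative placeholders.

PROMPT_TEMPLATE = (
    "Act as a {grade} grade special education teacher with a master's "
    "degree and 15 years of experience. Your task is to craft an IEP goal "
    "for a {grade} grade student for {standard}. The student took several "
    "assessments and qualified for special education due to {disability}. "
    "The IEP evaluation noted the following scores on the {assessment} "
    "assessment: {scores}. The student struggles with {challenges}. "
    "As a good teacher, you know that IEP goals should be specific, "
    "measurable, achievable, relevant, and time-bound, and skills-based "
    "rather than compliance-based. Do not complete your task. Instead, "
    "ask me clarifying questions and list your assumptions. I will "
    "respond to each."
)

def build_prompt(**details: str) -> str:
    """Return the prompt with every {placeholder} filled in."""
    return PROMPT_TEMPLATE.format(**details)

# Example use with made-up details (hypothetical student):
prompt = build_prompt(
    grade="4th",
    standard="a grade-level reading fluency standard",
    disability="a specific learning disability in reading",
    assessment="(name)",
    scores="(insert scores)",
    challenges="decoding multisyllabic words",
)
print(prompt)
```

The payoff of templating is consistency: every time you draft a goal for a new student or a new standard, you change only the filled-in details, while the instructions that steer the AI away from compliance-based goals stay intact.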
The AI should have responded with a list of questions for you to answer and assumptions it is making. As I tested this prompt (and revised!), the AI asked me about the student’s current levels of performance, accommodations, supports, scaffolds and parent input. Respond to each, and behold the new goal.
Is the new goal any good? That, friend, is up to you and the IEP team. I exhausted myself and the digital demigod in crafting this prompt. Every bit of context improved the goal, but at the end of the day, the goals it generated were merely acceptable. While they were potentially better than the average goal produced by IEP teams, to be frank, the goals were not awesome. They were, however, a great starting point for IEP team consideration and discussion.
And that, dear readers, is why AI usage is always H-AI-H — human, artificial intelligence, human. The human has an idea, uses AI as a productivity tool, and then revises the output. Without the IEP team’s knowledge of the child, the learning environment, and the curriculum, AI-generated goals fall flat. We need that human revision.
Assessments and IEPs and progress reports, oh my!
This column is written by Shannon Sankstone, an Olympia-based special education advocate and the owner of Advocacy Unlocked. She may be reached at ShannonSankstone@theJOLTnews.com.