ISAT Tests: Worth It?
Worth the Time?
Worth the Money?
by Dawn Earl, Education Policy Analyst, IFI
Chances are, if you have children in public school, one or more of them has recently emerged from ISAT testing.
The tests, according to a report prepared by a former state superintendent of schools, are supposed to "measure individual student achievement relative to the Illinois Learning Standards." His explanation optimistically continues, "The results will be used by parents, teachers, and schools as one measure of student learning. The state uses the results to report student achievement to the public."
Not only are millions of dollars expended to produce, distribute and score the ISATs, but teachers and students spend precious instructional time preparing for and taking these tests.
Is all this effort and expense paying a dividend?
According to the Illinois State Board of Education's own research, the answer is a resounding NO!
In the Year Three Report of the Evaluation of the Implementation of Illinois Learning Standards (released in August 2001), the summary of findings states, "At this time, no significant, statistical relationship can be detected between changes in ISAT performance and changes in the Illinois Learning Standards."
"Although we have yet to detect a significant statistical relationship between ILS implementation and changes in ISAT scores and may never be able to do so," the report continues, "anecdotal information and teachers' perceptions suggest that such a relationship does exist and is growing stronger" (emphasis added).
Translated into plain English, this report tells us that no data can be found to indicate that the ISAT does indeed measure student achievement relative to the Illinois Learning Standards. The board's data says one thing, but its actions send a different message: "Trust us. We don't need data to prove we are right. We need millions of dollars funneled into this [futile] effort because we have a hunch." Taxpayers should recognize voodoo accountability when they see it.
The question that must be asked is: Why is the state board persisting with the ISAT if it is not serving its designed purpose?
Are the results of the ISAT able to measure student learning as claimed?
Hardly. In 1999, then-state superintendent Glenn "Max" McGee stopped reporting test averages and started reporting only percentages of students in each of four performance categories: exceeds standards, meets standards, below standards, and academic warning.
Once again we turn to lay terminology to explain what this means: Suppose I tell you that 8 out of every 10 students met or exceeded state standards. From this information, can you tell me what standard they had to meet? Can you tell me what their average score was?
This is exactly how the state is reporting ISAT results. Parents are lulled into complacency by seeing the rows of numbers (percentages) in neat columns published in newspapers across the state. The data looks so good, so impressive. But the reporting of percentages of students is meaningless without knowing what the students actually scored and where the state standard was set.
Are the extraordinary efforts and massive tax dollars producing something of value?
After much study, I, for one, would argue NO. Testing is an important educational tool that can be used to improve student achievement by giving teachers important feedback. Numerous reliable and reputable achievement tests, including the Stanford and the Iowa Test of Basic Skills, would provide this kind of important student learning data. But the educratic testing monstrosity of the ISATs will not set the course for improved education for the students in Illinois public schools.
Now is the time to pull the plug on the ISATs and replace them with one of the proven nationally normed tests that will allow for true accountability and improved education in Illinois public schools.