An Analysis of CCS Results
Recently, social media has been flooded with claims that Carmel Clay Schools (CCS) is failing and seeing a “significant decline” in academic results. These claims are often accompanied by a graph like the one below showing a massive drop in ELA and math proficiency at CCS. While the graph uses actual data from the Indiana Department of Education (IDE), it is misleading at best. This analysis presents that same information alongside several additional data points, so the Carmel community can understand the school system's results in full context rather than through the narrative that politically driven individuals want to push.
The chart above would be better designed if it separated the two versions of the ISTEP exam and ILEARN. The 2015 results, which appear to show a significant drop, are actually the baseline scores for a new version of the ISTEP exam built on more rigorous standards. In fact, the IDE anticipated this drop and addressed it in its FAQs about the revised test:
“Any time tests change, there is generally an initial decrease in scores. Other states have seen significant drops in scores after switching to a new college and career ready test. This does not mean that students or schools are performing worse than in previous years. Instead, it simply means that the test is measuring something different than it had in previous years and a drop in scores is expected. In Indiana, we have never before measured if your child will be ready for college and career.”
With this in mind, it is misleading to draw a connection between the 2014 and 2015 results. Since the same issues can be expected with any new test, conclusions also cannot be drawn between the 2018 and 2019 results. The first-year scores for each new test should be treated as a baseline for that test, with overall improvement expected over time. Unfortunately, there were only three years of the updated ISTEP+ before the switch to ILEARN, so no definitive conclusions can be drawn from that small sample.
What we know about ILEARN is that it was interrupted by the COVID-19 pandemic, which forced schools worldwide into situations for which they were not prepared. According to the trend report from the National Assessment of Educational Progress (NAEP), scores for 9-year-old students in suburban areas fell an average of 8 points in reading and 9 points in math from 2020 to 2022. This is consistent with the scores reported for CCS, where reading dropped 8.2 points but math dropped only 6.3 points. These numbers show that CCS's ILEARN scores dropped in line with schools around the country in reading, while in math CCS declined less than the nationwide average.
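To make the comparison explicit, here is a minimal sketch (in Python) that restates the declines quoted above and reports how far CCS deviates from the suburban average in each subject. Only the four numbers cited in this paragraph are used; the code itself is purely illustrative.

```python
# Score declines from 2020 to 2022, in points, as cited above.
# NAEP figures are the suburban averages for 9-year-olds; CCS figures
# are the drops reported for Carmel Clay Schools on ILEARN.
national_suburban_drop = {"reading": 8.0, "math": 9.0}
ccs_drop = {"reading": 8.2, "math": 6.3}

for subject in ("reading", "math"):
    diff = ccs_drop[subject] - national_suburban_drop[subject]
    if diff > 0:
        note = f"{diff:.1f} points more than the suburban average"
    elif diff < 0:
        note = f"{-diff:.1f} points less than the suburban average"
    else:
        note = "the same as the suburban average"
    print(f"{subject}: CCS dropped {ccs_drop[subject]:.1f} points ({note}).")
```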
So far, we know that we cannot compare the results from the different tests to determine performance and that the post-pandemic declines in scores for CCS students have been at worst on par with the declines seen nationwide. The question then is, “what do these numbers mean?”
Many have been saying that these numbers represent the percentage of students who meet minimum proficiency standards in reading and math and are ready to move on to the next level. As researchers have established over the years, however, standardized testing is not a valid or reliable measure or predictor of student success and potential. Allensworth and Clark (2020) found that standardized test scores (such as the ACT) are poor predictors of student success compared to GPA. They found that a student's GPA and the school the student attends correlate much more strongly with success in post-secondary education than standardized test scores do. Dr. W. James Popham, Emeritus Professor in the Graduate School of Education at UCLA and former president of the American Educational Research Association, wrote an article for the Association for Supervision and Curriculum Development titled “Why Standardized Tests Don’t Measure Educational Quality.”
So if these tests aren’t really telling us the true quality of the education or the potential of the students, what good are they?
One answer is that these tests can show us how schools are doing comparatively, giving us a relative performance measure, or benchmark, for each school. By comparing each district's results to the state average, we can rank school performance. Looking at a chart covering the same years of data as the misleading graphic, we can see that CCS never fell below 114.97% of the state average on any single measure. Over that period, CCS performed at 136% of the statewide ELA proficiency rate, 141% of the math rate, and 154% of the combined (ELA and math) rate. A graph of relative performance shows that CCS has not only performed significantly above the average Indiana school system, but that the gap between CCS and the state average is growing. The fact that the gap jumped with each new test suggests that CCS has performed increasingly well compared to other schools as the state has instituted new, more rigorous tests.
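The relative-performance figures above are simply a district's proficiency rate divided by the statewide rate for the same test and year. A minimal sketch of that calculation, using hypothetical proficiency rates rather than the actual IDE values:

```python
def relative_performance(district_rate: float, state_rate: float) -> float:
    """Express a district's proficiency rate as a percentage of the state average."""
    return district_rate / state_rate * 100

# Hypothetical example values (not actual IDE data): a district where 85% of
# students were proficient in a year when the state average was 60% performed
# at roughly 142% of the state average.
print(f"{relative_performance(85.0, 60.0):.1f}% of the state average")
```

Tracking this ratio year over year within a single test is what shows whether the gap between a district and the state average is growing or shrinking.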
Reviewing what we now know about the CCS test results, we can see that, when the scores are properly compared to each test's baseline and used as indicators of relative performance, CCS has performed at a high level compared to the average Indiana school. This explains why CCS is regularly ranked among the top districts in the state. When we discuss CCS results, we should be using a graphic more like this:
But wait, there’s more!
As noted, scores on standardized tests aren’t great indicators of what students are actually learning (as evidenced by their poor correlation with future success in college). For this reason, it is important to consider more than these scores when judging the performance of the schools. Other indicators can help us determine how the school is doing. For example, one argument circulating with the misleading graphic is that a high percentage of CCS students are failing to meet minimum standards in ELA and math. If that were true, CCS would have a low graduation rate. We can check this by looking at the non-waiver graduation rates for CCS, which reflect the percentage of students who graduate while meeting the state's graduation exam requirements. On average, the CCS non-waiver graduation rate runs at roughly 117% of the state average.
Another way to measure school performance is to follow students' statistics as they progress into college. The Indiana Commission for Higher Education collects this information and posts the results for each school corporation. A few of these measures include the percentage of students enrolled in college at the two-year mark, the percentage meeting early success benchmarks (students did not need remediation, completed all coursework attempted, and persisted into their second year), and the percentage of students who did not need remediation. These data show that CCS students enroll in college and meet early success benchmarks at over 140% of the state average rates. Over 94% of CCS students do not need remediation, which is around 107% of the state average.
Finally, we can look at the SAT and ACT scores of CCS students compared to other schools. While these are also standardized tests, they provide another indicator of relative performance, similar to the ISTEP/ILEARN comparison. The average scores on these tests confirm the earlier results, routinely landing around 117-118% of the state average.
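The graduation-rate, college-readiness, and SAT/ACT comparisons above all use that same percent-of-state-average calculation. A sketch with hypothetical placeholder values (the published IDE and Indiana Commission for Higher Education figures would be substituted in):

```python
# Hypothetical placeholder values (district, state) for the indicators above;
# these are illustrative only, not the actual published figures.
indicators = {
    "non-waiver graduation rate (%)": (95.0, 81.0),
    "students needing no remediation (%)": (94.0, 88.0),
    "average SAT total score": (1180, 1010),
}

for name, (district, state) in indicators.items():
    print(f"{name}: {district / state * 100:.0f}% of the state average")
```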
There are many more ways to review school performance; however, every measure reviewed in this analysis contradicts the narrative that accompanies the misleading graphic making its way around social media. Using this data, we can conclude not only that CCS is performing very well academically, but that the gap between CCS and the average Indiana school is actually growing. This indicates that CCS has been successful in its academic planning and performance.
This analysis used the publicly available statistics from the Indiana Department of Education’s Data Center & Reports.