I know we’re all supposed to be on the “local control” bandwagon when it comes to setting school accountability standards, but a recent report made it crystal clear why this is going to be a hot mess.
Achieve, an independent education nonprofit focused on raising academic and graduation standards, set out to measure how well students nationwide are prepared to succeed in college or careers.
Well … so much for that idea. Turns out that it’s pretty much impossible to measure standards when just about every state has a different standard and a different way to measure it.
Some states only report how “all students” are doing, which is pretty much worthless when it comes to meaningful comparisons of progress. Some are willing to break it down into groups based on race, income, disability status, and English proficiency.
Even when states take the same tests as neighboring states, they report data for different grades, or just graduates, or just test completers. Not only is it inconsistent across states, sometimes reporting isn’t even consistent in the same state.
Some states make it very easy for the public to find the data and report it in a clear and timely way. Others bury the information in unwieldy and unreadable websites, or release it irregularly and unpredictably.
As the report points out:
“States cannot make good policy and practice decisions—and ultimately cannot improve student performance—if they do not have basic information about how students are performing along the way… State leaders, partners, advocates, and the public should continue to push for more transparency and better reporting of the information they need.”
The report looks at a range of factors, from course taking and graduation standards in high school, to postsecondary remediation and persistence. And all that seems pretty simple on its face, but some states engage in some magical thinking when it comes to progress.
Consider this measure: the percentage of students who completed a college-and-career course of study. Indiana looks great here, with data suggesting that 87 percent of students meet the bar, but that's because the state counts only high school graduates, not all the students who started in ninth grade. And Indiana's definition of college/career ready is far looser than other states' because it accepts three different types of diplomas, including a technical diploma.
Compare that with New York, where 32 percent completed a "college-and-career-ready" diploma and the bar for success is far higher. New York counts the entire class of ninth graders who started high school, not just the ones who made it to graduation day, and students must earn an "advanced Regents" diploma to count.
College enrollment? Colorado reports on how many high school graduates enroll the fall after graduation. Connecticut gives these graduates a year to enroll in college before counting them. Ohio gives its graduates two years to enroll before they are counted.
These aren’t even apples-to-oranges comparisons. More like bananas to bowling balls.
The report's authors do a fine job analyzing the fine mess they've been handed, but it's clear just how meaningless state-by-state comparisons are going to be in the future, as the Every Student Succeeds Act (ESSA) takes full hold in states and accountability measures are watered down to meet a new bar: political palatability.