Why Dropout Data Can Be So Unreliable
Filed by KOSU News in Education.
August 3, 2011
Accurate dropout figures are very hard to find because most states don’t adequately collect or analyze the data.
Part of the problem is that every state has had a different definition of a dropout. In some states, for example, students who leave school aren’t counted as having dropped out if they enroll in adult education classes like night school.
Many schools don’t count kids as dropouts if they enroll in a GED program. The U.S. Department of Education says GED recipients should be counted as dropouts, but that rule isn’t uniformly applied.
And then there are students who did drop out but aren’t counted because they go to prison. Very few school districts count kids who are incarcerated — even in juvenile justice facilities — in dropout statistics. Some schools don’t think they should be held responsible if a kid quits school and gets in trouble with the law.
Responsibility in this case equates to funding. It’s not in the interest of schools to have an honest, accurate account of dropouts — not just because a high dropout rate makes a school look bad, but also because there’s serious money at stake. Most schools get funded based on attendance. If kids don’t show up, schools lose money.
So some schools are allowed to “fudge” the numbers. For example, some schools in Baltimore take attendance at 10 a.m. rather than 8 a.m. because they know that if kids show up at all, they come in late and would otherwise be marked absent. A later roll call allows the schools to report those students as present.
Until recently, the Department of Education could do nothing about this hodgepodge of methods for counting dropouts. It maintained that the collection, reporting and accuracy of dropout data were state responsibilities.
In 2005, the National Governors Association announced that 45 states had agreed to develop voluntary “common measures” for reporting high school graduation rates. That didn’t get at the problem of counting dropouts, however. Experts say it’s wrong to assume that if a school has an 80 percent graduation rate, its dropout rate is 20 percent — precisely because there’s no single definition of what a dropout is. But it was a start.
The three states with the highest dropout rates — Texas, California and Florida — did not sign on to the agreement.
There’s been so little pressure on states to gather and report accurate dropout data that researchers at Northeastern University in Boston have concluded that dropout data in 26 states is unreliable and unusable.
By 2008, dropout data had become so unreliable that the Bush administration told states they would be required to use one federal formula to calculate both graduation and dropout rates.
At the time, it was considered the most far-reaching action the federal government had taken on the issue.
The Department of Education now requires states to count, each year, the number of students who enter ninth grade. Students who don’t graduate within four years are counted as dropouts. But it’s not clear how many states are actually doing this. Many states still don’t have a system capable of tracking a student through four years of high school, so how can the federal government require states to calculate a rate that’s beyond their technical ability?
A few states have created a common identifier — a tracking number — that helps schools confirm whether a student re-enrolled after transferring to another school in the same state. This common identifier is viewed as the most promising approach to gathering accurate dropout data yet. [Copyright 2011 National Public Radio]