It’s become a truism: Nearly every school will likely fail No Child Left Behind’s requirement that 100 percent of students be proficient in reading and math by 2014.
Politicians recite that reflexively. I’ve said it myself.
U.S. Secretary of Education Arne Duncan, in calling for immediate reform of the law, upped the ante in testimony to Congress last month: 82 percent of the nation’s schools will fail to meet their proficiency goals this year – an increase of 45 percentage points over last year, he said. Challenged about the accuracy of the prediction, Duncan stuck to it in comments to education writers last Friday at a conference in New Orleans.
That same day, however, the Public Policy Institute of California released a report that not only contradicts Duncan’s projection; it calls into question the more dire predictions for 2014. According to “Improving School Accountability in California,” close to 50 percent of schools in the state could make their proficiency goals in 2011, and that could actually increase to 56 percent in three years, even though by then the target for the percentage of students testing proficient will have risen from roughly two-thirds of the student body to 100 percent.
It’s not because hordes of California students will suddenly master their times tables and read beautifully. Eric Larsen, a research fellow with PPIC, said that probably only a handful of California’s 10,000 schools will have every student test proficient in math and English in 2014. Those will likely be small elementary schools serving primarily wealthy families.
But an escape hatch – the “Safe Harbor” provision under the federal law – could spare a majority of schools from the law’s penalties and the failure label.
The Safe Harbor rule hasn’t been a secret, but it’s only now beginning to have a large impact. And what’s true in California, with some of the toughest academic standards in the nation, is true elsewhere. The U.S. Department of Education hasn’t released a state-by-state breakdown for Duncan’s projection. In an email, spokeswoman Elaine Quesinberry said that the department based the estimate on the change in the proficiency rate of the top 25 percent of schools in a state and did factor in Safe Harbor. I’m still skeptical of the accuracy.
As the PPIC explains it, Safe Harbor allows schools to improve at slower rates than the state requires under the standard targets for NCLB. Over the next four years, proficiency targets will jump about 11 percentage points annually until reaching 100 percent in 2014. But not under Safe Harbor.
As PPIC explains it, “Safe Harbor says that schools and districts are responsible for reducing their rate of non-proficiency by 10 percent annually. As proficiency rates grow, the percentage point gain in proficiency needed to meet Safe Harbor requirements shrinks.” Take a school where only 40 percent of students test proficient in math – and therefore 60 percent don’t. A tenth of the 60 percent is 6 percent. If the school can raise its score to 46 percent, it complies with the law, even though the standard target for 2011 will be about 67 percent. In 2012, the target for proficiency will come close to 80 percent (it varies a percentage point or two between elementary and high schools). If 60 percent of the students tested proficient the year before, they’d have to raise their scores to 64 percent.
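The Safe Harbor arithmetic described above is simple enough to sketch in a few lines of Python. This is an illustrative helper, not any official formula; the function name and the flat 10 percent reduction rule are taken straight from the report’s description:

```python
def safe_harbor_target(proficient_pct):
    """Return the proficiency rate (as a percentage) a school must reach
    the following year under Safe Harbor: cut the non-proficient share
    of students by 10 percent."""
    non_proficient = 100.0 - proficient_pct
    return proficient_pct + 0.10 * non_proficient

# The two examples worked through in the text:
print(safe_harbor_target(40))  # a school at 40 percent proficient must reach 46
print(safe_harbor_target(60))  # a school at 60 percent proficient must reach 64
```

Both results sit well below the standard NCLB targets for those years (about 67 percent in 2011 and close to 80 percent in 2012), which is why the provision matters so much.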
Safe Harbor applies schoolwide and to subgroups of students. In 2009-10, 53.9 percent of students in California were proficient in English language arts, and 56.3 percent were proficient in math. In math, this ranged from 39.6 percent of African American children and 46.7 percent of Hispanics to 69 percent of whites and 82.5 percent of Asian students.
In 2009-10, only about 40 percent of California schools made all of the targets under NCLB. PPIC based its 56 percent school pass rate projection for 2014 on the assumption that schools would continue to improve as they have in the past. But if the rate of improvement slows, as few as 23 percent of schools will make AYP: 77 percent of schools will fail.
Value-added model to measure schools
The data I cited were a sidebar to the main point of PPIC’s report, which is to urge the federal government to adopt a different measure of holding schools accountable when it revises the Elementary and Secondary Education Act, the formal name for NCLB.
PPIC is recommending a value-added model of measuring a school’s effectiveness. In a case of perfect timing, that’s precisely what Los Angeles Unified did on its own and published Wednesday on its web site.
A value-added model makes it possible to examine a school’s impact on students’ test scores, adjusting for students’ background characteristics and their past test results. Under NCLB, most low-income and minority schools are designated as failures and, if they received Title I dollars, face stiff sanctions, while mediocre middle-class schools skate by. A value-added model has the potential of identifying – and perhaps rewarding – low-income schools that are improving faster than their peers while singling out mediocre schools that should be doing better.
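At its simplest, a value-added estimate is the part of a school’s current scores that a prediction from students’ prior scores cannot explain. The sketch below is a toy version of that idea only – the data are invented, and PPIC’s actual model also adjusts for background characteristics and uses far richer statistical machinery. It fits one regression line across all students, then averages each school’s residuals:

```python
import numpy as np

# Hypothetical data: prior-year and current-year scores for six students,
# three at school "A" and three at school "B".
prior   = np.array([50.0, 60.0, 70.0, 50.0, 60.0, 70.0])
current = np.array([58.0, 68.0, 78.0, 52.0, 62.0, 72.0])
school  = np.array(["A", "A", "A", "B", "B", "B"])

# Fit current = a + b * prior across all students.
X = np.column_stack([np.ones_like(prior), prior])
coef, *_ = np.linalg.lstsq(X, current, rcond=None)
residuals = current - X @ coef

# A school's "value added" is its students' average residual:
# how far above or below the prediction its students land.
effects = {s: residuals[school == s].mean() for s in np.unique(school)}
print(effects)  # school A beats the prediction; school B falls short
```

In this made-up example the two schools enroll students with identical prior scores, yet school A’s students consistently outperform the prediction while school B’s underperform – exactly the distinction the raw proficiency rates used by NCLB cannot see.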
Value-added measures have become controversial when used to measure individual teachers’ performance. The Los Angeles Times did its own value-added model for elementary teachers in Los Angeles Unified and published the results. It will release another batch soon.
The PPIC study makes no judgment on using value-added models to evaluate teachers. But Larsen noted that a schoolwide application of value-added methodology can avoid many of the pitfalls of using it for individual teachers, such as the effects of team teaching and tutoring, non-random assignment of students, and distortions created by small sample sizes.
“By doing a better job of identifying exemplary schools and schools where changes are needed, an accountability system based on a value-added model has the potential to improve student outcomes in California,” the report concludes.
It also discusses potential challenges of a value-added model, such as how to factor in dropouts, and notes that to be effective, the statewide student data system – CALPADS – needs to be operational, in part so that data for students transferring into a school can be incorporated. The report recommends that sanctions against schools be postponed until California can evaluate schools on the basis of gains in individual student achievement.