California students’ improvement on AP exams deserves more attention

There is some good news in California student achievement trends. The number of high performers, as measured by passage of Advanced Placement exams, is increasing, and California ranks very high in interstate comparisons.

AP courses offer college-level work in high school, and success on AP exams indicates that students attending California’s most selective colleges are better prepared than ever. This positive trend is obscured by national studies, like the National Assessment of Educational Progress (NAEP), that do not focus on the highest achieving students when making interstate and racial/ethnic comparisons. In California, Hispanic growth in both taking and passing AP exams is especially impressive.

According to the Eighth Annual AP Report to the Nation, 23.4 percent of California’s 2011 public school graduates were successful on one or more AP exams – seventh highest in the nation. Overall, 19 states exceeded the national average in the percentage of graduates scoring 3 or higher, out of 5, on one or more exams during their high school careers. Maryland was number one, with 27.9 percent. The U.S. average is 18.1 percent. This high national ranking for California does not receive the public attention that it deserves in a sea of negative reports on state education.

California ranked second to New Mexico on the College Board Hispanic Index for Equity and Excellence on AP. This calculation combines percent of successful AP exam takers in the graduating class with the percent of Hispanics in the graduating class. About 27 percent of California students take the AP Spanish exam, and almost 80 percent of those score 3 or higher. This compares favorably to Texas, where only 17 percent of the students take the AP Spanish exam and 60 percent score 3 or higher.

In California, 136,787 students from the graduating class of 2011 took an AP exam during their high school careers. Of that number, 90,409, or 66 percent, achieved at least one AP exam score of 3 or higher – scores that are predictive of enhanced college success, according to the College Board, the not-for-profit membership organization that administers the AP Program. In 2010, the most recent year for which data are available, California’s 12th grade student population numbered 405,087.

In California, the AP performance gap between Hispanic and African American graduates and their Asian and white peers persists. For example, 61 percent of Hispanic graduates and 39 percent of African American graduates score 3 or higher on AP exams, compared with 71 percent of Asian graduates and 74 percent of white graduates. All of these scores represent an increase in AP performance over the previous five years, but if California is truly going to close the AP equity gap, educators and students alike will need to continue to find ways to increase AP participation and improve performance on these exams.

Source: College Board California State Integrated Summary Report for Public Schools, 2010-11

California’s Hispanic students are the fastest growing population taking AP exams; they are the largest single group in public schools and the second largest when public and private schools are counted together. Their AP participation and performance both show five-year increases: the number of Hispanic students taking AP exams grew from 57,700 (2006-07 school year) to 85,638 (2010-11 school year) – a 47 percent increase – and the number of Hispanic students receiving an AP score of 3 or higher grew from 29,664 to 43,650 over the same period, also a 47 percent increase.

The five-year AP data trends for California’s Hispanic public and private school students show the same pattern: participation increased from 62,135 (2006-07) to 91,452 (2010-11), a 47 percent increase, and the number scoring 3 or higher over the same period increased from 32,720 to 47,515, an increase of 45 percent.

Nationally, students who find success on AP exams lessen their chances of being required to take remedial college courses and increase their chances of graduating from college on time. These remedial courses cost taxpayers an estimated $1 billion each year. Educators in California and throughout our nation must continue to target the divide between high school graduation standards and the skills needed for all students to be successful in college. Finally, we must examine and address equity and access issues that hinder academic excellence for all California students.

Michael Kirst is a Professor Emeritus of Education at Stanford University, where he has been on the faculty since 1969, and president of the California State Board of Education. He thanks Don Mitchell of the College Board and Russ Rumberger of the University of California Office of the President for data help and advice on this article.

****************************************

Sources for this article:

* The 8th Annual AP Report to the Nation California State Supplement, February 8, 2012. The College Board.

* California State Integrated Summary Report, 2010-11. The College Board.

* California Department of Education Statewide Graduation Rates, 2009-2010.

* “Preparing Students for Success in College,” Policy Matters (2005), American Association of State Colleges and Universities.

* Chrys Dougherty, Lynn Mellor, and Shuling Jian, “The Relationship Between Advanced Placement and College Graduation” (2005), National Center for Educational Accountability.

* The College Completion Agenda 2011 Progress Report (New York: The College Board, 2011).

API has served its purpose

A court decision this week involving Los Angeles Unified has raised again the contentious issue of evaluating teachers using standardized test scores. But a recent report for the think tank Education Sector recommends adopting the same method developed by Los Angeles Unified to replace the Academic Performance Index as a statewide way of measuring schools’ progress.

Called Academic Growth over Time, AGT is a value-added model that compares students’ actual performance on state tests to their predicted performance based on demographic characteristics – family income, language, and ethnicity – as well as past test scores. The intent is to distinguish factors of learning that schools can control from those they can’t.
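For readers who want a concrete sense of the mechanics, here is a minimal sketch of a value-added calculation of this general kind. It is not Los Angeles Unified’s actual AGT model; the predictors, data, and schools below are hypothetical, and the real AGT involves additional controls and statistical safeguards.

```python
# Minimal value-added sketch (hypothetical data; NOT the district's AGT model).
# Idea: predict each student's score from prior score and demographics, then
# average the prediction errors (actual minus predicted) for each school.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 1_000
prior_score = rng.normal(350, 50, n)        # last year's scaled test score
low_income = rng.integers(0, 2, n)          # 1 = low-income household (proxy)
school = rng.integers(0, 5, n)              # five hypothetical schools

# Simulated current-year scores, for illustration only
actual = 0.8 * prior_score + 30 - 10 * low_income + rng.normal(0, 25, n)

X = np.column_stack([prior_score, low_income])
predicted = LinearRegression().fit(X, actual).predict(X)

# A school's "growth" is the average gap between actual and predicted scores
for s in range(5):
    in_school = school == s
    gap = np.mean(actual[in_school] - predicted[in_school])
    print(f"school {s}: value-added estimate {gap:+.1f} points")
```

In this toy setup the school-level gaps hover near zero by construction; a real AGT report attaches statistical confidence measures and translates the results into the color bands described in the district scorecards below.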

The use of AGT to evaluate individual teachers has sharply divided teachers in Los Angeles Unified. United Teachers Los Angeles opposes using AGT in any manner, while teachers affiliated with Teach Plus Los Angeles and Students Matter support using it as one of several measures, counting for no more than a third of an evaluation. But less controversial is the district’s use of AGT as a tool to evaluate schools, in part because it involves a larger number of student test scores and doesn’t call for high-stakes decisions affecting individual teachers’ careers. To the contrary, a schoolwide AGT can encourage collaboration and team-teaching.

This is a page from the Academic Growth over Time scorecard for Taft Senior High School in Los Angeles Unified, cited in the Education Sector report for its underwhelming achievement. The score, on a 1 to 5 scale, in green indicates a performance that exceeded the district average for the population of students served (Algebra II over three years); gray is close to the district average (geometry); yellow is below the predicted AGT (English language arts last year); and red (Algebra I last year) is far below the predicted AGT. Taft's overall API was 744 last year; for whites, who comprise 40 percent of the student body, it was the state's target of 800; for Hispanics, it was 695. (Source: Los Angeles Unified)

Last fall, for the first time, Los Angeles Unified released AGT report cards for all schools, breaking down every subject or grade taught on a scale of one to five, with students’ actual scores compared with where they should have been, given student populations, for a one-year and a three-year average. The AGT’s advantage is that it can highlight improvements in high-minority, high-poverty schools that may flunk under the federal and state accountability criteria, while pointing to mediocre performances in high-wealth schools that can glide by the targets of No Child Left Behind and the state’s API.

Here is the AGT report card for Audubon Middle School for 2010-11. All of the subject and grade level scores are in green and blue, indicating progress that exceeded and far exceeded the district averages. Its API score remains relatively low at 733. (Source: Los Angeles Unified.)

The Education Sector report pointed to Audubon Middle School, which, under a new principal and a re-energized staff, posted a 12 percent gain in its API score in one year. But it was still in the bottom 20 percent and failed to meet the proficiency target under NCLB for the 10th straight year.

The state’s three-digit API number, on a scale of 200 to 1,000, is “a crude proxy for student achievement and allowed schools to be ranked,” writes Richard Lee Colvin, former executive director of Education Sector and author of “Measures That Matter: Why California Should Scrap the Academic Performance Index.” “But it was not designed to give educators much help in analyzing school performance, and it told the public more about who attended each school than how well they were being taught.”

The API’s shortcomings have been known for a long time, and Colvin lists them:

  • It’s an indicator of students’ wealth rather than of a school’s educational quality;
  • It places too much emphasis on math and reading scores, so that schools end up giving short shrift to science, social studies, and the arts – subjects that don’t factor much or at all in the API number;
  • More than 40 percent of schools are above the arbitrary target of 800 and so are no longer held accountable for helping students who are struggling academically;
  • It doesn’t track individual students’ academic growth over time; progress is measured by comparing how students in a particular grade or subject do one year with how different students did the previous year.

Narrow measure of school success

The Legislature intended the API to be a broader index when it created it in 1999, but nothing has changed. Now, for the second year, Senate President pro Tem Darrell Steinberg has proposed SB 1458 to broaden the API to include such factors as graduation and dropout rates, college acceptances, and Advanced Placement scores, along with giving science and other subjects more weight. In a nod to Gov. Jerry Brown, who suggested the idea, Steinberg’s bill could include the results of school inspections measuring non-quantifiable but important factors like school climate and parent evaluations.

There’s no reason why a new index that emerges – whatever it’s called – couldn’t also incorporate AGT as a measure of student progress in combination with proficiency rates on state tests. Colvin said that the costs for districts to compute AGT scores for their students need not be significant; Colorado has developed an open-source model that districts or the state could buy for $250,000.

State Board of Education President Michael Kirst said he was open to innovative accountability models, but that now is not the time to switch to a value-added method. The state will begin using Common Core assessments in 2014-15, and at least two or three years of new data would be needed, pushing the adoption of a new system to 2018-19 at the earliest. The State Board will be reviewing the state’s accountability methods over the next year. Colvin called for making a commitment to AGT now and preparing for a transition. The State Board could grant waivers from the use of the API to districts like Los Angeles Unified in the meantime.

But Los Angeles Unified Superintendent John Deasy told me the district was interested in a federal waiver from No Child Left Behind, not a state waiver, so that it could get out from under federal sanctions for school failures, as the federal government defines them, and also gain more control over federal Title I money. After months of delay, the state has requested an NCLB waiver, but not on the terms requested by the Department of Education; getting the waiver would appear problematic.

Mixed results using iPads

A study conducted in four California school districts found that students studying Algebra I on an iPad did no better overall than students equipped with a traditional textbook.

The results of the 2010-11 study – the largest to date – disappointed Houghton Mifflin Harcourt, the publisher that commissioned the research and had expected better outcomes from the new technology. But at the same time, a company executive said the firm remains undeterred in developing its digital textbooks and was heartened by scores in Riverside Unified, the one district in the study where students using iPads markedly outperformed their peers. In a white paper that the company published, putting a positive spin on the research, the Riverside teachers in the study extolled the software, which they said motivated students to take charge of their learning.

HMH Fuse, the software that Houghton Mifflin Harcourt developed exclusively for the iPad, comes equipped with many nifty features: more than 300 video tutorials by the textbook’s author, a scratch pad for note-taking, icons that provide links for support, an ability to record notes by voice, and animated views that walk students through sample problems. Students I spoke with last year at a San Francisco middle school that was part of the study said they liked the features and found them helpful. (HMH has since improved the note-taking capability, after students complained about its limitations.)

The study found that no particular iPad feature directly contributed to math improvement, but collectively they kept students more engaged; there were also indications – though no hard numbers – that students with iPads did more math at home and after class. The students using the software who did outscore other students tended to have better attitudes, said Denis Newman, president of Empirical Education, the Palo Alto firm that did the research.

Riverside Unified students using Houghton Mifflin Harcourt's Algebra program on an iPad scored 9 percentile points higher than students using the company's Algebra textbooks, an impressive difference. Source: research by Empirical Education, Inc.

But overall scores on the California standardized tests and the publisher’s year-end course exam averaged nearly the same for students using iPads and textbooks, after controlling for pretest and demographic differences – except for Riverside, where there was a 9-point increase in the percentile ranking, a significant amount, for those using an iPad. Put another way, by the district’s analysis, 78 percent of students using iPads scored proficient or above on the Algebra CST; 59 percent scored proficient using textbooks.

78 percent of Riverside students at the Amelia Earhart Middle School using an iPad to learn Algebra I scored proficient or advanced on the state Algebra CST, compared with 59 percent using a standard textbook by the same publisher, Houghton Mifflin Harcourt. Source: Houghton Mifflin.

Empirical Education did a rigorous analysis, using 11 teachers in six schools across four districts: San Francisco Unified, Fresno Unified, Long Beach Unified, and Riverside. They taught a combined 34 classes – 23 sections with textbooks and 11 sections with iPads, chosen at random. Most of the teachers volunteered to be part of the iPad study, but at least one teacher who proved to be a Luddite was assigned to the study. By the end of the year, nine of the 11 teachers said they would continue to use the iPad if given the choice (sorry, but HMH took them back).

Why the big difference with Riverside? Empirical Education found that the two teachers in Riverside used the iPads as a teaching tool much more extensively than the other nine teachers in the study, and the students used the iPads many more minutes per week in class than all but one of the others – a possible connection.

‘Personalized learning devices’

But Riverside Superintendent Rick Miller has another explanation. The district has been proactive in deploying mobile technologies among its 42,000 students; it has learned that the best strategy is to encourage students to make iPads and tablets their personalized learning devices, as indispensable as cell phones. Allow students to download their own applications, including music, and they’ll be more prone to access math videos and use the technology for learning. Other districts adopted more restrictive policies, at least at first: They reportedly locked up iPads when they weren’t in use and didn’t allow students to take them home.

The other difference, Miller said, is how the teachers used the iPads. Riverside’s two teachers who volunteered for the study weren’t chosen because they were tech-savvy; they hadn’t owned Apple products. But, Miller said, they were good math teachers who came to recognize opportunities for differentiating instruction and for assigning videos at home to introduce concepts.

John Sipe, HMH senior vice president and national sales manager, concurred. “In Riverside, teachers were more comfortable from the beginning. They let the technology organically change the classroom structure and front-load instruction.”

In retrospect, Sipe said, the company should have done a two-year study, because there was a learning curve with the new technology. And it would spend less time at the start of the year training teachers to use the device and more time on classroom strategies with the software.

This year, Riverside has expanded its use of HMH’s iPad Algebra program to four classes in the middle school that participated in the study. However, without the free iPads, students have to supply their own, which has led fewer low-income students to participate. (Parents can buy inexpensive insurance to cover theft or damage.) Meanwhile, Riverside is charging ahead with going digital, experimenting with multiple devices and free and proprietary software in multiple subjects.

HMH Fuse sells for $59 retail; districts can buy it for $49 for six years of use. It’s also now available for $19 for one year of use. Miller said that if Houghton Mifflin Harcourt releases an app that can run on cheaper Android-based tablets, which will soon sell for under $200, then Riverside would deploy the HMH Fuse program much more extensively. Sipe said an Android app might be available this fall.

Dilemma over English learners

English language learners currently generate about 8 percent in extra per-student funding, says the Legislative Analyst’s Office. That amount would more than quadruple in six years, to 37 percent, if the Legislature adopts Gov. Brown’s weighted student formula, which phases in substantially more money for every poor student and each English learner.

That potential bonanza has advocates for English learners ecstatic and anxious. They see it as a long-overdue opportunity to help struggling students. But many also agree that without better ways to measure English learners’ progress and to make sure districts actually spend the money on English learners, there will be incentives to label too many children as English learners and then to keep them in that status for too long – doing more harm than good.

“Just to give money based on characteristics of one type or to an ambiguous program with little or no program goals is to court disaster because (the money) will just go on and on and on forever,” Sen. Jean Fuller, a Republican who was a superintendent of the Bakersfield City School District, testified at a hearing on the potential impact on English learners of a weighted student formula. “And the kids lose. They don’t get the proficiency they need fast enough.”

There appears to be a consensus that some English learners are already stuck in the system, although there’s disagreement as to why. Nearly one in four students in the state are English learners, with 70 percent in elementary school. In some districts, like Santa Ana, more than 80 percent of English learners were born in the United States. If parents, filling out a four-question survey, respond that they most often speak to their children in a language other than English, their children are required to take the California English Language Development Test, or CELDT, often as kindergartners.

In a study issued last year, the Center for Latino Policy Research at UC Berkeley concluded that CELDT is prone to misidentification, with only 6 percent of kindergarten students taking it in 2009-10 being classified as English language proficient. Since CELDT has never been administered to English-only kindergartners for comparison, it’s hard to know if the test is an accurate language measure or is a proxy for poverty and other deficits at home. The language survey is not able to distinguish students who are truly bilingual, says Associate Professor Lisa Garcia Bedolla, co-chairwoman of the Center for Latino Policy Research.

Inconsistent approaches to redesignation

But once identified as an English learner, a student is likely to stay an EL for years; an average of only 11 percent are redesignated annually as fluent in English and no longer needing extra help, which can include pulling a child from regular class for extra English support.

Robert Linquanti, project director for English Learner Evaluation and Accountability Support and senior researcher for the California Comprehensive Center at WestEd, said that a reasonable target for reclassification should be about 20 percent each year, or one in five.

The state auditor a few years back concluded that there were financial incentives for retaining students as English learners, but Linquanti said that districts with low reclassification rates tend to have poor procedures for monitoring, and administrators don’t make reclassification a priority.

State Sen. Alex Padilla agrees. “There’s no smoking gun memorandum” that he’s seen discouraging teachers from reclassifying English learners as proficient. The problem, he says, is that every district has its own redesignation criteria; they’re all over the map. That’s why Padilla, a Democrat from Los Angeles, is sponsoring two bills dealing with English learners, both of which passed the Senate Education Committee last week.

SB 1109 is a broad bill; it would establish a master plan for English learners, looking at issues of initial classification, best practices and techniques for instruction, parent involvement, and the long-term learning needs of English learners.

SB 1108 deals with redesignation specifically. It would require districts and county offices to report the redesignation criteria they use to the state Department of Education, which would analyze them and report back to the Legislature which methods are the most effective for determining fluency in English.

“There needs to be uniformity and consistency,” Padilla said in an interview. The Legislature would determine whether the criteria would be mandatory for local districts after it sees the results, he said.

The State Board of Education guidelines recommend using a combination of factors for determining redesignation: English proficiency scores on CELDT; scores on the California Standardized Tests, which students take in English; grades; teacher evaluations; and parental views. Students are considered proficient in English if they score early advanced or advanced on CELDT, but only 38 percent of students who test that high are actually redesignated. Districts come up with their own weights and throw in grades in science and math and district benchmark tests. Some teachers, it’s been said, consider absenteeism and other subjective factors.

Financial incentives can cut both ways, said Chuck Weis, Santa Clara County superintendent, testifying at the hearing on a weighted student formula and English learners that Padilla organized. Middle schools and high schools tend to push English learners through the pipeline too soon, he said, because they don’t have enough bilingual teachers, and the amount of state aid isn’t enough to cover the needs of English learners in the upper grades.

Funding beyond redesignation

Assemblymember Chris Norby, a Republican from northern Orange County, sides with those who believe too many children are classified as English learners (he cites his own daughter as one) and then become “stuck in English language learning and cannot get out.” His bill, AB 1767, would treat scoring proficient or advanced on California Standardized Tests and a B average as “prima facie evidence” for redesignation. Those students – only a few thousand per year by one estimate – clearly have the academic fluency to succeed in regular English classes, he said.

Norby’s bill so far has failed to get out of the Assembly Education Committee, but it includes an idea that Linquanti and other experts on English learners like: It would extend funding for English learners two years after they were reclassified as fully English proficient. This is critical, advocates say, because redesignation is a minimal standard; it doesn’t prepare students for the rigorous academic English found in middle and high school courses. The money could also be used for monitoring students’ progress after redesignation – something that’s not done now.

Gov. Schwarzenegger’s Committee on Education Excellence anticipated possible perverse incentives with extra funding for English learners. The committee suggested gradually cutting back funding over time, and then eliminating it for an English learner after seven or so years.

Some of the concern about the disincentive to redesignate may be overstated, because the Department of Finance estimates that 74 percent of English learners also come from low-income households. Even if these students are redesignated as fully English proficient and lose the English learner supplement, they would receive an extra 37 percent for poverty – same dollars, different label.
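As a rough illustration of that arithmetic, the sketch below applies a single 37 percent supplement to a hypothetical base grant. The base amount is invented, and treating a student who is both low-income and an English learner as generating one supplement is an inference from the “same dollars, different label” point above, not a confirmed detail of the governor’s proposal.

```python
# Illustrative only: hypothetical base grant; the 37 percent supplement is the
# fully phased-in weight cited above. A student who is both an English learner
# and low-income is counted once ("same dollars, different label" -- assumed).
BASE_GRANT = 6_000   # dollars per pupil (hypothetical)
SUPPLEMENT = 0.37    # supplemental weight cited in the article

def per_pupil_funding(is_english_learner: bool, is_low_income: bool) -> float:
    qualifies = is_english_learner or is_low_income
    return BASE_GRANT * (1 + SUPPLEMENT) if qualifies else float(BASE_GRANT)

print(per_pupil_funding(True, True))    # 8220.0 -> EL and poor: one supplement
print(per_pupil_funding(False, True))   # 8220.0 -> redesignated but still poor
print(per_pupil_funding(False, False))  # 6000.0 -> neither
```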

Flexibility without accountability?

Brown is proposing to fund the weighted student formula by ending nearly all categorical programs, the specially protected programs for restricted purposes. Districts would have near total flexibility to spend money as they see fit. The governor is calling for full flexibility immediately, with weighted student funding phased in over six years. That worries advocates for English learners, because Economic Impact Aid, the biggest source of state aid for English learners, would be one of the eliminated categoricals. Thus, there’d be no guarantee that districts would continue to spend the extra money on English learners – now or when the weighted formula is ratcheted up.

Padilla agrees that is a problem and says he will present accountability requirements if Brown doesn’t include them in the budget revision next month. “I want to ensure that gaining flexibility in how to use funds does not relieve districts of the obligation to improve on English language instruction and to demonstrate progress in moving students toward proficiency.”

The debate over funding and classification coincides with other major changes. By summer’s end, the State Department of Education will finish recommendations for a new set of standards for English learners that aligns with the Common Core standards. New standards, in turn, will require a new assessment to replace CELDT. California must decide whether to join with other states in creating the test or go it alone.

There are a lot of moving parts here, not all of them in sync. All the more important, Padilla argues, to create a master plan for dealing with the changes and to include uniform standards for tracking English learners’ progress.

Placement exams are ineffective

Tens of thousands of California community college students may be wrongly assigned to remedial English and math courses based on placement exams that are flawed. At a time of increasing state and national scrutiny on completion rates, two national studies from the Community College Research Center (CCRC) at Columbia University’s Teachers College found that the most common placement exams are poor predictors of college success.

In Predicting Success in College: The Importance of Placement Tests and High School Transcripts, researchers found “high ‘severe’ error rates using the placement test cutoffs.” How bad? Three out of ten students were wrongly assigned in English. The numbers were lower in math, “but still not trivial,” wrote the authors.

A far better indicator of how students will do in college is how they performed in high school. When researchers compared success rates using only high school GPA or only placement exams, they found the “severe error rate” was cut in half for the GPA group.
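To make the “severe error rate” idea concrete, here is a minimal sketch with invented records; the CCRC studies define and estimate these errors from real transcript data with far more careful statistics, so treat this only as an illustration of the bookkeeping.

```python
# Hypothetical records: (placed directly into the college-level course?,
# passed / would have passed the college-level course?). A "severe" placement
# error is a student held in remediation who would have passed the
# college-level course, or a student placed up who then failed it.
records = [
    (True,  True),
    (False, True),    # severe error: kept in remediation despite likely success
    (True,  False),   # severe error: placed into college-level and failed
    (False, False),
    (True,  True),
    (False, True),    # severe error
]

severe = sum(
    1 for placed_up, would_pass in records
    if placed_up != would_pass   # the two mismatch cases are the severe errors
)
print(f"severe error rate: {severe / len(records):.0%}")   # 50% in this toy set
```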

Comparing success rates by high school grades and scores on the two main placement exams. (Source: CCRC 2012)

The results were slightly less conclusive in math in the second study, titled Do High-Stakes Placement Exams Predict College Success? Researcher Judith Scott-Clayton found that math placement exams were more accurate than English placement tests. But Scott-Clayton had this caveat: “Placement test scores are better at predicting who is likely to do well in the college-level course than predicting who is likely to fail.”

None of this surprises Robert Gabriner, director emeritus of the Center for Student Success at the Research & Planning Group for California Community Colleges. He said researchers have been skeptical about the validity of placement tests for years.

“Some of my colleagues said you could put a chart on the wall and have monkeys throw darts at it and it would be just as reliable,” Gabriner recalled.

“This is a real positive step,” he said. “Where this leads is that overnight we have increased the number of students who do not go into basic skills in English and math because they shouldn’t have been there in the first place.”

That could mean the difference between earning a degree or credential and giving up. Numerous studies have shown that students stuck for several semesters in remedial – also known as basic skills – courses are significantly more likely to drop out of community college. A student required to take three semesters of basic skills math has just a 16 percent chance of finishing those courses, according to another CCRC study; for English it’s 22 percent.

In California, between 70 and 90 percent of first-time community college students place into remedial math, English, or both. They have to pay for the courses, but the credits earned don’t count toward an associate degree or the units needed to transfer to a four-year college.

“Our average, and we’re not unusual, there are students who typically take seven to eight semesters at City College before they’re able to get out; that’s four years,” explained San Francisco City College chancellor Don Griffin. “The question really is, if you were allowed to go to a college-level class based on high school performance would you do as well as students who had been placed there?”

Chancellor Griffin believes those students would do as well, and the studies back him up. CCRC director Thomas Bailey found that students who ignored the results of the placement exams and went directly into college-level classes had lower success rates than the students who placed into those courses; but, weighed against students who accepted the test results and enrolled in basic skills, they did significantly better in college-level courses, passing at a rate of 72 percent compared to 27 percent.

City College ran its own experiment on placement exams and the results reinforced Griffin’s lack of confidence in the tests. They retested students two weeks after they took the placement exam, and 40 percent of them scored well enough to move into a higher-level course.

Next up, he’s asked the academic senate to approve a pilot program that would take 500 new students and, instead of testing them, place them based on their high school grades. Griffin hopes to start the program next fall, and he predicts that the results will be good.

“If you want to look at how competent a person is, you place them in the best class, give them competent instruction, and see how well they perform,” said Griffin. “And I think the expectation is that they will perform well.”

Outside evaluator must tell us how well prepared we are for Common Core

California, like many states, is embarking on an ambitious rewiring of its public school system. By the 2014-15 school year, it plans to implement the new Common Core academic standards in English and math for all grades. The new standards were adopted by the State Board of Education only 18 months ago. Having all of our state’s teachers and schools on board with this shift in core content in just another two-and-a-half years would be an impressive feat of bureaucratic derring-do. The last time the state undertook a similar effort, with the current academic content standards – under an even longer time frame and in better fiscal conditions – we fell well short of our lofty goals.

Because this undertaking is too important to implement poorly or unevenly, Public Advocates is calling for an independent study of how well California’s Common Core implementation is proceeding. Adopting the standards was the easy part. Now the state must simultaneously modify its teacher education programs so that all new teachers are prepared to teach the revised standards, adopt and disseminate curriculum materials integrating the new standards, ensure current teachers receive appropriate professional development in how to adapt their curriculum, and build new assessments to measure student progress in learning the new standards.

The Department of Education and the State Board are doing what they should be doing in developing an implementation plan. Still, only by examining how well we’re building this plane before take-off can the powers that be identify deficiencies and engage in the timely re-engineering needed to ensure no child’s exposure to the new Common Core standards substantially lags behind others’.

Recent history reminds us that having an ambitious plan alone isn’t enough. When the state imposed a new high school exit exam in 1999 based on California’s then relatively new English Language Arts and Mathematics academic content standards, it wisely required that the implementation of the exam and students’ exposure to the ELA and math content be studied by an independent contractor. That contractor, known as HumRRO, has published a series of biennial reports, the most revealing of which occurred in the years leading up to the implementation of the exit exam’s diploma penalty. The State Board of Education delayed the diploma penalty for two years based on HumRRO’s reports of widespread unequal and insufficient access to math and ELA standards-based content prior to the initial exam implementation date of June 2004.

HumRRO’s later evidence showed that, even during the 2005-06 academic year when the diploma penalty took effect, many students still were not being exposed to the English and math standards covered by the test. Mind you, at that point, those content standards had been adopted fully eight years earlier and the standards-aligned exit exam requirement had been imposed six years prior. Nonetheless, at the start of the 2005-06 year, HumRRO found that fewer than half of high schools had fully aligned their curriculum to the material tested on the exit exam. One in seven at-risk students in the class of 2006 reported that they had not been taught most of the English topics tested; one in six such students made the same report for math. Of the schools that responded to HumRRO’s survey, 12 percent of English departments and 8 percent of math departments reported that they were operating with more than 25 percent of their teachers lacking appropriate credentials, and less than a third of high school principals reported that nearly all of their teachers had received professional development on how to teach the content standards tested on the exam.

Whether through an NCLB waiver, under the impending reauthorization of the Elementary and Secondary Education Act, or under the inevitable revisions to the state’s own Academic Performance Index, California in the not-too-distant future will be operating under an accountability system significantly based on Common Core assessment performance. Waiting until after we’ve imposed the new standards on all students and schools to see if we have effectively implemented them makes no sense. It is imperative that we examine now whether implementation is proceeding unevenly and if so, whether we are systematically underserving certain sub-populations like English learners or students with disabilities or certain sectors like low-income districts or low-performing schools. It is better to know where the weaknesses in Common Core implementation lie so that they can be addressed by policy makers that much sooner.

Some may feel that we simply cannot afford any additional expense in these tight fiscal times. Yet, even before exploring the possible ways in which federal or private foundation dollars might help support such a study, the state itself should acknowledge the value of the minor investment here. For a few hundred thousand dollars, the state will know if its $50 billion educational enterprise is on the right track or not. Seems like a no-brainer.

Fortunately, Assemblymember Ricardo Lara (D-Bell Gardens), who has worked tirelessly to ensure that every student has an equal opportunity to learn and be educated, will partner with Public Advocates and carry a bill to institute a Common Core implementation study. We hope the entire education community will get behind this important measure, AB 2116. Hopefully, this time we will make sure our new standards are implemented fully and fairly.

John Affeldt is Managing Attorney at Public Advocates Inc., a nonprofit law firm and advocacy organization, and a leading voice on educational equity issues. He has been recognized as a California Attorney of the Year by California Lawyer magazine, as an Attorney of the Year by The Recorder, and as a Leading Plaintiff Lawyer in America by Lawdragon Magazine.

Common Core, ‘dubious causality’

The horse race of international rankings in education is based on misconceptions that can lead countries such as the United States to consider sweeping reforms that probably won’t improve academic achievement, according to a new report. The 2012 Brown Center Report on American Education released yesterday by the Brookings Institution makes a case against Common Core standards – arguing that California’s current standards are superior – and cautions against placing too much weight on the National Assessment of Educational Progress (NAEP) and international comparisons.

“We have to be careful when looking at test score data; it’s not the same thing as how many points did the New York Giants score versus the New England Patriots in the Super Bowl. These tests have to be interpreted very carefully,” said author Tom Loveless in a video accompanying the study.

Loveless is especially critical of using international exams, such as PISA, to rank countries’ educational systems. The United States tends to score in the average range on the test, behind top performers including Shanghai, Finland, Singapore, and Canada.

In what Loveless calls a flaw of “dubious causality,” lower performing countries mistakenly look for a single policy to explain the success of top performers. One of those dubious connections he’s referring to is Common Core standards. He said advocates of Common Core often point out that the top ten countries all have national standards. But, said Loveless, “if you look at the bottom ten nations in the world, they all have national standards too.”

The report cites arguments by two outspoken critics of Common Core in California, Ze’ev Wurman and Bill Evers, who “conclude that the math standards, in particular, are inferior to existing standards in Massachusetts and California.”

Wurman was a member of the Mathematics Curriculum Framework and Criteria Committee that developed California’s 1997 mathematics framework, and Evers served on the 1996 California State Commission for the Establishment of Academic Content and Performance Standards. Both were members of the California State Academic Content Standards Commission and, as TOP-Ed reported here, their fellow commission members overwhelmingly rejected their efforts to rewrite the Common Core standards to look like California’s earlier math standards.

One of those other commissioners is Scott Farrand, a math professor at Sacramento State University. He questioned how California’s standards can be considered the highest in the country when two-thirds of elementary students score advanced or proficient on the California Standards Test, but that falls to less than a quarter by eleventh grade. What that says to Farrand is that merely setting a high bar doesn’t improve achievement.

“What sets the Common Core State Standards apart is not the level of the standards, however one might measure that. It is their focus and coherence, and their insistence on student understanding,” said Farrand. He’d like to see the “my standards are higher than yours” posturing end so the people responsible for implementing Common Core standards in California can spend their time understanding “what standards can and should do,” rather than engaging in “silly bickering” that detracts from that progress.

Final exams

It’s not just any Friday; it’s the Friday before the biggest holiday week of the year. So if you’re reading this, you’re probably trying to escape from a) out-of-town guests, b) another tin of homemade cookies that reminds you about that resolution to join a gym, c) the annual office party that’s been downgraded by the economy from a formal soiree with a band and open bar to cheese Danish and coffee, or d) all of the above.

Hmmm, multiple choice. That gives us an idea for something that will keep TOPed in your thoughts next week while we take a break. We’ve compiled an array of questions, released by the publishers, from a variety of tests. Pour yourself a glass of eggnog – no scrimping on the nutmeg – power up your graphing calculator, and open your booklet.
[Click here for answers]

California High School Exit Exam

Mathematical Reasoning

1.    The table below shows values for x and corresponding values for y.

[table]

Which of the following represents the relationship between x and y?

A.        y = 1/7x

B.        y = 7x

C.        y = x-6

D.        y = x-18

Algebra I

2.    Which of the following is equivalent to 1-2x>3(x-2)?

A.        1-2x>3x-2

B.        1-2x>3x-5

C.        1-2x>3x-6

D.        1-2x>3x-7

Statistics, Data Analysis, and Probability

3.    The Smithburg town library wanted to see what types of books were borrowed most often.

[circle graph]

According to the circle graph shown above –

A.  More Children’s books were borrowed than Romance and Science Fiction combined.

B.  More than half of the books borrowed were Children’s, Mysteries, and Art combined.

C.  More Mysteries were borrowed than Art and Science Fiction combined.

D.  More than half of the books borrowed were Romance, Mysteries, and Science Fiction combined.

English-Language Arts

[Reading passage “Maya” and accompanying questions]

California Standards Test

Grade 8/History-Social Science
[CST Grade 8 History-Social Science question]

California Standards Test

Grade 5/Science

[CST Grade 5 Science question]

National Assessment of Educational Progress (a.k.a., the Nation’s Report Card)

Grade 12/Economics

[NAEP Grade 12 Economics question]

Programme for International Student Assessment (PISA)
Administered every three years to 15-year-olds in more than 70 countries, including the United States

Math

The approximate distance to stop a moving vehicle is the sum of:

– the distance covered during the time the driver takes to begin to apply the brakes
(reaction-time distance)

– the distance travelled while the brakes are applied (braking distance).

The ‘snail’ diagram below gives the theoretical stopping distance for a vehicle in good braking condition (a particularly alert driver, brakes and tyres [tires] in perfect condition, a dry road with a good surface) and how much the stopping distance depends on speed.

[snail diagram]

QUESTION 1: BRAKING

If a vehicle is travelling at 110 kph, what distance does the vehicle travel during the driver’s reaction time?

QUESTION 2: BRAKING

If a vehicle is travelling at 110 kph, what is the total distance travelled before the vehicle stops?

QUESTION 3: BRAKING

If a vehicle is travelling at 110 kph, how long does it take to stop the vehicle completely?
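For readers working the braking questions without the snail diagram, the sketch below shows the arithmetic the item is testing. The reaction time and deceleration are assumed values, not the figures from the PISA diagram, so the printed numbers are illustrative rather than the official answers.

```python
# Stopping distance = reaction-time distance + braking distance.
# The reaction time and deceleration below are ASSUMED values; the actual PISA
# item reads these quantities off the "snail" diagram, not reproduced here.
speed_kph = 110
speed_mps = speed_kph * 1000 / 3600            # about 30.6 metres per second

REACTION_TIME_S = 1.0                          # assumed driver reaction time
DECELERATION_MPS2 = 6.0                        # assumed braking deceleration

reaction_distance = speed_mps * REACTION_TIME_S
braking_distance = speed_mps ** 2 / (2 * DECELERATION_MPS2)   # v^2 / (2a)
total_distance = reaction_distance + braking_distance
total_time = REACTION_TIME_S + speed_mps / DECELERATION_MPS2

print(f"reaction distance  ~ {reaction_distance:.1f} m")
print(f"braking distance   ~ {braking_distance:.1f} m")
print(f"stopping distance  ~ {total_distance:.1f} m")
print(f"time to stop       ~ {total_time:.1f} s")
```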

California Basic Educational Skills Test (CBEST)
Must be passed by all California teachers before starting their certification program.

Reading

[CBEST reading question]

Math

[CBEST math question]

Writing

[CBEST writing question]

How should we measure our schools if not by API?

Ever since California and the federal government placed the weight of a school’s success on standardized test scores with the Public Schools Accountability Act and No Child Left Behind, there’s been a backlash against overreliance on high-stakes testing.

The question of what else should be considered in rating schools is the topic of this week’s forum, “Yes, but….”

Our opinion and policy makers are Darrell Steinberg, President pro Tempore of the State Senate; David B. Cohen, a National Board-certified high school English teacher; education lobbyist and legal counsel Fred Jones; former California Superintendent of Public Instruction Bill Honig; and Jeff Camp, chair of the Education Circle of the Full Circle Fund philanthropy organization.

We hope you’ll keep the conversation going with other readers, and use the comment section to ask questions of this week’s contributors.

Darrell Steinberg: Reflect a well-rounded education

The Academic Performance Index has served a worthy purpose over the past 11 years, but let’s face it: It is, at best, an incomplete indicator of student achievement and school performance.

Gov. Brown’s veto of Senate Bill 547 left in place a measurement tool that sends one signal, and one signal only, to our schools: Get your standardized test scores up. At the elementary level, the API is almost exclusively focused on scores in just two subjects, English language arts and mathematics. At the middle and high school levels, no credit is given for keeping students on track to graduation.

Striving for the perception of steady improvement under this narrow accountability regime, many of our schools have responded with a laser focus on bubble tests. Such focus comes at the expense of a whole range of offerings that parents, the business community, and students themselves value: college and career preparation at the high school level; science, history, arts, and music across the grades; physical education; and opportunities for leadership and community engagement.

We need an accountability system that reflects the elements of a well-rounded education, and that connects public education to the needs of the 21st century economy. I sought to begin that work by replacing the API with a new Education Quality Index, balancing test results with other important measures of school success. I have invited the Governor to join me in crafting a new approach for next year. At minimum, it should contain the following elements:

  • Rapid implementation of existing law, which already requires that the API include graduation rates. Their inclusion is critical to underscoring the importance of student engagement and support in both middle and high school;
  • Greater emphasis on student achievement in science and history, to temper the overemphasis on English language arts and math;
  • A shift away from the existing API decile system (ranking schools relative to one another from 1 to 10) in favor of a scoring system pegged to an absolute standard, which creates a more accurate representation of performance.

I have worked on few issues in my legislative career that garnered more support than this attempt to ensure the state sends more appropriate signals about what it wants schools to accomplish. Republicans and Democrats, business and labor, educators and parents, law enforcement and civil rights organizations have coalesced around the need for change. We need the Governor to work with us to connect our schools to the needs of the economy we hope to rebuild in California.

Darrell Steinberg has been President pro Tempore of the California State Senate since 2008, chosen by his colleagues to that leadership post two years after he was first elected as Senator for the Sixth District representing the Sacramento area. He earlier served three terms in the State Assembly. He’s a strong advocate for education reform, children and mental health issues, and received the “John F. Kennedy Profile in Courage” national award in 2010 for his leadership in resolving the state’s 2009 budget crisis.

David B. Cohen: Why rank schools?

Imagine for a moment that California used letter grades rather than the Academic Performance Index to rate schools. If I were a parent whose child attended a high school with a “D” on its state report card, I would be gravely concerned that this school would fail to provide my student with the skills to succeed in college, and a college education is vital to my child’s future. If I had a choice, I would certainly want to move my child to an “A” school. I know these report cards aren’t perfect, but there must be a world of difference between the “D” and the “A” rankings, right? And if the “A” school was also listed among Newsweek’s Best High Schools, so much the better, I’m sure.

Wrong. The “D” school is better.

Or to be more precise, the “D” school is better if the measure of quality is college preparation. Don’t believe me? Take a look at this study – “College- and Career-Ready: Using Outcomes Data to Hold High Schools Accountable for Student Success” – from Florida. Writer Chad Aldeman sums it up this way: “While [the “D” school] got dismal marks from state and federal accountability schemes, it was actually quite successful in a number of important ways. It graduated a higher percentage of its students than [the “A” school] and sent almost the same percentage of its graduates off to college. Once they arrived on college campuses, [the “D” school] graduates earned higher grades and fewer of them failed remedial, not-for-credit math and English courses than their [“A” school] peers.

“In other words, D-rated [High School] was arguably doing a better job at achieving the ultimate goal of high school: preparing students to succeed in college and careers. But because Florida’s accountability systems didn’t measure college and career success in 2006, nobody knew.”

The study concludes, as you might anticipate, with a call for more data going into accountability systems, and it’s hard to argue with that. But the catch is that any rating or ranking is going to miss something, and is going to create simplistic lists of winners and losers out of what should be a more complex view of school quality.

It is time to distinguish between having data and claiming to know what it means. If we were conducting chemical experiments, it might be different. With schools, we are “measuring” extended periods of highly complex interactions among hundreds or thousands of people (different combinations of people every year), operating under different combinations of influences, and we have yet to agree as a state or society about the outcomes that matter most in that complex setting.

Ultimately, I would argue that the state should be in the business of providing resources and guidelines, and leaving the final assessments of quality and success to professional and local agencies. These agencies must ensure transparency and protect the interests of all stakeholders. They should be comfortable examining widely varying types of data and appreciating the value of each. Their judgments and conclusions would be informed by data and observations, but expressed in words – reports that don’t hide behind the false certainty or pseudo-objectivity of final scores, points, grades, or gold stars.

California high schools already engage in an accreditation process similar to that description, carried out by the Western Association of Schools and Colleges. Why not make it more meaningful, but less intensive, and expand the approach to other levels?

If our citizenry can’t handle that shift, then we have a goal for our educational system: to produce citizens, media, and political leaders who would rather understand and express the complex realities of school quality than have a meaningless “A” or “D” slapped on a school.

David B. Cohen is a National Board-certified teacher in Palo Alto, where he teaches high school English. He helps to direct Accomplished California Teachers and writes for the group’s blog, InterACT.

Fred Jones: Hold schools accountable for reality

For good or ill, California’s K-12 public education system is driven by what Sacramento – and to a growing extent D.C. – requires, funds, and measures. The “measure” driver has led to the axiom: If it isn’t tested, it isn’t taught.

But the current fixation on a narrow bandwidth of ELA and Math via fill-in-the-bubble standardized tests has not proven to be a meaningful gauge of a school’s overall performance. Moreover, it has led to the narrowing of curriculum that so many have railed against.

We should be expecting much more from schools as they strive to prepare their students for successful lives.

Many have chosen to jump on the “college for all” bandwagon, feeling this is a higher means of holding schools accountable. We have seen many districts require the UC’s A-G coursework of all of their secondary students.

But college should not be considered an end unto itself. In this era of dwindling public resources and exploding student debt, college should more appropriately be considered merely a means to an end: one that provides students – and the taxpayers who subsidize them – the disposition, skills, and knowledge to provide a return on the private and public investment.

There is a growing chorus of intellectuals, industry leaders, and loan-conscious parents who have begun to question the financial returns of college. Regardless of the merits of those arguments, the economy clearly does not demand that all workers have 4-year degrees.

So what shall we hold schools accountable for delivering to every K-12 student? And how do we measure that?

In his veto of SB 547, Gov. Brown acknowledged the difference between quantitative data streams and qualitative considerations, and the difficulties in measuring the latter, often more meaningful, outcomes. Paradoxically, his veto actually undermined the effort to get a more relevant accountability system.

SB 547 was a good-faith effort to broaden the accountability matrix. It sought to include more than just standardized test scores, while attempting to keep the additional criteria objectively quantifiable.

Such additional criteria would have included a school’s performance in adequately preparing students for postsecondary education opportunities, access to career planning and training coursework, dropout rates, and other substantive and serious considerations. Einstein’s quip that “not everything that counts can be counted, and not everything that can be counted counts” certainly applies. Schools must begin to report what needs to be counted to adequately measure true success.

Fred Jones has nearly 20 years of policy experience in the State Capitol as both a legislative staffer and, since 2000, as a registered lobbyist and legal counsel to several education-related clients. His primary CTE-related client is the California Business Education Association, which is also a founding member of Get REAL California, a coalition of employers, labor groups, educators, and others concerned about CTE in California schools.

Bill Honig: Provide information for school improvement

Bill Honig

The first crucial question to be answered is: What is the purpose and context of the measurement? Is the emphasis primarily on using test-based results for state accountability and intervention for low performance? Or is measurement primarily used as part of a broader strategy to provide useful information to schools and districts to help continuously improve teaching and learning while still supplying information to the public about school success? This second strategy requires a shift in emphasis from penalties and interventions to building a sophisticated local and state infrastructure to support school-site team building, coaching, and professional development.

The former “test with consequences” strategy rests on the assumption that setting standards, testing results, and penalizing low-performing schools, by itself, will cause major improvements. This approach does produce some beneficial results, but because it neglects investment in building the capacity for growth, its overall effect has been found to be limited.[1] This strategy also engenders significant negative side effects such as narrowing the curriculum, lowering morale, and encouraging staffs to game the system. All the international world-class performers, as well as U.S. states such as Massachusetts and highly successful California districts such as Long Beach, Sanger, and the charter school network Aspire, have pursued the latter, more powerful, capacity-building strategy.

Gov. Brown has warned of the danger of over-relying on narrow high-stakes testing in his quest to broaden measurement and the way it is used. We should explore his suggestion that the state develop local peer review as one method of feeding back useful information to guide continuous improvement.[2]

The second key question is: What kinds of measurements help drive the system in the right direction? Relying too heavily on reading and math or on low-level multiple-choice tests has been problematic. It has motivated legislative leaders such as Sen. Steinberg to pursue legislation to broaden California’s Academic Performance Index both for accountability and for instructional feedback. The API is a useful measure, but I agree that it should be broadened and deepened:

  • API does test a broad array of subjects at the high school level and some at middle grades, but needs to cover history, science, civics, and art in a more profound way, especially at the elementary and middle-grade levels. This can be done in several ways. The weighting given to these subjects should be examined. Currently at the elementary level reading and math are weighted at 94%, science at 6%, and history at 0%. At middle grades it’s not much better – 85% reading and math, 7% history, and 7% science. These weights directly contribute to a narrow curriculum (see the sketch after this list).
  • The state needs to add history, art, and more science to the elementary level tests, or at least embed those subjects in the language arts and math sections of existing tests, and add civic understanding assessments to the high school level.
  • While the new tests for California being developed by the SMARTER Balanced group will move away from over-relying on multiple choice for reading and math, I would also add matrix sampling of those and other subjects to the individual tests so that a broader curriculum and deeper learning, such as the ability to write essays or develop a science project, can be assessed more efficiently.
  • At the high school level, one major change would be to explore how to hold schools accountable not only for the number of students meeting A-G requirements but also for how many students at least qualify to enter a tech-prep program at community colleges. The API would then reflect a broad range of outcomes: dropout rates, 4-year college-prep rates, tech-prep rates, and course performance. I would also add some measure for advanced students, such as the number of AP courses passed.
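
To make the effect of the subject weightings quoted above concrete, here is a minimal sketch; it uses a plain weighted average with the quoted percentages, not the actual API calculation, and the subject scores in it are hypothetical.

```python
# Simplified illustration of how subject-area weights shape a composite score
# like the API. This is a plain weighted average, not the actual API formula,
# and the subject scores below are hypothetical.

ELEMENTARY_WEIGHTS = {"reading_math": 0.94, "science": 0.06, "history": 0.00}
MIDDLE_GRADE_WEIGHTS = {"reading_math": 0.85, "history": 0.07, "science": 0.07}  # sums to 0.99 as quoted

def composite(scores, weights):
    """Weighted average of subject scores (0-100 scale) using the quoted weights."""
    return sum(weights[subject] * scores.get(subject, 0) for subject in weights)

# Hypothetical elementary school, uniformly middling across subjects:
baseline = {"reading_math": 60, "science": 60, "history": 60}
print(composite(baseline, ELEMENTARY_WEIGHTS))        # ~60.0

# The same school after a 30-point improvement in science AND history:
broader = {"reading_math": 60, "science": 90, "history": 90}
print(composite(broader, ELEMENTARY_WEIGHTS))         # ~61.8 -- barely moves
```

Under the current elementary weights, even a 30-point gain in science moves the composite by less than two points, and a gain in history cannot move it at all – which is the narrowing effect described in the first item above.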

Bill Honig began his career in education as an elementary school teacher before becoming a California State Board of Education member and district superintendent. He was elected in 1982 to serve the first of three terms as California Superintendent of Public Instruction. He subsequently published “Teaching Our Children to Read” (Corwin Press) and founded the Consortium on Reading Excellence (www.corelearn.com), which helps schools, districts, and states implement best practices in reading and math. He is a Bay Area native, father of four, and grandfather of five.

Jeff Camp: Schools must produce an economic return (broadly)

Jeff Camp

The success of schools must not be our primary concern. Schools, after all, are only a means to an end. The center of the proverbial target is simpler, but even more difficult: prepare EACH child for adulthood.

The effort to provide opportunity for each student is a costly undertaking, and public education is its biggest component. Spending on universal K-12 education in California adds up to about $65 billion annually when all the sources (state, local and federal) are counted. To put this number in human context, taxpayers in California invest on the order of $140,000 in each student’s thirteen years of K-12 education – roughly equivalent to paying about two dollars above minimum wage for every hour a student spends in class.
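
As a rough check on those figures, here is a back-of-the-envelope sketch; the enrollment, school-day, and minimum-wage numbers in it are illustrative assumptions, not figures taken from this article.

```python
# Back-of-the-envelope check of the per-student figures in the paragraph above.
# Enrollment, school-day, and minimum-wage values are assumptions for illustration.

total_annual_spending = 65e9          # ~$65 billion per year, all sources (from the article)
k12_enrollment = 6.2e6                # roughly 6.2 million K-12 students (assumption)
years_of_schooling = 13               # K through 12

per_pupil_per_year = total_annual_spending / k12_enrollment
per_pupil_k12_total = per_pupil_per_year * years_of_schooling
print(f"~${per_pupil_per_year:,.0f} per student per year")    # ~$10,500
print(f"~${per_pupil_k12_total:,.0f} over 13 years")          # on the order of $140,000

instructional_days = 180              # typical school year (assumption)
hours_per_day = 6                     # typical instructional day (assumption)
per_hour = per_pupil_per_year / (instructional_days * hours_per_day)
print(f"~${per_hour:.2f} per classroom hour")                 # ~$9.70
# With California's minimum wage at about $8.00/hour at the time, that works out
# to roughly two dollars above minimum wage for every hour in class.
```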

As with any big investment, success must be measured in terms of Return on Investment (ROI). Measuring success on these terms requires data about each student’s long-term success, viewed broadly and over a time frame spanning decades, not just school years.

What is the long-term economic payback on that $140,000 investment for each student? Today, we don’t really know. Evaluating the return requires estimating both value produced and costs avoided. Education produces value by helping each student find his or her place in the world, including work that earns enough to pay taxes. Education avoids costs by helping students grow into self-supporting, resilient and law-abiding adults. Our system is set up to track neither.
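
A schematic way to state that framing is below; the function is only a sketch of the decomposition into value produced and costs avoided, and every number passed to it is a placeholder, since the point of the argument is that the underlying data is not yet tracked.

```python
# Schematic ROI decomposition for the argument above. The inputs are placeholders;
# California does not currently track the long-term outcomes needed to fill them in.

def education_roi(value_produced: float, costs_avoided: float, investment: float) -> float:
    """Return on investment: (benefits - cost) / cost, per student over decades."""
    return (value_produced + costs_avoided - investment) / investment

# Purely hypothetical per-student figures over a multi-decade horizon:
print(education_roi(value_produced=250_000, costs_avoided=50_000, investment=140_000))  # ~1.14
```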

For the last decade, the Academic Performance Index (API) has been the dominant tool for summarizing a school’s performance in California. This score, distilled each year from a changing assortment of tests, serves as a shorthand metric of academic achievement at the school and district levels and by grade level. Unfortunately, the API only measures the academic success of those who show up. If every struggling student in a school were to drop out, the API score for that school would, perversely, rise.
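
A toy example makes that perverse incentive concrete; the scores below are hypothetical, and a plain average stands in for the actual API calculation.

```python
# Toy illustration: a school-level average computed only over students who remain
# enrolled rises when the lowest scorers drop out, even though no one learned more.

enrolled = [780, 720, 650, 600, 540, 500]          # hypothetical student-level scores
print(sum(enrolled) / len(enrolled))                # ~631.7 with everyone still enrolled

remaining = [s for s in enrolled if s >= 600]       # the two lowest scorers drop out
print(sum(remaining) / len(remaining))              # 687.5 after they leave
```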

The state’s system of measurement for education should be built online, in a manner that allows students to show what they know regardless of their nominal grade level. If this seems like whimsy, take a look at the coaching module of Khan Academy for an early example of what the future of measurement may look like, at least in math.

The public has grown accustomed to the idea that products and services should be evaluated, rather frequently, and that evaluation should lead to action. In order to sustain public support for investing in education, California needs to make a set of serious investments to systematically provide everyone involved with better, more personally useful information over a more meaningful arc of time. We rely too much on summary numbers partly because that is all we have at present. California should do better.

For starters, California should invest in modern data systems to track and support investments in human development including education. In the age of Facebook, it is no longer OK for California’s education system to operate with outmoded data systems.

California needs a platform that usefully connects parents, students and teachers, including accurate data to inform the work they do together. This is not an investment that each district can or should pursue on its own; it is far too difficult, much too important, and frankly its implications extend beyond education.

Jeff Camp chairs the Education Circle of Full Circle Fund, an engaged philanthropy organization cultivating the next generation of community leaders and driving lasting social change in the Bay Area and beyond. He is the primary author of Ed100.org, a primer on education reform options in California. Since leaving a career at Microsoft to work for education change, Jeff has served on multiple education reform committees including the Governor’s Committee on Education Excellence.

Sobering NAEP scores

Fresno Unified Superintendent Michael Hanson chose to have his district join San Diego, Los Angeles, and other urban districts participating in the biennial analysis of scores in the nation’s report card, the National Assessment of Educational Progress. The results for 2011, released this week, were unsettling.

In both reading and math, Fresno ranked among the lowest scoring of 21 urban districts, along with Detroit, Cleveland, and Washington, D.C. On one measure – the reading scores of fourth-grade Hispanic students, who comprise two-thirds of district students – Fresno was at the bottom.

Hanson doesn’t have second thoughts about his decision, nor does he distance himself from NAEP, as other superintendents and leaders in California sometimes do. They often claim that NAEP scores are unimportant, because the national test isn’t aligned with California’s academic standards – or with any particular state’s, for that matter.

“I put us in so that we would have a national benchmark independent of what we do in California,” he said yesterday. “It’s a sobering reminder of how far we have to go.”

NAEP 8th grade math scores for 21 urban districts in 2011.

Fresno Unified certainly has company. California ranked between 46th and 49th among the states this year in reading and math. And Los Angeles Unified’s scores are in the bottom third to half of the urban districts. Only San Diego Unified, the state’s second largest district, with fewer low-income children, was in the top third of urban districts.

What Fresno also has in common with Cleveland, Detroit and Washington, D.C., is poverty. Ninety-three percent of its students qualify for subsidized lunches, and, according to the latest Census figures, 44 percent of its children are in poverty – the highest rate among California’s districts with at least 25,000 students; Stockton is next with 37 percent. Nearly one in six adults in Fresno is unemployed.

Grinding poverty is not an excuse – it’s a reality. Between budget cuts to the district of nearly $100 million over the past few years and state cuts to mental health and other social supports, “it’s a pretty stiff wind we are sailing into,” Hanson said.

NAEP asked eighth graders how often they read for fun outside of school. The 40 percent of Fresno students who responded “very seldom or never” was the highest share among urban districts, where the average was 29 percent; only 10 percent said they read daily, compared with the 16 percent urban average – in itself alarming.

Fresno joined the Trial Urban District Assessment group for the 2009 tests. Fresno showed no significant improvement in either math or reading in 2011, as was the case with most urban districts. Only Charlotte’s scores rose in reading, and only four cities rose in 4th grade math (Atlanta, Austin, Baltimore, and Philadelphia) and six in 8th grade math (Atlanta, Chicago, Charlotte, Detroit, Washington, and Jefferson County, KY). However, Hanson, who’s president of the California Office to Reform Education – the nonprofit collaborative of the seven Race to the Top districts – says there have been encouraging measures of progress not detected by NAEP. The district’s graduation rate has increased 3 percentage points in three years; dropout numbers fell 9 percent; and the 46-point rise in API scores – a state measure – over three years was fifth out of a dozen high-poverty districts, according to a comparison group that the district measures itself against.

San Diego and Los Angeles, which have been in the urban assessment group since 2003, well exceeded the average increase for the nation and for large cities during that period in math (14 points for San Diego and 16 points for Los Angeles, on a 500-point scale in eighth grade, compared with seven points for the nation and 12 for large cities). But some of that growth came with a widening achievement gap, as higher-income and white students improved at a faster rate. In San Diego, the difference in 4th grade math scores between white and black students grew by a third, to 36 points on a 500-point scale, and the gap between higher- and lower-income students also grew by a third. In eighth grade reading, the gap between higher- and lower-income students in Los Angeles grew from 17 to 30 points from 2003 to 2009.

Here are the math results for the three districts from the 2011 NAEP. Proficiency rates on NAEP and California’s standardized tests can’t be compared because definitions differ (proficiency on NAEP is more rigorous, and to be categorized as advanced on NAEP is exceedingly difficult).

Demographics

Nationwide: 24% Hispanic, 52% white, 16% black, 5% Asian, 52% low-income, 22% English learners

Fresno: 66% Hispanic, 12% white, 9% black, 12% Asian, 93% low-income, 30% English learners

Los Angeles: 75% Hispanic, 9% white, 10% black, 5% Asian, 83% low-income, 34% English learners

San Diego: 44% Hispanic, 23% white, 12% black, 15% Asian, 65% low-income, 36% English learners

4th grade math

National average: 240 pts, 18% below basic, 39% proficient and advanced

Large cities: 233 pts, 26% below basic, 30% proficient and advanced

Fresno Unified: average 218 pts was 21st percentile for nation, 19th out of 21 urban districts; 44% below basic, 14% proficient and advanced

Los Angeles Unified: average 223 pts was 27th percentile for nation, 15th among urban districts; 37% below basic, 20% proficient and advanced

San Diego Unified: average 239 pts was 46th percentile for nation, 3rd among urban districts; 20% below basic, 39% proficient and advanced

White-Black gap: 25 pts for nation, 29 for large cities, 24 pts Fresno, 28 pts Los Angeles, 36 pts San Diego

White-Hispanic gap: 20 pts for nation, 23 for large cities, 25 pts Fresno, 24 pts Los Angeles, 29 pts San Diego

8th grade math

National average: 283 pts, 28% below basic, 34% proficient and advanced

Large cities: 274 pts, 37% below basic, 26% proficient and advanced

Fresno Unified: average 256 pts was 23rd percentile for nation, tied for 17th out of 21 urban districts; 57% below basic, 13% proficient and advanced

Los Angeles Unified: average 261 pts was 27th percentile for nation, tied for 16th among urban districts; 51% below basic, 16% proficient and advanced

San Diego Unified: average 278 pts was 44th percentile for nation, 6th among urban districts; 34% below basic, 32% proficient and advanced

White-Black gap: 31 pts for nation, 34 for large cities, 37 pts Fresno, 45 pts Los Angeles, 46 pts San Diego (the third largest gap among urban districts, behind Washington and Atlanta)

White-Hispanic gap: 23 pts for nation, 27 for large cities, 29 pts Fresno, 36 pts Los Angeles, 39 pts San Diego