A funny thing happened on the way to the forum

My literary betters, the likes of Democrat vice presidential candidate U.S. Sen. Joseph Biden and fellow Democrat traveler and Harvard overseer Doris Goodwin, can plagiarize far more skillfully than I, but my borrowing of the movie and stage play title "A Funny Thing" to headline this column refers not to the ancient Roman Forum, but rather to a 2007 colloquium of educators in Nashville. The educators were on hand to discuss a less-than-favorite subject in public education circles. Unlike the Scopes trial in Tennessee some four score years earlier, the official educator forum in the Volunteer State in June 2007 went unpublicized and therefore widely unnoticed; I knew nothing of it until a few weeks ago, when I contacted the U.S. Department of Education for an explanation of a puzzling subject in public education: the substantial discrepancy in student achievement test scores between the federal NAEP tests and all the state-preferred local tests.

Since World War II, the feds have been sending the public schools a tiny fraction of the taxpayer money they collect, and the predictable federal strings have accompanied the dollars; in this case, a demand, since 1969, that a sample of students be tested every year in every state in various subjects, primarily math and reading, in grades 4, 8, and 12. Until recently, almost no one paid much attention to the National Assessment of Educational Progress, or NAEP, tests, even though the resulting student test scores were uniformly quite dismal: typically in the low 200s out of a possible 500, meaning that about two-thirds of all test-takers couldn't make "proficient" (the three categories are basic, proficient, and advanced) and couldn't, therefore, function at grade level.
All that began to change in 2001 with the adoption of Public Law 107-110, better known as No Child Left Behind, or NCLB, and not because of the federal testing, which had been in place and unprotested for some 32 years, but because of a line deep within the new law requiring that almost all students be proficient by 2014; states must now demonstrate via test results that their students are making Adequate Yearly Progress toward that goal. As you may have noticed, we're halfway there, calendrically, but not test-score-wise. Time is running out; each additional year of stagnant NAEP test scores makes it statistically more improbable that students will improve achievement enough for virtually all to be proficient in only six more years.

Such an intractable problem calls for a conference, or, if you prefer, a forum. And then a funny thing happened on the way to the forum (or maybe once there, quisnam teneo? who knows?): a solution to the problem was discovered, or created, or invented. It can be found on pages 8 and 9 of the conference, oops, Forum, report, "An Explanation for the Large Differences Between State and NAEP Proficiency Scores Reported for Reading in 2005."

Let's take page 8 first, wherein is quoted a federal document from 2004, "Standards and Assessments Peer Review Guidance: Information and Examples for Meeting Requirements of the No Child Left Behind Act of 2001," and the quote comes in two parts. The first part says, "The Proficient achievement level represents attainment of grade-level expectations for that academic content area." The second part says, "We remain committed to ensuring that all students can read and do math at grade level or better by 2014. That is the basic purpose and mission of the No Child Left Behind Act." Or, in four little words, proficiency is defined as functioning at grade level. But then, a funny thing happened. On page 9 it says, "NAEP's definition of Proficient is not bound by grade-level expectations or proficiency in a subject."
In one brilliant etymological stroke, the Forum-goers solved the NAEP student-achievement proficiency shortfall, simply by re-defining proficiency. Wow.

Confirmation of this not-merely-proficient-but-advanced exercise in semantic re-definition comes from the Office of the State Board of Education in Utah, where a memorandum, "Using NAEP to Compare States or to Confirm State Test Results," explains that henceforth, "The percent of students at or above Basic is the most appropriate NAEP statistic for confirming State reports of Adequate Yearly Progress expressed in terms of the State's percentage of students scoring Proficient or higher." In plain English, the Forum-goers decided that Basic on the NAEP test equates to Proficient on any state-preferred test. Now, all students have to do is make "basic" on their NAEP (or local equivalent) score, and they'll be counted as proficient, as the 2014 deadline requires.

More next week.

Former Vermonter Martin Harris lives in Tennessee.
