An article, “Taking a Fresh Look at Achievement Data For School Districts 65 and 202” in the June 9 issue of the RoundTable analyzed how the low benchmark to “meet standards” on the ISATs gives a misleading picture of student achievement and growth over time. This article focuses on another aspect of the ISATs: the abbreviated version of the Stanford Achievement Test, tenth edition, that is embedded in the ISATs.
Illinois legislation that authorized the development of the Illinois Standards Achievement Test (ISAT) required that the State’s achievement reports also provide comparative data showing how Illinois students perform in relation to students nationwide. The ISAT ostensibly does this by incorporating an abbreviated version of the Stanford Achievement Test, tenth edition (SAT-10), into the reading, math and science portions of the ISAT.
In 2006 the Illinois State Board of Education (ISBE) also used the SAT-10 in an effort to equate the “meet standards” cut scores for the ISATs at the 38th percentile rank nationally. In theory, this would have meant that a student would have to be achieving as well as or better than 38 percent of the students in the nation to “meet standards” on the ISATs.
While sound in concept, this approach produces national percentile ranks on the ISAT/SAT-10 that are actually much higher than the national percentile ranks generated by the National Assessment of Educational Progress (NAEP) and the Measures of Academic Progress (MAP). In addition, two national studies have found that the cut scores to “meet standards” on the ISAT are set not at the 38th percentile rank nationally, but below the 25th national percentile.
ISBE could not respond to the RoundTable’s questions asking why the SAT-10 that is embedded in the ISAT (the ISAT/SAT-10) generates higher results than both NAEP and MAP.
Moreover, in response to a freedom of information request, ISBE said it has no studies which analyze whether the ISAT/SAT-10 generates reliable normative data on a national basis.
For these and other requests for information, ISBE referred the RoundTable to Pearson, the company that owns the SAT-10. Despite requests by the RoundTable, Pearson did not respond.
NAEP and MAP Results Compared to ISAT/SAT-10
The National Assessment of Educational Progress (NAEP) is given every two years to a representative sample of students in fourth and eighth grades from every state. The test, which covers reading and math, is sponsored by the U.S. Department of Education and is often referred to as the “Nation’s Report Card.”
The achievement of Illinois students on NAEP has more or less mirrored student achievement patterns nationwide. For example, the percent of Illinois fourth- and eighth-graders who scored at or above the nation’s average (the 50th national percentile) in reading and math on the 2009 NAEP ranged between 49% and 53%. Using that measure, Illinois students performed just about the same as all other students in the nation.
At the RoundTable’s request, Paul Zavitkovsky, of the Urban School Education Leadership Program at the University of Illinois-Chicago, analyzed the percentages of Illinois students who scored at the 25th, 50th and 75th percentiles on fourth- and eighth-grade NAEP exams in 2003, 2005, 2007 and 2009. What he found was that, “at those three points on the distribution, reading and math achievement for the State of Illinois and all students tested nationwide have been virtually indistinguishable since 2003.”
The analysis further indicates that Illinois students perform about the same on NAEP as all other students in the nation.
By contrast, the ISAT/SAT-10 produces national percentile rankings that are typically 15 to 24 points higher than NAEP, the Nation’s Report Card.
Mary Fergus, a spokesperson for ISBE, told the RoundTable in November 2009 that 30 questions from the full version of the SAT-10 are included in both the reading and math portions of the ISATs. She said these questions are selected from the full-length version of the SAT-10 form by Pearson and ISBE “as an abbreviated version that conforms to SAT 10 rules.” The number of questions answered correctly is used to obtain a score, she said, which is then “equated to those from the full-length form.” The score is then used “to look-up the percentile rank from the 2002 norms table.”
The “2002 norms table” was developed based on the test results of a sample of students in 2002.
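The two-step process Ms. Fergus describes — equating a raw score on the abbreviated form to a full-length SAT-10 score, then looking up a national percentile rank in the 2002 norms table — can be sketched as follows. The equating map and norms table below are invented for illustration; Pearson’s actual tables have not been made public.

```python
# Hypothetical sketch of the score-to-percentile process ISBE describes.
# Both lookup tables below are illustrative assumptions, not Pearson's data.

# Step 1: equate a raw score on the 30-question abbreviated form
# to an equivalent scaled score on the full-length SAT-10 form.
EQUATING_TABLE = {  # raw score -> hypothetical full-form scaled score
    18: 610, 19: 618, 20: 627, 21: 636, 22: 645,
}

# Step 2: look up the national percentile rank for that scaled score
# in a (hypothetical) 2002 norms table. Each entry pairs the highest
# scaled score in a band with that band's national percentile rank.
NORMS_2002 = [  # (scaled-score upper bound, national percentile rank)
    (600, 30), (615, 38), (630, 50), (645, 62), (660, 73),
]

def national_percentile(raw_score: int) -> int:
    scaled = EQUATING_TABLE[raw_score]           # equate to full-length form
    for upper_bound, percentile in NORMS_2002:   # scan norms table in order
        if scaled <= upper_bound:
            return percentile
    return 99  # scaled score above the table's top band

print(national_percentile(20))  # raw 20 -> scaled 627 -> 50th percentile
```

Whether the resulting percentile ranks are accurate depends entirely on how well the equating holds up and how representative the 2002 norming sample was — the questions at the center of this article.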
The RoundTable subsequently asked ISBE, in an e-mail and then in a Freedom of Information Act request, for a copy of the “rules” that govern the selection of questions for the abbreviated form of the SAT-10 embedded in the ISAT, documents showing how the questions conformed to the “rules,” and documents showing how a score on the abbreviated form is “equated” to a score on the full-length version. ISBE told the RoundTable it did not have copies of these documents and said to contact Adam Gaber, the media contact person for Pearson, for the information. Despite requests, Mr. Gaber did not provide the information to the RoundTable.
The table below, prepared by Mr. Zavitkovsky, shows the percent of fourth- and eighth-graders who scored at or above the 50th percentile nationally on the 2009 NAEP in reading and math, and the percent who scored at or above the 50th percentile on 2009 ISAT/SAT-10:
% Illinois Students Scoring At or Above the 50th Percentile
on NAEP and ISAT/SAT-10

2009          NAEP   ISAT/SAT-10   Diff.
4th Reading   50%    73%           23
4th Math      49%    70%           21
8th Reading   53%    71%           18
8th Math      51%    75%           24
The ISAT/SAT-10 shows much higher percentages of students scoring at or above the national average than the NAEP does. Mr. Zavitkovsky says, “The NAEP reflects a broad national sampling conducted every two years by the national Institute of Education Sciences. The SAT-10 is based on a sample population that’s renewed only once every ten years by a single commercial publisher. When there’s a conflict, I think there’s good reason to privilege the NAEP.”
Mr. Zavitkovsky’s conclusion is supported by an additional analysis he conducted that compares District 65’s ISAT/SAT-10 results with results from the District’s Measures of Academic Progress (MAP) test. District 65 uses MAP results to inform instruction and to evaluate the effectiveness of its classroom teachers. The table below, prepared by Mr. Zavitkovsky, shows the percent of sixth-, seventh- and eighth-graders in District 65 who scored at or above the 50th percentile nationally on the MAP (given in the fall of 2008), and the percent who scored at or above the 50th percentile nationally on the 2009 ISAT/SAT-10 (given in the spring of 2009):
% D65 Students Scoring At or Above the 50th Percentile
on MAP and ISAT/SAT-10

2009          MAP    ISAT/SAT-10   Diff.
6th Reading   65%    79%           14
6th Math      67%    85%           18
7th Reading   70%    83%           13
7th Math      72%    86%           14
8th Reading   67%    91%           24
8th Math      72%    80%            8
As with NAEP comparisons, scores generated from the ISAT/SAT-10 show much higher percentages of students scoring at or above the national average than the scores produced by the MAP.
A year after ISBE used the SAT-10 in an effort to equate ISAT cut scores at the 38th percentile, the Northwest Evaluation Association and the Thomas B. Fordham Institute published a nationwide study of state testing standards called “The Proficiency Illusion” (2007). This study found that Illinois’ benchmarks for eighth-grade reading and math on the 2006 ISATs were actually set at the 22nd and 20th percentiles nationally, using national norms from the MAP as a yardstick.
A second study, “Mapping State Proficiency Standards Onto NAEP Scales: 2005-2007” (2009), was conducted by the National Center for Education Statistics and Institute of Education Sciences at the U.S. Department of Education. The study concluded that the ISAT benchmarks for meeting standards in eighth-grade reading and math corresponded to NAEP scale scores of 236 and 251 respectively, both of which were below the 25th national percentile.
Both studies found that the cut scores to “meet standards” for eighth-graders on the ISATs were well below the 38th percentile.
In its Freedom of Information Act request, the RoundTable asked ISBE for any study or analysis that specifically reports on, analyzes or discusses whether the ISAT/SAT-10 generates reliable normative data on a national basis. In addition, the RoundTable asked ISBE in an e-mail if it knew of any research-based reason why the ISAT/SAT-10 generates higher national percentile ranks than either NAEP or MAP.
ISBE responded that it did not have any studies or analyses concerning the reliability of the ISAT/SAT-10, and it did not provide a reason why the ISAT/SAT-10 generated higher national percentiles than NAEP or MAP. In both cases, ISBE referred the RoundTable to Mr. Gaber at Pearson; he did not respond to the RoundTable’s requests.
Meanwhile, ISBE continues to report results from the ISAT/SAT-10 as the state’s sole indicator of how students in grades three through eight are achieving compared to all students nationally.