One goal of the District 65 School Board is to increase the percentage of students who make “expected gains” on the Measures of Academic Progress (MAP) test. On Dec. 9, Kylie Klein, Director of Research, Accountability and Data, recommended that the District change the method it uses to determine if a student has made “expected gains” on the MAP test.

Under the model developed by the Northwest Evaluation Association (NWEA), the owner of the MAP test, the “expected gain” for students during a school year is the “average growth” of all students in the nation who are in the same grade and who started the year at the same achievement level.

NWEA determined what it considers to be the average growth of all students in the nation, by grade and achievement level, in a 2015 study, which was based on a random sample of student test records and a post-stratification adjustment. The average growth is shown in, or can be determined from, various tables published with the study.

For example, the norm tables show that fourth graders who had a RIT score of 206 in reading on the spring MAP test scored, on average, 212 in reading as fifth graders on the spring MAP test. The expected gain between fourth and fifth grade for students who started out with a RIT score of 206 is thus 6 RIT points (212 − 206).

District 65, though, added an extra requirement. Ms. Klein said that under the District’s model a student’s gain must meet the expected gain under NWEA’s model plus the sum of the standard errors on the pre-test and the current test. In January 2016, Peter Godard, then the District’s Director of Research and Evaluation, said the District added this requirement because it did not want to give itself credit for a student having made gains if the apparent gains could merely reflect the standard errors of the pre-test and the current test.
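The two criteria described above can be sketched in a few lines of code. This is an illustrative sketch only: the function names are hypothetical, the NWEA norm tables are not reproduced here (the norm post-test score is passed in directly), and the standard errors in the example are assumed values, not actual MAP figures.

```python
def nwea_expected_gain(pretest_rit: float, norm_posttest_rit: float) -> float:
    """NWEA model: the expected gain is the norm-table average growth for
    students who started at the same achievement level, i.e. the difference
    between the norm post-test score and the pre-test score."""
    return norm_posttest_rit - pretest_rit


def met_expected_gain_district(pretest_rit: float, posttest_rit: float,
                               norm_posttest_rit: float,
                               se_pre: float, se_post: float) -> bool:
    """District 65's former criterion, as described by Ms. Klein: the actual
    gain must meet the NWEA expected gain plus the sum of the standard
    errors on the pre-test and the current test."""
    actual_gain = posttest_rit - pretest_rit
    threshold = nwea_expected_gain(pretest_rit, norm_posttest_rit) + se_pre + se_post
    return actual_gain >= threshold


# The article's example: pre-test RIT 206, norm post-test RIT 212,
# so the NWEA expected gain is 6 points. Assume hypothetical standard
# errors of 3 RIT points on each test.
print(nwea_expected_gain(206, 212))                     # 6
print(met_expected_gain_district(206, 212, 212, 3, 3))  # False: gain 6 < 6 + 3 + 3
print(met_expected_gain_district(206, 218, 212, 3, 3))  # True: gain 12 >= 12
```

The example makes the effect of the extra requirement concrete: a student who exactly achieves the NWEA expected gain of 6 points falls short under the District’s stricter threshold of 12.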

The extra requirement made it more difficult to make expected gains.

On Dec. 10, Ms. Klein recommended that the District discontinue using the extra requirement. She said some students who made their expected gains under NWEA’s measure were reclassified as not having met their expected gains because of the extra requirement. The extra requirement, she said, most often affected students at higher achievement levels and students in higher grades, and it reduced the percentage of students making expected gains under NWEA’s measure by 10 percentage points for math and 20 percentage points for reading.

She added that because a higher percentage of students at the higher achievement levels were white, multi-racial and Asian, a higher percentage of those subgroups were categorized as not meeting expected gains because of the District’s added requirement.

In response to questions, Ms. Klein said that eliminating the District’s requirement would show higher percentages of all subgroups making expected gains. As an example, she said that 36% of black students met expected gains using the District’s approach, but that 51% would meet expected gains if the District’s added requirement were eliminated. She said, though, that the data would show a larger gap between the percentages of white and black students making expected gains.

Ms. Klein gave additional reasons for eliminating the District’s requirement, including that it caused confusion and that the calculations took time and resources. She emphasized that the reclassifications of students were not provided to individual students or their families.

Ms. Klein said she would present multi-year data showing student growth using NWEA’s model without the District’s additional requirement so the Board could see how growth was progressing over time using a consistent measure.

Board members discussed the proposed change and ultimately appeared to accept Ms. Klein’s recommendation. No vote was taken.