The District 202 School Board learned at its June 13 meeting how the Evanston Township High School administration will evaluate the new Freshman Humanities program over the next four years, assessing components such as rigor, curriculum, teacher and student supports, student performance and course selections. Board members engaged in discussion, at times contentious, about whether specific numerical goals should be assigned to the evaluation.
The Assessment Plan
Dr. Judith Levinson, director of Research, Evaluation and Assessment, and Dr. Carrie Livingston, senior research associate, told the Board their research will address six questions:
• Is the new curriculum rigorous and aligned with the Illinois Common Core Standards?
• Is the curriculum implemented with fidelity?
• Do teachers have enough support to fully implement the curriculum?
• Are the academic support structures aligned with the new curriculum?
• Do students in the course perform the same or better over time than previous cohorts of students?
• Are more students, particularly minority and low-income students, enrolling in honors and AP English and History classes over the course of their high school career than previous cohorts?
According to administrators, the new curriculum, to be developed this summer, will be evaluated this year for rigor and alignment with the Common Core standards by several outside consultants, among them Ann Johnson of Curriculum21 and Marilynn Kuliecke of Pathways Learning Group.
Dr. Levinson explained that the curriculum will also be evaluated for fidelity of implementation — that is, administrators will verify that what the curriculum is designed to teach is actually what is delivered in the classroom. Classroom visits, teacher surveys, administrator interviews and feedback from a group of “Critical Friends” will take place over the next two years, and an additional teacher survey is planned for the 2013-14 school year to provide data for evaluating fidelity of implementation. Similar analyses, inputs and time frames will be used to evaluate professional development for Freshman Humanities teachers.
Another area of evaluation will be the alignment of support structures such as Project EXCEL, AVID, STAE and Freshman Reading to the new curriculum. External experts will provide a review of this alignment in the first year; teacher and student surveys and achievement data will be analyzed in the first three years of implementation.
Over the four-year evaluation period, administrators will assess whether student performance has improved and whether more students – especially minority and low-income students – are receiving earned honors credit and are enrolling in honors and AP courses in their later years in high school. Dr. Livingston explained that each successive cohort of students that takes the Freshman Humanities course, beginning with the 2011-12 school year, will be followed throughout their high school career to document what courses they elect to take and how they perform in them. Other measures of student achievement included in the analysis will be graduation and college acceptance rates, as well as performance on standardized tests.
Finally, Dr. Livingston said, students will be surveyed each year about their satisfaction with courses. In the first year, student focus groups will be held to elicit ideas and suggestions to improve the program.
Board discussion centered on whether enumerated goals should be set to determine whether the program was meeting a certain level of performance, whether data should simply be collected and compared against previous experience, or some combination of the two. No decision was reached by the end of the meeting, but many perspectives were voiced.
“I’m disturbed by a lack of numerical goals,” remarked Board member Deborah Graham. “It will be difficult to evaluate success [without them].”
Dr. Levinson explained that it was not up to her, as the program evaluator, to set the goals. “You’re talking about Board goals,” she said. “That wouldn’t be something that program evaluators do.”
“I think the community will have greater comfort if we set numerical goals and we meet them or we don’t meet them,” Ms. Graham said. “We need to determine what success means.”
Board member Jonathan Baum agreed with Ms. Graham.
“How are we going to demonstrate that this is a success?” he asked. “Our credibility is on the line. … We have to have a measure of success.”
Board President Mark Metz did not object to the idea of setting numerical targets, but said that to do so at this point was “premature.”
“For the time being, we’re on the right track,” he said. “First, we’ve got to make sure we do it right – then we can evaluate it. Any number we set now will be arbitrary.”
Gretchen Livingston supported the idea of quantifying the success of the program.
“My view is that we need some sort of target,” she said. “This goes to my previous complaint about our goal-setting process in general. … We sit back and then we let things unfold. … [The] danger is if we are ascribing success in an after-the-fact fashion.”
Mr. Metz agreed. “I’m not against putting numbers on this if we can figure out a way that makes sense,” he said.
Vice President Martha Burns presented another perspective. “We have to begin thinking about this in a different way – they are all on par – this is seeing how everyone in the class is doing [together],” she said. “I don’t see where the quantitative measures come in right now. … I am very concerned about making sure that rigor is in these courses [and] that the best teachers are teaching these classes. I am concerned about whether students are having a great or poor experience in the classroom.”
Board member Rachel Hayman expressed a similar perspective. “[The new Humanities program] had to do with removing barriers for a large number of students. [Making honors work] accessible – how do you measure that?” Ms. Hayman remarked.
“I am all in favor of measurable results and good metrics,” said Mr. Metz. “We’re looking at exactly the right things. What we’re going to get is a result. We’re going to be able to compare that result to what we got before we started doing things in this new way. Then we can see where that is, and then we can begin setting metrics for improving that result incrementally.”
“Typically … in good research you have a comparison group, which is what we’ve put in place here,” said Dr. Levinson. She explained that the results from the new program will be compared with the results from the old program and the question asked, “Is it the same or better?”
Board members agreed that the topic needed more discussion and that they would address it again at a future meeting.