February 17, 2010

Students Reject Web Evaluations


One year ago, the College of Agriculture and Life Sciences and the School of Industrial and Labor Relations both made the results of their end-of-semester course evaluations available online. Now, students and teachers weigh in on the vastly different response rates that the colleges’ divergent approaches to conducting these evaluations have produced.

“Course evaluation is a messy thing since every college has latitude to do what they want,” said Mike Hammer, director of data management for the College of Engineering.

CALS and ILR have taken two different approaches to conducting evaluations and publicizing their results. CALS asks students to voluntarily complete evaluations online, although professors still have the option of administering an in-class evaluation by hand, said Prof. Donald Viands, plant breeding and genetics.

ILR students, on the other hand, are not given an online option. They must fill out two different evaluations in class, one created by the professor or the department and the other designed by the school’s Student Government Association, said Heather Levy ’10, SGA president.

The SGA evaluation questions, which students can answer on a one-to-five scale, address topics such as the effectiveness of the professor, the relative emphasis on class participation and the relevance of readings and assignments. Although using this evaluation is voluntary, a majority of professors have shown a willingness to administer it to students in their classes, Levy said. She added that the number of professors involved has increased from nine to 14 since the SGA first started issuing the questionnaire.

“We try and address the top 10 questions that students not only want the answers to but that they can also answer quickly,” Levy said.

In CALS, professors administer the online evaluation unless it is their first time teaching the course. But according to Viands, the numerical results that are available online are geared toward information that faculty and department chairs are seeking, rather than information to guide students.

These differences have resulted in a more consistent response rate for the ILR evaluation because nearly all students complete it unless they are absent from class, said Levy. The CALS online response rate, in comparison, is only around 50 percent, said Prof. George Boyer, labor economics.

Because of this statistic, the CALS Faculty Senate recently decided to exclude the evaluation results of courses receiving less than a 60 percent response rate, Viands said.

“It’s rather bizarre,” said Boyer. “We have this weird combination of students who want to see the results and then not do the [online] evaluation.”

Amanda Hill ’12 said that completing the evaluations outside of class is an inconvenience for her and that she was unaware that CALS posted the results of the evaluations online.

This lack of awareness may stem from the fact that CALS has not actively publicized the public evaluation site, which also includes a copy of each course syllabus, although a link is posted on the CALS website, said Shawna Lee Lockwood ’05, assistant director of the CALS registrar’s office.

Despite this lack of promotion, the CALS site has attracted many students. It received around 5,200 hits in its first year from about 2,200 different students, according to Lockwood. In comparison, the College of Engineering has provided its evaluation results online for the past eight years, and its site receives around 10,000 hits a year, Hammer said.

ILR, in contrast, e-mailed all students taking courses through the school at the beginning of the semester with a link to its evaluation results and an explanation of the site’s goal, said Levy. The ILR website has received a lot of positive feedback from current students choosing classes for next semester and from prospective students who appreciate “having access to an upperclassman network,” Levy said.

Katie McDonough ’12 said that sending out a mass e-mail was probably more feasible for ILR because the school is much smaller than CALS.

“We just don’t want to flood [CALS] students with e-mails,” said Lockwood. “It might be something we want to do in the future, but we’ve mostly been relying on word of mouth.”

CALS and ILR have both decided to exclude open-ended responses from the evaluation results posted online because the feedback is often rude or irrelevant and may generate legal issues that the school does not want to handle, according to Viands.

“There might be a rare comment that a faculty member feels is slanderous and that Cornell could be sued over,” said Viands, “and unless you have someone screening all of the comments then you shouldn’t do it at all.”

Students have reacted differently to this omission. McDonough said that she would not view the open-ended responses even if they were available because she believes most students do not take enough interest in the evaluations to write something helpful.

On the other hand, Hill said that if CALS were to provide open-ended responses, she would use them and take the time to contribute her own ideas.

“If I have real comments that’s where I’m going to put them because the bubble-in sheets don’t leave room for us to express our opinion,” she said.

The SGA may look into adding open-ended response questions to the ILR evaluation in the future, Levy said.

But she worries that open-ended questions may exclude students who do not have strong opinions about the class, even though their responses are still important.

“You have people who love the class and people who hate the class, but people who are in the middle may not even write a response,” said Levy, “but everyone’s going to circle a number and that’s what’s going to make the [ILR evaluation website] an important and accurate resource for current and future students.”

Original Author: Samantha Willner