In 2006, The University began conducting and storing the results of teacher evaluations online - a move that paved the way for making the results available to students. But the move also caused student participation to fall off dramatically, leading some instructors to question the validity of the evaluations.
"I'm disturbed by the fact that with the system being online, simply whoever's interested can go and make a response instead of the old system of pencil and paper with everyone in the classroom," said Michael Leff, chair of communications.
Traditionally, students filled out paper evaluations in class near the end of the semester, and the evaluations were given to the instructors and their department chairs. But in a 2006 move to save money, The University switched to an online format. Paper-based evaluations cost about $60,000 per year, while online evaluations could be completed for about $10,000.
But an unintended consequence of the move online was decreased participation. In the fall of 2004, 55 percent of students completed instructor evaluations. But by fall 2007 student participation had fallen to 29 percent.
Jim Redmond, chair of the journalism department, said he disregards the results of the evaluations due to low student participation.
"There may be only two responses from a class of 35, and they're both negative, so what?" Redmond said. "We need input from the average student, not two outliers in a class of 35 ... Only students who either love or hate a teacher fill them out."
Faculty Senate member John Petry studied the effects of the shift to an electronic format. He said some of the concerns from faculty members are unfounded.
"There have been questions of whether these things are used or not, but in all fairness, they should be," Petry said. "There was some concern from the senate that the electronic would have lower participation and therefore only the students who didn't like the class would fill out the surveys, but that has not been found to be the case."
University of Memphis Provost Ralph Faudree said studies have shown that while student participation may be lower, the results of the evaluations are still credible.
"This idea that if it's voluntary, only the extremes - only the people who love it or hate it will fill them out - that is not borne out by studies," Faudree said.
But despite assurances from the administration, some faculty members have suggested ways to increase participation, such as making evaluations a registration requirement or offering incentives like letting students view their final grades early if they complete course evaluations. In the past, The University has entered students in drawings for free prizes, like iPods, if they filled out evaluations.
But the information students can get from compiled evaluations isn't perfect either. The evaluations, known as SETEs, are not mandatory, so fewer and fewer students fill them out each semester. Many administrators and faculty members claim only outliers - students who were either angry or very pleased with a class - bother to fill them out.
To improve reliability, Redmond suggested making the evaluations mandatory, perhaps as an addition to the registration process.
But Richard Ranta, dean of the College of Communication and Fine Arts, disagreed, likening a mandatory evaluation process to a dictatorship.
"It's like communism, where everybody votes or you get shot," said Ranta, whose college had the lowest rate of SETE participation at The University last fall.
But many students said they'd be more than happy to fill out the forms if it meant the administration would pay more attention to their complaints.
"They're no big deal. I'd be fine with filling them out," said Brandon Belk, freshman chemistry major.
Taylor Marshall, junior nursing major, said a low participation rate shouldn't be used as an excuse for suppressing student comments.
"I think everyone should have an equal voice, whether two people fill them out or an entire class does," Marshall said. "Those two people put time into those, and their opinion should be acknowledged. Otherwise, why should any of us even bother?"
Richard Warder, dean of the College of Engineering, also disagreed with categorically dismissing evaluations because of low average participation, saying that in many situations, a high turnout in a single class can add credibility to the survey.
"If everyone in the class comes in and they laud the professor or everyone in the class has major concerns, that tends to be more credible evidence," Warder said.
Ranta defended the CCFA, saying many students may not fill out evaluations because they're fairly happy with their institution and don't have any reason to do them.
But Warder, whose college has the highest participation rate on campus, offered a different explanation. He said students would be much more willing to participate and be candid if they believed their instructors would take their comments seriously.
Statistics suggest Warder might be right. In more than 20,000 SETE responses, approximately 85 percent of students said they thought their comments would be taken seriously.
Louis Franceschini, a statistician in the College of Education, said the number of responses, even though some see it as low, could still constitute a representative enough sample to yield valid data.
"I can't, from a statistical standpoint, say that 30 percent sounds good," he said. "But who's to say that 30 out of 100 students isn't representative?"
Even though overall SETE participation is down, some groups have maintained consistent response rates. The College of Engineering had a 50 percent completion rate in fall 2007. The Audiology and Speech Pathology program, a small program with only 325 classroom seats filled, had the highest rate at 71 percent.
Warder said instruction is one of his college's top priorities and that high student participation in teacher evaluations is a result of that.
"I think those numbers reflect the attitude that our faculty have towards providing good instruction," Warder said. "As a college, I'm really proud of our faculty because I think they take student advising and classroom instruction very seriously. That may be the reason why the participation is better."
Sarah Lowder, a graduate speech pathology student, credited the tight-knit nature of students in her program and the faculty's receptiveness to criticism for their high participation rate.
"The professors usually ask us to fill them out," Lowder said. "But for the most part, we feel like our opinions really do make a difference for the classes after us."
Lowder said her program takes two years, and there are only 21 people in her graduating class. Graduate students in the program get to know the first-year students well, she said, and are fairly critical of their instructors in hopes of improving the program for the younger students. She said the students feel the faculty is receptive to the feedback.
Some faculty members said this approach could spread to the rest of The University. Faudree said that by making evaluation records available to students as a resource for picking classes, more students would contribute because they would know the information is useful to other students.
"In talking with other universities that have made more information available, is that ... it does tend to increase the level of participation because the students feel like the information is going to be used and for them, they are much more willing to participate in the survey," Faudree said.
Ann Harbor, director of academic affairs administration, said The University plans to adopt a system similar to one at the University of Mississippi, which provides a "carrot" for students to fill out their evaluations.
"Once the faculty member posts a grade, the online system will do a search for the students who filled out their evaluation for that section," she said. "The student can then see his or her grade for that section, instead of waiting the eight or nine days for grades to become official."
Currently, students at The U of M can view a limited set of SETE statistics online through the Spectrum Portal, but only instructors and administrators are able to view student comments. Faudree said concerns from the Faculty Senate played a part in the decision to omit student comments from SETE reports.
"I think the faculty senate has some concerns that if you put all the comments out, they will cause faculty members to change how they grade and things like that to try to get better evaluations. And therefore not show the same requirements and high expectations that they should," Faudree said.
But while some administrators fear instructors will resort to giving away grades in order to get high evaluation scores, others supported making the entire evaluations - including comments - available to students.
Faudree said that even though participation is down, studies have shown that better students are the ones filling out evaluations.
"When you go voluntary, you tend to get higher performing students participating," Faudree said. "If you look at the grade point averages, A students fill them out at a much higher level than B's C's and so forth."
During his research, Petry said he found some students may use evaluations to choose classes based on the rating of the instructor, but there are other factors that play a much bigger role - namely, time and place.
"At Harvard, they instituted a system of public evaluations, and they found that students still went to the earliest class at the closest building," Petry said. "If we identify the outstanding professor, and we schedule him for 2 in the afternoon, most people won't go to that class. The majority of people want to go to class from 8 a.m. to 1 p.m."



