“Are your instructors good teachers?”
That’s it. That is the question that is apparently going to define Stony Brook University. The Statesman published an article with the headline “Stony Brook Ranked Fifth Worst Professors By Princeton Review,” but it failed to provide proper context or acknowledge the flaws in the rankings.
The article mentions that the rankings were based on student surveys, but what did those surveys entail? First of all, the student survey was 80 questions long. The average student’s attention span is about 15 minutes long, according to opencolleges.edu. Now, assuming you answer at the pace of a question every half minute, that survey would take about 40 minutes to complete. I don’t know about you guys, but I don’t have time for that.
The second problem with the article is that it didn’t properly attribute information, which misleads readers about how the survey was conducted. The author mentions that “for the professorial rankings, students were asked how strongly they agreed with the following statement: Professors are interesting and bring their material to life.” This phrasing suggests that this was the actual question asked, when in reality, that information came from David Soto, the co-author of the compilation.
In addition, this doesn’t match what the Princeton Review’s own website says, which is that “Are your instructors good teachers?” was the actual question asked. But regardless, who at the Princeton Review thought of using such a vague, horribly subjective question as “Are your instructors good teachers?”
There are complexities to every situation, and I’d say that 90 percent of my professors at Stony Brook University are very solid teachers. Some weren’t great people and some were even worse communicators, but most were solid instructors.
The third issue I take is that, fundamentally, college rankings are flawed, especially ones based on student surveys. The article mentioned that Soto believes students are the best gauges, but no, we aren’t. Alumni are the best gauges. People with no vested interest are good gauges; students are inherently stressed, so polling only current students does nothing to further the point. I know there have been plenty of times when I’ve been upset with a professor for most of the semester, only to realize over break that they taught me a great deal.
To further that point, college rankings, especially ones like this, don’t ask everybody how they feel. I most definitely did not get a memo from the Princeton Review asking me to fill out this survey, and I am going to assume the majority of students attending Stony Brook didn’t either. So, who is to say this random grouping of Stony Brook students is worthy of this attention? Actually, better yet, let us know the number of people who answered the survey. A simple sample size calculator, run by surveysystem.com, tells us that in order for a population of 25,000 (Stony Brook University) to be accurately polled with a 5 percent margin of error and a 95 percent confidence level, there would need to have been 378 participants in the Princeton Review survey. Simply put, we have no idea if they even got close to asking the right number of people.
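For readers who want to check that figure themselves, the standard Cochran sample size formula with a finite-population correction, which is what online calculators like surveysystem.com’s typically use, can be sketched in a few lines. The function name and the worst-case assumption p = 0.5 below are mine, not the calculator’s:

```python
def required_sample_size(population, margin_of_error, z=1.96, p=0.5):
    """Cochran's sample size formula with a finite-population correction.

    z = 1.96 corresponds to a 95 percent confidence level; p = 0.5 is the
    worst-case (most conservative) assumed response proportion.
    """
    # Required sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    # Shrink the requirement to account for the finite student body
    return round(n0 / (1 + (n0 - 1) / population))

# Roughly 25,000 Stony Brook students, 5% margin of error, 95% confidence:
print(required_sample_size(25000, 0.05))  # → 378
```

Note how sensitive the result is to the margin of error: tightening it from 5 percent to 4 percent pushes the requirement from 378 up to 586 respondents.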
The last big problem with the article is that it doesn’t ask enough professors how they feel. There are 1,000 sides to every story, but often only two that matter. In this case, as a reader, I would have liked to see how more professors responded to this ranking, especially because, at least in my experience, professors work their butts off to make sure the class works for everybody. Like always, never judge someone based on shoes you have never filled.
College rankings suck. They suck for the university, they suck for the students, and they just make you feel miserable. Don’t put too much stock into what the Princeton Review has to say. Heck, if it were the Stony Brook Review, no one would care, so don’t care just because it says “Princeton” on the front.