College rankings have taken a big PR hit in the last
year. Some of it is deserved. However, as has become the norm in American
public discourse, people eager to jump on the “bash the latest unpopular thing”
bandwagon have demonstrated a remarkable lack of critical thinking skills. This is especially sad when the topic is
higher education – an institution that should be dedicated to encouraging
critical thinking.
An article recently published in Boston Magazine by Max Kutner purports to explain how
Northeastern University, located in Boston, managed to rise in the
U.S. News and World Report rankings by
“gaming” the system. Before going on, it
might be useful to note that “gaming the system” typically means
manipulating the rules in such a way as to gain an advantage. The entity
“gaming the system” is generally understood not to be breaking
the rules but to be following their letter rather than their intent. Breaking the rules would invite
disciplinary action of some kind;
“gaming the system” generally does not.
Even so, “gaming the system” is generally understood to be
an unscrupulous act designed to obtain an advantage unfairly.
The article opens with a description of the state of
Northeastern University in the early 1990’s.
Their situation was dire: the
school was under-enrolled and under-funded.
There was a real danger that if they could not turn things around, they
might have to close their doors. Enter
one Dr. Richard Freeland, whom the author charges with making “gaming the
U.S. News … part of the university’s DNA.”
Here is a list of the things the university did under
Freeland’s administration that resulted in a rise in the school’s ranking from
162 to 98:
- Reduced class sizes
- Began accepting the Common Application, which made it easier for students to apply
- Constructed new dormitories, because studies showed that students who lived on campus were more likely to graduate
- Did some PR to boost the school’s image
- Changed how it reported the number of students each year, counting students on campus instead of including co-op students
Wow. How nefarious of
them. The author emphasizes that Dr.
Freeland kept his eyes on the rankings throughout the improvement process. What he fails to acknowledge is that, while
the college rankings are far from perfect, they do include some measures that
legitimately affect the quality of education.
Smaller class sizes are not only linked to better learning outcomes;
they are also a measure that prospective students and their parents care
about. Surely no one thinks that
making the application process more convenient and accessible is a bad
thing. And if graduation rates needed to
be raised (and it seems they did), then building dormitories sounds more like
“data-based decision-making” than “gaming the system.” Oddly, the author carefully avoids telling us
how much the graduation rate rose, but rise it must have – the subsequent
increase in ranking could not have been obtained otherwise. On what planet is that a bad thing?
The one item that sounds like it might be shady
is the last bullet point.
Northeastern changed their reporting methods. Dr. Freeland realized that the metric being
used by U.S. News hurt schools with
strong co-op programs. Northeastern
counted significantly more students each year than were actually on campus,
which made it look like the school was spending a lot less per student. He took his case to the U.S. News statisticians, who declined to change their metric but
explained what they were doing with the numbers and why. As a result, Northeastern stopped including
co-op students who weren’t on campus in their numbers. Is that dishonest? I don’t think so. I think it makes for a more accurate picture
of their situation.
The article includes a list of schools caught flat-out lying
about the numbers they report to U.S. News. While the author acknowledges that this does
not fall under the category of “gaming the system,” he does offer this as
evidence that the rankings are irretrievably broken – an accusation the
magazine denies. Including them tends to
– intentionally or otherwise – give the impression that the measures
Northeastern has taken are as dishonest as these examples.
Not until the last few paragraphs do we find any evidence of
actual “gaming,” and the measures in question were introduced after Freeland retired in 2006. Northeastern stopped requiring foreign
students to submit SAT scores. Foreign
students, for whom English is often a second language, can have lower SAT
scores. Waiving the requirement for
foreign students may be a bit shady, although it has recently come to my
attention that taking the test represents a true hardship for many foreign
students, requiring an overnight trip to a distant city. Some might consider dropping the requirement
an effort to be more understanding. Then
in 2007, the school began a program whereby students could begin at NU in the
spring – thus excluding their data from the reporting. The author states that these excluded test
scores and GPAs are “lower” but offers no evidence for this claim. They certainly could be, and if they aren’t,
one wonders why NU would begin the program.
As a final jab, the author points out that the measures
taken to improve educational quality at NU – and quality has undeniably been
improved by increasing retention and graduation rates, if nothing else – have
cost money, making the school more expensive.
This is undoubtedly true, but it’s an odd accusation to make. Typically the complaint is that we don’t
spend enough on education, or that when we do spend more, we don’t see an
improvement in results. Here is an
example of a school that spent more – and it paid off. The alternative was to close their
doors. Does anyone wish to argue that
they should have chosen that as the more honorable course of action? The price increase does, indeed, make
Northeastern one of the more expensive options out there, but price is one of
the factors every family should weigh in making decisions about where to send a
student to school.
In over 30 paragraphs of writing, the author mentions only two
possibly unscrupulous methods Northeastern may have used to improve their
ranking. He cites several instances
in which the school used metrics in the ranking to guide decisions that led to
improved outcomes. At one point the
author quotes Lloyd Thacker as saying, “Have rankings contributed to anything
beneficial in education? There’s no
evidence. There’s lots of evidence to
the contrary.” As a refutation of that
statement, I would point to Northeastern University.