In the world of admissions officers, August is an incredibly stressful time. The source of this anxiety isn’t the students moving in, the recruitment travel restarting, or the summer ending. It’s THE RANKINGS.
Most of my colleagues describe their feelings toward THE RANKINGS as ambivalent. Ambivalent means they hate them; they really hate them. Even when their college or university is blessed by THE RANKINGS, they still hate them, even as they brag about them.
Shameless (clearly hypocritical!) plug: Mason again ranked in the top "schools to watch" in U.S. News and World Report. Last year we were number one in this category, also called "Up-and-Coming National Universities"; this year we are number two. My boss remarked that this makes us the "Up and Second-Coming" institution, but as a public university we have to steer clear of such religious overtones.
This passionate distaste for THE RANKINGS, in between bragging opportunities, always seems bizarre to me since clearly students, parents, and most of society find them at least moderately useful, as judged by massive internet traffic and magazine sales. On the other hand, I think it’s really important to put THE RANKINGS into some reasonable context.
With all due respect to Bob Morse, my longtime acquaintance who runs the U.S. News rankings, the rankings are, for the most part, hooey. That's a technical term meaning "a lot of statistical data that doesn't actually mean a thing if you're trying to determine the quality of a school."
U.S. News, of course, starts with a massive survey of experts on college and university quality with no vested interest in manipulating the survey results, and by that, of course, I mean exactly the opposite. In reality, university presidents, provosts, and admissions deans (that's who fills out the survey) don't have all that much time to brush up on everything going on at the several hundred other colleges and universities in the survey, and, as has been reported in recent articles, they have pretty strong motivations to adjust their responses to favor their own institutions. Fortunately, I genuinely feel that Mason is the best university – ever – so I have no ethical risk in how I respond…which should give you some idea of how these things work.
These surveys are the most influential part of the U.S. News ranking, but those surveys are balanced by statistical data that is completely accurate, impossible to manipulate, and corresponds exactly to the quality of each institution, and again by that I mean the exact opposite. THE RANKINGS, for instance, love the SAT and ACT. Even while you try to convince us that you are more than a test score, THE RANKINGS assume that an incoming class is just that – an average test score. The only thing more important than the incoming class is how much money each school spends and earns.
I can hear those logic gears turning in your head as you wonder, “What the heck does how much money a school earns and spends have to do with whether it’s the right school for me?” Good question. With money as a huge factor, of course, it guarantees that the rankings won’t change all that much from year to year, which is great if you’re, say, selling magazines to people who expect to see the same names at the top of the list each year.
Recognizing these tiny, wee flaws in their methodology, U.S. News also offers a bunch of other rankings, including a survey of guidance counselors and some specialty rankings (Did I mention, Mason again ranked in the top Schools To Watch?) based on the same entirely fair and unbiased survey of presidents, provosts and deans they use for the overall ranking. Princeton Review and Forbes (which ranked Mason the top public university in the D.C. area!) use student surveys. Of course, students have no bias and are a great source of statistically sound data, and by that I continue to mean the exact opposite.
Very slowly, some better tools are being developed. The National Survey of Student Engagement does some great work trying to look at outcomes: what actually happens to students while enrolled at colleges and universities. U.S. News has been publishing some of their results as well. There are also some interesting specialty rankings being developed for green schools, religious institutions, and gay-friendly campuses, just to name a few, that are likely to be a lot more help as you try to navigate your college search.
The bottom line is that the rankings can be an interesting shortcut to developing your interest list, but don’t get sucked into thinking there’s a lot of substance behind them. My suggestion: build your own ranking based on the things you think are most important. Send me your suggestions for what should go on that list and I’ll post them in a future column. Who knows – maybe we can control THE RANKINGS of the future! Be seeing you.