Every year, US News and World Report releases its famous College Rankings lists, including “National Universities Rankings,” “Liberal Arts Colleges Rankings,” and about 50 more. Parents, high school guidance counselors, students, and even the colleges themselves eagerly await the rankings so they can brag and attempt to impress others with the numbers and hierarchies.
We’re here to tell you that these lists from US News are useless.
Unproductive. Meaningless. Even harmful.
The hype seems to be unavoidable, but the wide audience US News targets is largely unaware of these rankings' major flaws. Not only are the metrics used to calculate the rankings arbitrary, but the competitive, rankings-focused culture they foster at colleges across the country is detrimental to students and institutions alike.
We want students, parents, and everyone else affected by the rankings turmoil to ignore these pointless lists in order to improve their college application experience. It is time to expose the shortcomings. Here, we outline the main problems with the US News College Rankings, along with examples and consequences of the nonsense we’re talking about.
The Major Problems with the US News College Rankings Are:
- They are inconsistent
- They are missing critical pieces of information
- They use a bogus “Reputation” measure
- They encourage schools to cheat and game the system
- They lead to overall negative outcomes
Let’s look at each of these areas in detail and explore the 14 reasons why these rankings are meaningless.
Inconsistency in the US News College Rankings
The metrics used by US News College Rankings are changing constantly, so it’s impossible to make meaningful comparisons between schools from year to year. Is this a tactic to get rankings to shift slightly to keep people interested, or is it really necessary to change the ranking system from year to year?
Reason #1 – Unstable rankings and irregular shifts continue to be a problem for US News.
People in higher ed have been complaining about these unreliable measures for over 20 years. Here is a personal letter from the president of Stanford University in 1996, directly criticizing the USNWR college rankings for their inconsistencies.
Reason #2 – Inconsistencies cause problems for students.
Imagine you are a high school student looking for colleges to apply to. You have a few in mind and want to see what kind of trends are taking place at those schools. US News trends tell you that one of your schools has dropped 10 spots in the past couple of years, but due to changing metrics, you have no idea why, or what this implies about the college. This Washington Post article highlights specific ways that US News changes the ranking methodology each year, and why this is a problem. You can even look at their own explanation for changing the metrics, yet the differences are still not solidly justified.
Reason #3 – Flawed metrics have persisted for over a decade.
This study, all the way back in 2002, offered constructive fixes for most of the flaws in the US News rankings, with suggestions for improved methodology and examples of better metrics to adopt. Yet the same flawed metrics have persisted at US News for nearly 15 years, giving the impression of significant year-to-year changes in quality where there are none. The criticisms of those metrics are the same today as they were in 2002, which means any changes to the metrics aren’t aimed at improving measurement, or, if they are, the people working on them are not succeeding. In short, no. They have not fixed the problem.
The College Rankings Have Missing Pieces
There is no measure of “quality of education” in the US News rankings, nothing to show what students actually learn, no measure of outcomes after graduation. No measure of student debt. And, perhaps, the biggest missing piece? No input from students.
Reason #4 – Moving up the rankings has nothing to do with improving the student’s experience.
This “success story” about Northeastern University climbing the US News rankings actually points out flaws in the system. Tangible actions and investments that would lead to more attention per student, a more diverse campus and a different mix of residential and commuter students, better facilities, more resources, and new amenities didn’t actually budge the ranking. What did, the article shows, was networking between institutional presidents. The implication: the rankings reflect factors that don’t impact the student experience, and they fail to move when the actual student experience changes.
Reason #5 – Failure to adapt.
This article from 2014 points out a specific flaw in the U.S. News rankings. The rankings editors ignored suggestions from members of Congress to include sexual assault data in its ranking factors. “Campus safety is not among the factors U.S. News believes is directly tied to academic quality, and we believe that it should not be part of our main ranking methodology, even if it could be measured,” said US News’s Data Director. We’d think students would want to know about safety on campus, as these colleges serve not only as “academic” institutions, but as homes for students.
Reason #6 – Failure to consider student debt load.
Student loans are an inevitable part of the college experience for many students, and therefore so is student debt. The US News college rankings, however, leave this measure out completely. They publish a “Short List” showing which schools will leave you with the most debt, but this is ultimately unhelpful when it isn’t factored into the overall rankings.
Reason #7 – Doesn’t take into account the student experience.
Who cares about the students’ opinions? The answer, very clearly, is not US News. The monster list of schools leaves out what we think is an important measure: how students feel about their school. Though difficult to obtain, current students’ opinions are vital for prospective students and parents, and one would think a powerhouse like US News would be able to gather this information. By contrast, the Voice of the Student Survey, which allows current students to give honest opinions, produced a ranking of liberal arts colleges covering everything from alcohol’s influence on the social scene to student satisfaction.
The Bogus “Reputation” Measure
US News asks for “peer assessments” from various college presidents, high school counselors, admissions directors, etc. They ask about “intangibles” such as “faculty dedication to teaching.” This can turn the rankings into a popularity contest, fueled by existing prejudices, and allowing prestigious reputations to work in a positive feedback loop to reinforce themselves. This reputation measure comprises more than 20 percent of the overall ranking.
Reason #8 – The popularity contest.
Who is qualified to objectively measure reputation? The simple answer is no one. In this short statement, Richard F. Wilson, President of Illinois Wesleyan University, shares the Annapolis Group’s 2007 decision to no longer respond to the reputation portion of the US News surveys. Wilson characterizes the group of 123 liberal arts colleges and universities as condemning the measure, saying, “There are very few presidents, provosts or admissions directors who feel they know enough about other institutions to rank them. Lacking an intimate knowledge of the impact that an institution has on its students, we resort to proxies like size of endowment or selectivity of the entering class to infer quality.”
Reason #9 – Skewed rankings by insiders.
When forced to rank, higher ed professionals have little choice but to skew the results. Presidents and admissions directors compensate for their unfamiliarity with other institutions by looking at which ones spend the most money, producing lists of which universities are most financially exclusive rather than which give students the highest-quality education. This article details specific issues with the reputation factor, as well as some better ranking alternatives.
Schools Learn How to Cheat the US News Rankings
Some schools deliberately spend money on the things US News deems important, simply to rise in the rankings, and it works. This allows wealthier schools to climb the list more easily than others. Not only is this unproductive, it has led to outright cheating exactly where it shouldn’t take place: in the world of higher education.
Reason #10 – When colleges refuse to participate, US News penalizes them.
When schools don’t report their numbers, USNWR simply plugs in its own estimates and assigns a lower rank… despite nothing actually changing on campus. Colin Diver, who has served as president of Reed College and dean of the University of Pennsylvania Law School, details his experience with the rankings, as a participant and a bystander.
Reason #11 – The US News rankings cause institutions to cheat.
Cheating isn’t just happening among students. While US News claims misreporting is rare, this article lists schools that have admitted to cheating, in a variety of ways, to increase their ranking on the list. Cheating is like an iceberg: what you see is a small proportion of what’s actually occurring. If even the admitted cheaters are numerous, a great deal of hidden dishonesty is likely shaping the rankings into uselessness, making them a reflection of who’s cheating best, not who’s teaching best.
Reason #12 – Schools know how to work the system… and they do.
In his article, Robert Woodbury describes 10 ways to climb the rankings (proving how skewed they are in the process) and concludes, “The ranking of colleges and universities by neat formulae and dubious statistical measures is distorting, illusory and, ultimately, harmful to democratic values we all share.” This witty and brutally honest article speaks for itself and is worth the read.
US News College Rankings Result in Negative Outcomes
The extra spending (caused by thirst for higher rankings) is increasing college costs all around the country. It also induces anxiety in high school students (and parents) who feel they must be accepted into a top-ranked college to earn the respect of their peers.
Reason #13 – Rankings lead to higher tuition.
In this piece about George Washington University, the author connects the US News rankings’ rewards for financial resources spent per student, alumni donations, and other monetary metrics with a preferential shift toward wealthy applicants over middle-class or high-need ones, and with the pricing-out of less wealthy students. Aptly pointing out that the kinds of amenities US News privileges (an Olympic-sized swimming pool, co-ed sauna, juice bar, golf simulator, and climbing wall at the University of Pennsylvania) have to be paid for somehow, he connects recent tuition spikes to that capital spending. The result: students who can’t afford higher tuition are crowded out, or saddled with ever-growing student loan debt that shadows their entire working lives.
Reason #14 – When students choose colleges based on US News rankings, they’re in trouble.
Malcolm Gladwell, a respected author and journalist who has often written about higher education, details here some overall problems with the rankings. Ultimately, students are people, and the very act of ranking universities as “the best” is useless if you don’t discuss for whom, or at what, they are best. Rankings also drive the “elite-ness” of schools, but, as Gladwell says, “Elite institutions screw us up.” The “best” school for an upper-class white male student may be a tough experience for minority students from lower income brackets. Purportedly objective rankings can push students toward “best” colleges that are not the best learning environments for them, leading them to change their major or, even worse, drop out of school.