Among the many methods people use to judge colleges and universities, few are as widely used as simple rankings. These lists, usually ordered by numerical scores or letter grades, are inherently appealing because they're easy to understand and they tell us exactly where the authors think a given school stands. But they can also be confusing, because magazine, book and online publishers use vastly different criteria when determining academic status.
So which ranking is correct, and how are these lists calculated? What are some common criticisms of college rankings, and can you trust them? In this article, we'll answer these questions and more as we explore the ever-popular system of college rankings. We'll also take a look at some alternatives to college rankings that many schools are pursuing.
The most prominent set of college rankings in the United States is published by U.S. News & World Report. Every year since 1987, it has published its rankings in magazine form, and more recently in accompanying paperback guidebooks. The magazine now ranks graduate schools, too. Many schools use these rankings as part of their promotional materials, trumpeting a rise in standing, hanging celebratory banners or posting the good news on their Web sites.
Some companies simply offer rankings of what they consider the best schools, with a variety of criteria used to calculate an overall score. Others break down lists of top schools into categories like academics, social life, small colleges, big colleges, liberal arts schools, public schools, undergraduate experience and happiest students. Besides U.S. News & World Report, other popular rankings include Princeton Review guidebooks and those produced by the Center for University Rankings, which rates research universities.
Books like "College Prowler" offer an insider view from current students and recent graduates. These guidebooks present information about topics as varied as a school's party scene. Several Web sites, like studentsreview.com, provide college rankings completed by actual students, and these sites also claim to offer a behind-the-scenes view with information not found in traditional guidebooks.
In other countries, newspapers often provide rankings, such as Maclean's annual guide to Canadian colleges and universities. The European Union has also published reports on universities that included rankings. In addition, Newsweek published a list in August 2006 of the "Top 100 Global Universities." The list focuses on a school's international makeup, global impact, connections to other parts of the world and research accomplishments [source: Newsweek].
In the next section, we'll take a look at how publications calculate college rankings.
Calculating College Rankings
Most rankings are based on raw data, but the way those data are collected and weighted varies significantly between publishers. Sometimes publishers receive the data directly from schools, as is the case with U.S. News & World Report. Others rely on data drawn from university Web sites, research foundations or academic organizations. For example, Vanguard's college rankings, which focus on faculty quality, rely on data from the National Research Council.
When examining college rankings, it's important to look at what data the publication used and how it used the data. Many publications use other data sources or their own specialized surveys.
Data that are commonly used in rankings include:
- SAT and ACT scores of incoming students
- Students' high school GPAs
- Acceptance rate
- Alumni donations
- Student-to-faculty ratio
- Graduation rate
- Financial aid
- Student retention (how many students stay rather than transfer out)
- Average class size
- Quality of faculty, which may be measured by research grants, prizes awarded and frequency of publication, among other factors
- Results from surveys completed by students or administrators
Although it's clear that ranking methods differ between publications, some use more unusual criteria to determine college standings. For example, Washington Monthly, a political magazine, says that its rankings -- echoing John F. Kennedy -- "ask what colleges are doing for the country" rather than "what colleges can do for you." This list focuses on how schools contribute to "social mobility," or raising people up from poverty, as well as how they promote "an ethic of service to country" and pursue "scientific and humanistic research" [source: Washington Monthly]. The Washington Monthly rankings also focus on how taxpayer money, such as in federal research grants, is used, and whether they consider that money well spent.
Of course, the Internet now holds some influence over rankings, both in how they're calculated and how they're publicized. Some companies provide additional college and university information on subscription-only sites. Other organizations collect data from nontraditional sources like a school's number of Google hits and links to the university's Web site from the sites of other universities. This method of ranking is often called the G-Factor.
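As a rough illustration of the idea behind the G-Factor, a link-based score can be computed by counting how many other universities' sites link to each school. The link graph below is entirely made up for illustration; the real G-Factor relies on Google's actual link counts between university Web sites.

```python
# Hypothetical link graph: each school maps to the schools its site links to.
# These names and links are invented; the real G-Factor uses Google link data.
link_graph = {
    "school_a": ["school_b", "school_c"],
    "school_b": ["school_a"],
    "school_c": ["school_a", "school_b"],
}

def g_factor_scores(graph):
    """Count inbound links to each school from the other schools' sites."""
    scores = {school: 0 for school in graph}
    for source, targets in graph.items():
        for target in targets:
            if target in scores and target != source:
                scores[target] += 1
    return scores

# Rank schools by inbound-link count, highest first.
ranking = sorted(g_factor_scores(link_graph).items(),
                 key=lambda item: item[1], reverse=True)
```

The appeal of this approach is that the data come from the open Web rather than from the schools themselves, though link counts obviously measure visibility rather than educational quality.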
Any discussion of how rankings are compiled inevitably leads back to U.S. News & World Report. For reasons that will be discussed later in the article, its rankings attract a lot of controversy. Let's look at how they rank schools.
U.S. News assigns each school a numerical score and ranks schools accordingly, dividing them into separate categories:
- National universities
- Liberal arts colleges
- Master's universities
- Comprehensive colleges
- Business programs
- Engineering programs
The magazine provides further rankings for some categories based on region or on whether a school awards doctoral degrees.
In calculating each score, U.S. News relies on data supplied by the schools it ranks. Each piece of data is weighted differently in the overall score. The composition of a school's U.S. News score is as follows:
- 5 percent: alumni donations
- 5 percent: graduation rate (for liberal arts and national universities)
- 10 percent: financial aid
- 15 percent: faculty resources (which is a collection of factors like average class size and student-to-teacher ratio)
- 15 percent: acceptance rate
- 20 percent or 25 percent (depending on the school type): student retention
- 25 percent: peer assessment, in which the top three officials of each school rate the performance of other schools
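The weighted-sum arithmetic behind a score like this can be sketched in a few lines of Python. The metric values below are invented for illustration, and each metric is assumed to be already normalized to a 0-100 scale; the real U.S. News methodology involves normalization and adjustment steps the magazine doesn't fully publish. The weights use the 25 percent retention figure, so they total 100 percent.

```python
# Weights mirroring the breakdown above (25 percent retention variant).
weights = {
    "alumni_donations": 0.05,
    "graduation_rate": 0.05,
    "financial_aid": 0.10,
    "faculty_resources": 0.15,
    "acceptance_rate": 0.15,
    "student_retention": 0.25,
    "peer_assessment": 0.25,
}

def overall_score(metrics, weights):
    """Combine normalized 0-100 metric scores into one weighted total."""
    return sum(weights[name] * metrics[name] for name in weights)

# Invented example values for a hypothetical school.
example_school = {
    "alumni_donations": 40.0,
    "graduation_rate": 90.0,
    "financial_aid": 70.0,
    "faculty_resources": 80.0,
    "acceptance_rate": 60.0,
    "student_retention": 85.0,
    "peer_assessment": 75.0,
}
score = overall_score(example_school, weights)  # 74.5 for these inputs
```

Note how the weighting drives behavior: a school can raise its score more by improving retention or its peer reputation than by improving alumni giving, since those factors count five times as much.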
That last part, the peer assessment, is the trickiest bit -- and it's one of the big reasons that U.S. News & World Report is one of the chief targets of the campaign against college rankings. In the next section, we'll look at some common criticisms of college rankings.
Criticisms of College Rankings
In terms of sales and the attention they receive, college rankings are as popular as ever, but they've also never received as much criticism as they do now. College administrators are some of the most vocal critics of college rankings. They accuse magazines and newspapers of padding their sales by producing rankings. After all, Americans, and people in general, have a tremendous appetite for top-10 and best-of lists. Administrators and professors also claim that rankings are inherently unfair and oversimplified, using a simple letter grade or numerical score to represent a school's worth. Some administrators also complain that they are forced to do work for the ranking bodies by providing all the data.
The ranking practice motivates some colleges to spend lots of money on areas that will improve rankings but not necessarily the quality of education provided. Because of this -- and the frequent use of rankings in marketing materials -- U.S. News has accused some now-critical schools of hypocrisy. Many of these schools have responded by pledging not to use U.S. News' rankings as a promotional tool.
Another concern is the way in which data are used. Some criteria appear overemphasized and are not a clear representation of what makes a good or bad college experience. One frequently used example is the alumni-giving rate, which can be important to a school's ability to erect new buildings or hire prestigious professors but may not have much of an impact on the overall quality of education that the institution can offer.
No piece of data is as controversial or considered as subjective as U.S. News' peer assessment survey. Twenty-five percent of a school's overall score comes from this survey, in which a school's top three administrators rate their peer institutions from one to five, or "don't know." Critics point out that with several hundred schools in the U.S. News rankings, administrators are placed in a position to score schools that they may not be familiar with, thereby skewing the ranking system. And although there is the "don't know" option, schools may be rated on incomplete or irrelevant information, or simply on hearsay.
To some, the entire idea of rankings is flawed. Critics claim that rankings are an oversimplification of a school's worth -- they give the impression that only a select group of schools matter. U.S. News ranks colleges in tiers, and a tier three or four ranking can greatly impact how a school is perceived in the eyes of a potential student. Rankings also seem to place emphasis on private schools when, in fact, 75 percent of college graduates graduate from public schools [source: NPR].
College counselors express concern that rankings ignore the diversity of schools. Not every student is suited to the same schools, and every year, they point out, students who could get into a supposed top-10 school instead choose to attend college somewhere else. This could be because they want a small liberal arts college, lower tuition or the large community of a state school. Others may desire a school that's solely focused on undergraduates, opportunities to do research with graduate students, or study with certain professors. College counselors claim that rankings neither make these important distinctions nor show the range of extracurricular options or specialized academic programs available.
As the face of college rankings in the US, U.S. News & World Report bears the brunt of this criticism. The president of Earlham College said that U.S. News was "unresponsive to serious professional concerns," despite the magazine's claims that it's open to improving or changing its system [source: NPR]. And the president of Sarah Lawrence College wrote in the Washington Post that "U.S. News benefits from our appetite for shortcuts, sound bites and top 10 lists" [source: L.A. Times].
In the next section, we'll look at how colleges and universities are responding to the ranking system.
Backlash Against College Rankings
Many schools have threatened to drop out of the U.S. News & World Report rankings by declining to provide the magazine the data it requests. But schools hesitate to do so because the rankings still carry a lot of influence. School officials don't want to do a disservice to prospective students by not providing them the information they need. However, some colleges and universities decided that they were ready to take that step. On May 5, 2007, 12 schools sent out letters to 1,000 liberal arts institutions asking them not to fill out the U.S. News peer-assessment survey. Many have agreed, led in part by the Annapolis Group, a coalition of liberal arts colleges. These schools will no longer provide U.S. News the information it requests or fill out the peer assessment. Justifying the decision, Neil B. Weissman, provost and dean of Dickinson College in Pennsylvania, called the U.S. News rankings misleading and "lame science" [source: Washington Post].
In addition, Sarah Lawrence College no longer asks prospective students for SAT scores, one of the pieces of data used in many college rankings. The decision also reflects a questioning by many educators, students and parents about the value of standardized testing.
The protest against rankings has spread to other disciplines and other countries. Business-school administrators worry that business-school rankings published by media like BusinessWeek, U.S. News & World Report and The Wall Street Journal typecast schools or fail to illuminate their worth.
In 2004, the business schools of the University of Pennsylvania and Harvard withdrew from the BusinessWeek rankings. In Canada, dissent is rising against Maclean's popular rankings. After meeting with Maclean's to share concerns about the magazine's ranking methods, 25 of Canada's more than 90 universities stopped providing information to the magazine. Schools of all types joined the boycott, including the University of Toronto, one of the most well-regarded universities in the country. In order to make their data more accessible (and free), many of these schools then posted the data requested by Maclean's online.
Defenders of rankings say that they do serve an important purpose, whether introducing students to schools they may not know or giving them some idea of where a college stands. Even so, the critics of college rankings are determined in their mission.
On the next page, we'll look at what schools are doing -- besides ending cooperation with publications like U.S. News & World Report -- in order to enact change.
Efforts by Colleges to Spread Information
Though some colleges have turned away from rankings and no longer use them in their marketing materials, these schools say that it's important not to deprive prospective students of the data they need. As a result, colleges are trying to present more comprehensive information that better reflects their institutions. In September 2007, a consortium of schools expects to launch a Web site on which prospective students will find information about colleges and universities. It will offer useful information like the real cost of a year at a school; rates of acceptance, matriculation and graduation; and demographic information. The site will also feature descriptive graphics and comparison tools.
The National Association of State Universities and Land-Grant Colleges, which represents more than 600 public colleges and universities, is considering its own Web site. It would include even more information like survey results showing how much students feel they are learning. Another organization, the Association of American Universities, a group of 62 schools, claims to be considering a site as well.
Schools dropping out of the U.S. News ranking will likely generate some confusion and controversy at first but should also bring about some welcome changes. Numerous interscholastic associations, education nonprofit organizations and other groups are meeting, discussing initiatives, lobbying for changes to ranking systems, and brainstorming more comprehensive materials and more accurate rankings. The process should facilitate better communication between colleges, administrators, publishers of rankings and past, present and future students.
If this ranking- and information-sharing revolution comes about, the Internet will play an important role. Nearly every proposal calls for sharing information online and making it descriptive, easy to access and useful. Then, if publications still want data for rankings, they can just go to a school's Web site, the same place where many prospective students will get their information.
Already, some of these changes are taking place. Sites like collegeboard.com, which also handles SAT registration, allow students to learn about and conduct side-by-side comparisons of the colleges that interest them.
So with all of this controversy and potential change, how should rankings be used? Most college counselors say that they have their uses, but they're only one of many tools and are best considered a starting point. Pay attention to the facts and data they provide, but know that the rankings may not be entirely accurate. In the past, schools have, sometimes intentionally, submitted incorrect information, throwing off their scores. And some publishers may change how they determine the rankings of schools that no longer provide data.
College counselors recommend visiting schools whenever possible. Talk to current students and alumni. E-mail a professor if you can and find out what truly distinguishes this school from another. After all, Americans are lucky: We have hundreds of diverse schools from which to choose, and college counselors generally agree that there is more than one "right" school for a person.
For more information about college rankings and related topics, please check out the links on the next page.
Related HowStuffWorks Articles
More Great Links
- "College Rankings." MSN Encarta.
- "Colleges Pull Out of 'U.S. News' Rankings." NPR. June 22, 2007. http://www.npr.org/templates/story/story.php?storyId=11277622
- "Colleges Resist Pull of Published Rankings." NPR. June 26, 2007. http://www.npr.org/templates/story/story.php?storyId=11422186
- "G-Factor." Global University Rankings. University Metrics. April 29, 2006. http://www.universitymetrics.com/g-factor
- "Maclean's." http://www.macleans.ca/education/index.jsp
- "The Top 100 Global Universities." Newsweek. Aug. 13, 2006. http://www.msnbc.msn.com/id/14321230/site/newsweek/
- Damast, Alison. "More Opposition to U.S. News Rankings." BusinessWeek. May 20, 2007. http://www.businessweek.com/bschools/content/may2007/bs20070520_011612.htm?chan=top+news_top+news+index_b-schools
- Editors. "The Washington Monthly College Guide." Washington Monthly. Sept. 2005. http://www.washingtonmonthly.com/features/2005/0509.collegeguide.html
- Editors. "The Washington Monthly's Annual College Guide." Washington Monthly. Sept. 2006. http://www.washingtonmonthly.com/features/2006/0609.collegeguide.html
- Finder, Alan. "Colleges Join Forces on a Web Presence to Let Prospective Students Research and Compare." The New York Times. July 4, 2007. http://www.nytimes.com/2007/07/04/education/04rankings.html?ex=1184904000&en=b31923f8afeea0be&ei=5070
- Johnson, James A. "Vanguard College Rankings." http://www.postmaterial.org/collegerankings/index.html
- Lavelle, Louis. "A Rank Offense to B-Schools?" BusinessWeek. Aug. 5, 2005. http://www.businessweek.com/bschools/content/aug2005/bs2005085_1796.htm
- Lee, Stephanie. "College rankings losing their relevance." San Gabriel Valley Tribune. http://www.sgvtribune.com/ci_6389616
- Remke, Andrea. "Looking beyond the rankings." Cincinnati Enquirer. July 17, 2007. http://news.enquirer.com/apps/pbcs.dll/article?AID=/20070717/NEWS0102/707170371/1058/NEWS01
- Samarasekera, Indira. "Rising Up Against Rankings." Inside Higher Education. April 2, 2007. http://www.insidehighered.com/views/2007/04/02/samarasekera
- Selingo, Jeffrey. "What the Rankings Do For 'U.S. News.'" The Chronicle of Higher Education. May 25, 2007. http://chronicle.com/free/v53/i38/38a01501.htm
- Skube, Michael. "The No. 1 reason to rank colleges." The Los Angeles Times. July 8, 2007. http://www.latimes.com/news/opinion/commentary/la-op-skube8jul08,0,6137792.story?coll=la-news-comment-opinions
- Strauss, Valerie. "'U.S. News' rankings? No thanks, they say." The Washington Post. Concord Monitor. May 21, 2007. http://www.concordmonitor.com/apps/pbcs.dll/article?AID=/20070521/REPOSITORY/705210368/1013/NEWS03
- Tung, Michael. "Ranking Colleges Using Google and OSS." Unnaturally Long Attention Span. Jan. 30, 2006. http://vcmike.blogspot.com/2006/01/ranking-colleges-using-google-and-oss.html
- Weissman, Neil B. "College Rankings Are 'Lame Science.'" The Washington Post. June 30, 2007. http://www.washingtonpost.com/wp-dyn/content/article/2007/06/29/AR2007062902135.html