November 4, 2004

Economists Compile Alternative Rankings


For years, parents, students and University officials have fretted over the latest U.S. News & World Report college rankings. Recently, however, a quartet of economists released a paper challenging the validity of those and other well-publicized college rankings.

The economists, Prof. Christopher Avery and Prof. Caroline Hoxby of Harvard University, Prof. Mark Glickman of the Boston University School of Public Health and Andrew Metrick of the Wharton School of the University of Pennsylvania, submitted their paper to the National Bureau of Economic Research during the week of Oct. 12. The researchers also offered an alternative ranking based on high school seniors’ college choices. Harvard and Yale took the top spots.

But other schools that have surged in recent U.S. News rankings fell, some harder than others. The study’s authors argued that some schools were manipulating the formulas that the well-publicized rankings use.

“We are not trying to measure the same thing as U.S. News, so the rankings are not directly comparable, despite implications otherwise in press articles,” Metrick said. “We are trying to measure ‘desirability’ using a market-based measure. … U.S. News and other popular rankings are using their own ‘secret-sauce’ formula to measure overall quality.”

The authors do not blame the colleges for doing what is necessary to achieve high rankings; rather, they criticize most ranking systems for placing undue “pressure” on colleges that then feel the need to manipulate numbers.

The paper begins by explaining that the authors seek to outline the negative effects of early admission, which they say have placed the admissions process “in a bad equilibrium.”

The authors state that universities are able to artificially inflate their rankings by manipulating two key formulas that most ranking systems use. The first is the matriculation rate: the number of students who matriculated divided by the number of students admitted. The authors argue that this figure is not a fair indicator of the actual quality of the school. Early decision is the easiest and most effective way to manipulate the matriculation rate, because admitted early-decision applicants matriculate at a rate of nearly 100 percent, which the authors argue explains the rise in early admission acceptances.
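As a sketch of the arithmetic the authors describe (the admissions figures below are invented for illustration, not taken from the paper), the matriculation-rate formula shows how early-decision admits, who enroll at essentially 100 percent, pull the overall rate up:

```python
# Hypothetical admissions figures (invented for illustration).
regular_admitted = 1500
regular_matriculated = 600   # many regular admits enroll elsewhere

early_admitted = 500
early_matriculated = 500     # binding early decision: ~100% yield

# Matriculation rate = students matriculated / students admitted
rate_regular_only = regular_matriculated / regular_admitted
rate_with_early = (regular_matriculated + early_matriculated) / (
    regular_admitted + early_admitted)

print(f"Matriculation rate, regular pool only: {rate_regular_only:.0%}")  # 40%
print(f"Matriculation rate, with early pool:   {rate_with_early:.0%}")    # 55%
```

Shifting seats from the regular pool to the early pool raises the reported rate without changing anything about the quality of the entering class, which is the distortion the authors object to.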

“The early-decision craze is in part driven by colleges’ desire to raise their matriculation rates. Such manipulation helps nobody — least of all the students,” Metrick said.

The paper states that by emphasizing early admission, these universities are actually creating a pool of less meritorious students in order to rise in the rankings. Colleges then have less and less room for high-quality candidates and therefore become less selective in fact while appearing more selective on paper.

The authors singled out Princeton for manipulating figures via early decision. In their rankings, Princeton places sixth instead of first, as it does in the U.S. News rankings. The second formula they argue is manipulated is the acceptance rate: the number of students admitted divided by the number who applied. Once again, the authors argue that neither figure carries any significant information about the quality of the school.

The numbers represent neither the quality of the admission pool nor the quality of those accepted, the study states. At schools that have begun to manipulate their early admission acceptance rate, the students admitted are likely to be of lesser quality.

Another way schools can boost their number of applications is by encouraging underqualified students to apply by sending out letters and invitations.

The authors’ solution to these malleable figures and formulas was to look at the college choices of 3,200 “high-achieving” high school seniors in the class of 2000. The economists asked the students to choose between colleges in head-to-head matchups. After assembling the data from the numerous matchups, they ranked the top 100 colleges as if the schools were chess players in a tournament.
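The paper’s actual statistical model is more elaborate, but the chess-tournament idea can be sketched with a simple Elo-style rating update, the same family of methods used to rank chess players. The college names and matchup outcomes below are invented for illustration:

```python
# Minimal Elo-style rating of head-to-head college choices (a sketch,
# not the authors' actual model). Each time a student picks one college
# over another, the chosen college "wins" the matchup.

def expected_score(r_a, r_b):
    """Probability that A is chosen over B, given current ratings."""
    return 1 / (1 + 10 ** ((r_b - r_a) / 400))

def update(ratings, winner, loser, k=32):
    """A student chose `winner` over `loser`; adjust both ratings."""
    exp_w = expected_score(ratings[winner], ratings[loser])
    ratings[winner] += k * (1 - exp_w)
    ratings[loser] -= k * (1 - exp_w)

# Hypothetical matchups: (chosen college, passed-over college).
matchups = [("College A", "College B"), ("College A", "College C"),
            ("College B", "College C"), ("College A", "College B")]

ratings = {c: 1500.0 for c in ("College A", "College B", "College C")}
for winner, loser in matchups:
    update(ratings, winner, loser)

# Final standings, highest rating first.
for college, r in sorted(ratings.items(), key=lambda x: -x[1]):
    print(f"{college}: {r:.0f}")
```

The appeal of this market-based approach, as Metrick describes it, is that a college can only climb by actually winning more head-to-head choices, not by massaging its applicant or admit counts.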

“In our measure, the only way to increase your ranking is to make yourself more ‘desirable,’ a laudable goal we think,” Metrick said.

Cornell ranked 15th, a drop of one place from the U.S. News rankings, but three liberal arts colleges that were excluded from the U.S. News ranking, Amherst, Wellesley and Swarthmore, ranked above Cornell.

Other surprises included the drop of the University of Pennsylvania from fourth in the U.S. News rankings to 12th in the economists’ rankings. Perhaps the biggest stumble was Washington University in St. Louis, which fell from 11th in the U.S. News rankings to 62nd in the economists’ ranking system.

President Jeffrey S. Lehman ’77 said the absolute ranking of universities is a difficult and dangerous enterprise. Comparing colleges, he said, was like comparing different children. “You wouldn’t advise your best friend to change who they are; you would advise them to be comfortable in their own skin,” he said.

Students are still at a loss for how to interpret college ranking systems.

“I don’t see the harm in using them, but I understand that certain schools definitely receive advantages for being higher up in the rankings,” Jesse Siegel ’05 said.

The authors have no plans to follow up the study.

“The data gathering for this project was a massive undertaking, and we have no plans to repeat it,” Metrick said.

Archived article by Michael Margolis
Sun Senior Writer