Rankled by Rankings
College Presidents Describe How They View U.S. News & World Report Rankings at Hechinger Higher Education Seminar in New York City
The Hechinger Institute on Education and the Media invited an array of current and past college presidents to discuss the way U.S. News & World Report ranks colleges and publishes the results. The presidents shared their views in New York City in November 2007 at the institute's seminar for higher education reporters, sponsored by the Lumina Foundation for Education.
Panelists included Peter Likins, former president of both the University of Arizona and Lehigh University; William Durden, president of Dickinson College; Judith Shapiro, president of Barnard College; Patricia McGuire, president of Trinity (Washington) University; and Brian Kelly, editor of U.S. News & World Report. Education consultant Steven Roy Goodman was also on the panel.
Scott Jaschik, one of three founders of the Web site Inside Higher Ed - http://www.insidehighered.com - moderated the session. Here are some excerpts from the discussion:
Scott Jaschik: Identify where you stand on the issue of rankings and whether your position has evolved.
Judith Shapiro (Barnard)
I've been at Barnard since 1994, and before that was provost at Bryn Mawr for eight years. I guess I would say that my attitude has evolved from indifference to an unwillingness to collaborate in any way, speaking just of U.S. News & World Report first. What's interesting about Barnard is, I think, that we have the luxury of really not having to care about [the rankings] at all. ... The issue about choosing to not collaborate any more as opposed to simply going along ... we provide these data. Barnard has long had our institutional data up and available on our Web site. To be sure, that does not serve comparative purposes for families.
William Durden (Dickinson)
I think U.S. News should not be the sole focus of this discussion of rankings. There is an industry, and it extends well beyond U.S. News. And I know that many of the competitors of U.S. News are delighted that U.S. News is getting the majority of the heat, but I believe they deserve some focus also. I think the story has really moved on from the rankings. ... The really interesting thing, at least for me, is that we have moved on. There's been a moment now to bring alternatives onto the scene. The American free market system is beginning to work. And we are starting to get competition, and I think healthy competition, and that is good. In my mind, U.S. News is not going to go away. ... There are a couple of things that give me concern and account for my position. There is a considerable fear factor in the college process now. It is not about what is the best match, which it should be. It is about, "If I don't get into a certain number in the U.S. News survey, my life will be ruined and I will have no money and no prestige." Now, U.S. News might say, "That's not our problem; that's the public's problem." ... The college process is not a contest; it's a game to be won, and that's bad. The second element is accountability. This is very interesting. Suddenly, out of nowhere, rankings are now accountability. Where did this come from? And we who do not advertise their product are [supposedly] not accountable; we are "afraid" to face accountability. That is unacceptable, because colleges and universities have been giving out to the feds [and] other sources the exact data U.S. News, for the most part, uses in its formula, in its ranking.
... Maybe if U.S. News drops the peer assessment and drops the numerical rankings, we'll be working together and that'd be great.
Peter Likins (Lehigh, University of Arizona)
When you rank institutions, it's all about what your criteria are and how you weight them. ... I [had in the past] suggested that [Mel Elfin, then editor of U.S. News & World Report] use alumni giving as a surrogate for satisfaction of past students and graduation rates as representative of the success of students who are currently there. I went through this list of proposed changes, and to my shock and surprise, many of them showed up the next year. Lehigh University went from 52nd to 32nd in one year because the criteria changed. And that is the primary message I would leave with you.
... I was very, very pleased, of course, with the progress we made - the institution got a whole lot better in just one year - and it was visible because he spelled out what the process of determining the rankings was and let us all know which institutions were regarded as national and which as regional. I was very pleased until I moved to the University of Arizona after 15 years at Lehigh and realized that all of the things I was advocating favored Lehigh but disadvantaged Arizona. Arizona has a higher reputation among peers than Lehigh, but Lehigh has all this other stuff: graduation rates, small class sizes - it's a private institution - alumni giving. I couldn't very well go back to U.S. News and say, "Please reverse field." But that juxtaposition of my public and private university experience really forces [home] the importance of the weightings of metrics. And I believe that Americans love lists, love rankings. Whether it's the top football teams or the top cars in Consumer Reports, we love rankings. And as far as I'm concerned, it's not inappropriate for a publication to respond to the aspirations and desires of the readership. What I do ask of the editors, however, is that they be clear about their criteria. ... If people will look at the subtext and draw conclusions about where they want to go to school or where they want to be a professor on the basis of the subtext, then I think it's a very healthy communication.
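Likins' point - that reweighting the same criteria can reorder the same institutions - can be seen in a small hypothetical calculation. The school names, metric scores and weights below are invented purely for illustration and are not U.S. News' actual data or formula.

```python
# Hypothetical illustration: the same two fictional schools, scored on the
# same invented metrics, swap rank order when the weights change.

# Invented, normalized metric scores (0 to 1) for two fictional schools.
schools = {
    "Private U": {"peer_reputation": 0.60, "graduation_rate": 0.90, "alumni_giving": 0.85},
    "Public U":  {"peer_reputation": 0.85, "graduation_rate": 0.60, "alumni_giving": 0.40},
}

def rank(weights):
    """Order the schools by their weighted composite score under the given weights."""
    scores = {
        name: sum(weights[metric] * value for metric, value in metrics.items())
        for name, metrics in schools.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Weighting scheme A: peer reputation dominates -> "Public U" comes out on top.
print(rank({"peer_reputation": 0.7, "graduation_rate": 0.2, "alumni_giving": 0.1}))

# Weighting scheme B: graduation rate and alumni giving dominate -> "Private U" wins.
print(rank({"peer_reputation": 0.2, "graduation_rate": 0.5, "alumni_giving": 0.3}))
```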
Pat McGuire (Trinity Washington University)
I have evolved on the whole issue of whether or not you can compare institutions with any real integrity in ordinal rankings. ... We have to do different things than other institutions do, and our outcomes look very different. There is not a single student attending Trinity today who is attending any less than a first-rate institution. I don't even know what our ranking is in Mr. Kelly's magazine because I have not participated in that for a while, as he knows. Some institutions care about that and that's great. I'm not here to pass judgment on them. But what I know is the students who choose my institution choose it because it is a gateway to success. ... My general beef with the idea of ordinal comparison of institutions is several-fold: first, the whole concept is premised on what I believe is an increasingly narrow and small middle-class to upper-middle-class notion of traditional higher education, that mythical ideal of the four-year college as the mainstream of higher education today. Guess what - it's not. Seventy-two percent of undergraduates in higher education today are nontraditional by some measure. ... The second significant issue is that so-called rankings of any kind are all about institutions. When you really think about it, they're not about the students; they're about institutions. In many ways, they encourage behaviors that are misleading for students. The best college for a history major or a physics major or a nurse is a very different kind of institution than what might be the best college for the football player, the soccer player, or the pharmacologist. What makes a student successful in college is not a ranking. It is the quality of the student experience matched to whether the college can, in fact, provide the educational program and the total campus environment.
Rankings are not accountability. Accreditation is on the hot seat these days, I know, but accreditation is the method by which we hold each other accountable in higher education. ... The reality is that the peer-driven system of regional accreditation in American higher education is in fact a laborious, rigorous and detailed process to hold each other accountable across a broad range of indicia. I'm an advocate for publishing our accreditation reports ... warts and all.
Scott Jaschik to Brian Kelly: How much does U.S. News make off of the rankings and how important are they to U.S. News' economic well-being?
Brian Kelly: Not enough. We are not making enough on the rankings. I wish we'd make some more. When I came to the magazine - when I took over the rankings about five years ago - there was a certain kind of furtive, apologetic feel about them. And I said, "That's crazy, I mean, these are great, let's run with them." And I really tried to turn them into a much more robust business, and I've largely failed. It's fine, we get a lot of traffic on the Web, a lot of people come there, but it doesn't really translate to advertising because nobody wants to advertise to parents who are about to lay down $40,000 [for tuition]. There's just not that good of an audience there. Colleges don't advertise, for the most part. They advertise regionally. The New York Times does a lot better than we do.
Scott Jaschik: I wasn't hearing a number.
Brian Kelly: If I had one, I wouldn't give it to you. We rank a lot of things. We rank hospitals, we rank graduate schools, we rank cars, we have a whole giant data project, it's ingrained in the magazine. We don't break it out and say, "Geez, what did we make on that?" It's integrated in the entire magazine. ... We want to keep the rankings. We like them. We think it's good journalism. There is a business component to it, certainly. ... They are fabulous. Data in general is a fabulous journalistic resource. We are firm believers in it. It is particularly suited to the Internet - the ability to give people enormous amounts of information that they can sort and make choices about on their own.