The goal of most college-goers is to find work that is meaningful and pays well enough for a middle-class life. (There are, of course, other goals as well.) And getting a “good job” rather than just any job may have more to do with social capital—being part of and using social networks, knowing the ropes of job application and interviewing protocols, being aware of and armed with strategies to counter racial, class, and ethnic discrimination—than with where you go to college and what you know about college rankings. All the information in the world does not make up for an empty checking account, bad credit, or a weak high school education. But rankings continue to proliferate.
Last week, the U.S. Department of Education released the long-awaited redesigned College Scorecard. It was announced as an alternative to the promised college rankings, a plan that was met with protest from the higher education sector. According to the lead researcher who built the redesigned Scorecard, the purpose is “to provide the clearest, most accessible, and reliable national data on college cost, graduation, debt, and post-college earnings…and to highlight colleges that are serving students of all backgrounds well.” A great idea with an important purpose, but if you read the serious press about it, you will quickly discover that while this is a step in the right direction in terms of holding colleges accountable for outcomes, the data available to the Department of Education are still too limited to do the job needed.
For example, according to the American Association of Community Colleges, 55 percent of community college students are not covered in the data because they are not first-time, full-time students but attend part-time or stop in and out of enrollment. The loan repayment rates cover students who entered with federal grants and loans, but do not take into account whether they graduated or whether they received non-federal aid. The salary data are based on students who had a federal loan or grant in college, not on all students. These are the numbers to which the federal government has access, but unfortunately, because the data are incomplete, the results can be misleading. And although it is not the government’s fault, the people who most need this information may not understand the limits of the Scorecard data—I had to go to the technical paper to understand the metrics—or, more likely, may not use it at all.
The New York Times has just released an updated index that is somewhat better than the Scorecard in that it ranks diversity using the percentage of students who received Pell grants and graduated, not just enrolled, though its sample is limited to “179 top colleges.” The Times index is based on “the share of students receiving Pell grants (which typically go to families making less than $70,000); the graduation rate of those students; and the net cost after financial aid that a college charges low- and middle-income students.” And the civil-rights-oriented Education Trust just released a study comparing graduation rates of students with and without Pell grants at a range of four-year institutions, showing that while average graduation rates are not many points apart (good news), some institutions show, for example, a 20-point difference in graduation rates when compared with similar institutions. Not exactly a scorecard, but similar in intent.
Washington Monthly has an index that ranks institutions by their “contributions to the public good in three broad categories: Social Mobility (recruiting and graduating low-income students), Research (producing cutting-edge scholarship and PhDs), and Service (encouraging students to give something back to their country).” These are noble and important intentions. The University of Texas at El Paso (UTEP), a Hispanic-serving university I am currently writing about, ranks #10. I’m profiling its stunning results with early college students who complete their Associate’s degrees in high school and go on to earn their BAs at ages 18 and 19. There are over 1,100 such students. But UTEP has a graduation rate of around 38 percent—problematic in absolute terms, though it outperforms Washington Monthly’s prediction for its demographics by 10 percentage points. The president of UTEP, Diana Natalicio, just received a well-deserved award from the Carnegie Corporation for her outstanding leadership—she has been at the helm for over 25 years. Having recently visited, I saw firsthand the power of long-term, strong leadership in a community with many needs.
There are also all the other rankings—U.S. News, Forbes, the Chronicle of Higher Education, and Money magazine—that take into account many factors besides retention, graduation, and salary information. It’s the Wild West out there, but that does not mean the information is not useful. I read it, as do my colleagues who are experts in understanding and improving higher education. It is just hard to figure out what importance it has for those who are most at risk economically and least able to deal with the daunting costs of college and the challenge of finding a good job with or without a degree. So, is more information what’s needed?
Economic immobility in the U.S. is in a steady state—about 40 percent of those in the bottom income quintile have remained “stuck” at the bottom since the 1970s. While the goals of ranking systems are noble—no one is against more transparency about costs and other factors to help families make decisions about college—the rankings may obscure the very problem they seek to illuminate: those who aren’t savvy consumers need better academic preparation, earlier exposure to the range of possible careers, better financing, and a better understanding of the value of education beyond projected average income. There is an implicit message between the lines of the rankings that can do harm: that all you need is a college degree and the rest—a job with a career ladder, entrée into the middle class—will follow. The American dream is hard to realize for the majority of Americans, and getting a college education, while extremely important, does not always have a happy ending.