Thoughts on USNWR's Plans to Rank Online Education Programs
How Lemony Snicket captures my feelings about USNWR's planned expansion of its rankings empire into online education...
A featured character in children's author Lemony Snicket's A Series of Unfortunate Events books is Esme Squalor, who bills herself as her city's "sixth most important financial advisor" - a fact which is utterly irrelevant to the storyline and most everything else. This pithy bit of satire sums up my opinion about rankings.
Unfortunately, US News & World Report's plan to rank online education programs will be far more consequential; it will surely find appeal in our ranking-crazed society, as exemplified by USNWR's current college rankings and even more asinine ranking systems such as the one which rules Division I college football, the Bowl Championship Series or BCS (pronunciation reminder: the "C" is silent).
One commenter to the Chronicle of Higher Education article which reported this initiative recommended not blaming USNWR for doing this since they are merely a "symptom" of the need for more information about the quality of online education. This need is real, and ventures such as College Choices for Adults, geteducated.com's listings, and the Sloan-C Quality Checklist Scorecard are starting to fill this void in a variety of ways. Unfortunately, this planned USNWR initiative is also a symptom of something else: the profit motive and its distorting effects at work. A quick glance at USNWR's "University Directory" makes it pretty clear that profit is their motive, which only makes sense given that USNWR's status as an actual news organization is questionable at best; after all, they've gotten out of the print news business altogether, and their rankings empire has become their bread-and-butter. As their marketing materials state,
"And because U.S. News & World Report is the leading ranking resource for anyone seeking an on-campus or online degree programs, you know you’re getting the best advice and information available for your on-campus or online education."
Although I would never consider USNWR to have the best available advice for online or any other type of education, many other conditioned consumers will feel and respond differently. Indeed, its clout and reputation in this area are precisely what USNWR is banking on -- which is also the problem, because their track record in ranking institutions and programs is dubious at best. Their rankings have been rightly criticized for a host of reasons: being based almost exclusively on "fame, wealth, and exclusivity," using highly manipulable criteria, and falling way short of providing a comprehensive institutional picture, just to name a few. Also, as my colleague Mark Halsey of Virginia Tech notes, evaluating programs as individual, stand-alone units ignores an institution's overall assets and strengths and the synergies they create, which also distorts the rankings.
Beyond that, ranking systems create an illusion of precision that belies their reliance on subjective criteria, which ultimately produce highly questionable results. When the University of California Santa Barbara is ranked the 29th best university in the world but only the 39th best "National University" in the US, or when Traverse City, MI is ranked the 2nd best place to live in the US (or is it really Honolulu? Or Ogden, UT?), it's pretty clear that one can make rankings do pretty much what one wants.
So as with USNWR's other college ranking systems, there is much potential for mischief and little potential for anything good to come of this venture. Creating a comprehensive listing of online education programs with USNWR's reputational clout might help online education's visibility -- but the prospect of USNWR promulgating standards or assessing the quality of online education programs may more than wipe out the benefit of a more visible online program listing. As Russ Poulin commented, ranking systems can't capture individual students' needs, and as John Bear noted, the proposed ranking system will increase pressure on online education programs to warp themselves to fit external ranking criteria rather than to improve based on their own practice-generated experience and expertise. Other system-induced distortions are equally predictable: institutions will game the system by discovering its quirks and instituting practices which help their rankings irrespective of their actual value; the rankings will eventually stabilize and change little; and the system will become institutionalized and self-legitimizing in the process, as has happened with the current USNWR ranking system.
Moving into the online education space also creates new opportunities for mischief. With so many for-profit institutions in the higher education space, will this new rankings game become even more market-driven, with for-profits playing much more strongly and with much higher stakes on the (bottom) line? That other higher-stakes ranking system, the BCS, offers a cautionary tale: since its inception, one BCS winner (University of Southern California) has had to vacate its title, another (Ohio State University) is under investigation, a third (Auburn University) was under severe scrutiny, and the entire enterprise is viewed with increasing suspicion. No wonder -- there is big money involved, and the BCS is structured to legitimize and institutionalize the flow of athletics-generated dollars to its most influential members (the "Big Six" conferences, as Boise State or Texas Christian University fans will be more than happy to tell you). While USNWR has its "scientific formula," the BCS has its computer rankings, both of which are susceptible to degradation by GIGO (garbage in, garbage out).
Some may argue that the use of reasonable criteria will produce a useful set of rankings, but in fact this is where the real mischief will likely happen, if the planned criteria as described in USNWR's announcement are any indication:
"Each survey will be organized by seven subject-specific sections: background, admissions, tuition, course delivery, faculty, retention and graduation rates, and career outcomes."
Lots of opportunity for GIGO mischief here -- will a higher percentage of tenured faculty vs. adjunct faculty be seen as better? Will higher regionally-adjusted faculty pay be seen as better (as USNWR does with its current methodology)? Will higher retention/graduation rates be seen as better, disfavoring those institutions which serve stopouts and swirlers?
"The questions used to determine the rankings will partially resemble, when appropriate, those asked in U.S. News's annual Best Colleges and Best Graduate Schools surveys."
USNWR's current questions are based largely on reputation and other forms of "fame, wealth, and exclusivity," while online education programs have historically been defined by access provision. How will they decide which questions are appropriate and which ones are not? This has distortion written all over it.
Also included are some evaluative questions measuring online student engagement and determining the academic integrity of the online education process.
More opportunities for mischief -- the premier initiative for measuring student engagement, the National Survey of Student Engagement, fastidiously avoids rankings precisely because they distort perception and reality. And USNWR is only considering regionally accredited institutions and doesn't bother with this measure for campus-based academic programs, so why is "academic integrity" even on the radar screen, and how the heck will they measure it? This is a marketing criterion designed to ease the concerns of prospective consumers; it's not a quality criterion.
It will be interesting to see what unanticipated consequences will ensue from this venture (because I am resigned to its inevitability). For example, what will happen to online education's reputation if the top-ranked programs turn out to be a very different list from the list of campus-based programs? Will this cause online education's reputation to suffer ('what did you expect, it's online education')? Will it help some high-quality schools to gain some much-deserved visibility, or will it further legitimize a few who learn how to play the game the quickest?
Still, there might be some ways to make something sweet out of this pile of lemons looming on the horizon. Here are two suggestions:
1) Make the ranking criteria explicit, visible, and transparent, including the standards behind each criterion. Currently USNWR makes the criteria in their methodology explicit, for instance for their undergraduate rankings, but the "scientific formula" for determining the ultimate rankings is proprietary and thus not transparent.
2) Allow consumers to make their own rankings. This has happened with the long-established "Places Rated" system and its copycats; instead of merely accepting that Pittsburgh is the most livable city in the US (or is it Tallahassee?), systems are now being developed which allow consumers to create their own Places rankings based on criteria they get to choose. A more sophisticated version would allow consumers to give various weights to different criteria, or even customize the criteria themselves when appropriate.
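The consumer-weighted approach in suggestion 2 is simple enough to sketch. Here is a minimal illustration in Python, using entirely hypothetical program names, criteria, and scores (USNWR's actual criteria and scales are not public); the point is only that the same data yields different "rankings" depending on whose weights are used:

```python
# A minimal sketch of consumer-weighted rankings. Program names, criteria,
# and scores below are hypothetical illustrations, not real data.

def rank_programs(programs, weights):
    """Rank programs by a weighted sum of criterion scores.

    programs: dict of program name -> dict of criterion -> score
    weights:  dict of criterion -> weight chosen by the consumer
    """
    def weighted_score(scores):
        # Criteria the consumer didn't weight contribute nothing.
        return sum(weights.get(criterion, 0) * value
                   for criterion, value in scores.items())

    return sorted(programs,
                  key=lambda name: weighted_score(programs[name]),
                  reverse=True)

# Two consumers, the same underlying data, different priorities:
programs = {
    "Program A": {"tuition_value": 90, "faculty": 60, "outcomes": 70},
    "Program B": {"tuition_value": 50, "faculty": 95, "outcomes": 85},
}
cost_conscious = rank_programs(
    programs, {"tuition_value": 3, "faculty": 1, "outcomes": 1})
prestige_minded = rank_programs(
    programs, {"tuition_value": 1, "faculty": 3, "outcomes": 1})
# The cost-conscious consumer ranks Program A first; the prestige-minded
# consumer ranks Program B first -- same data, different #1.
```

A single published ranking amounts to USNWR picking one set of weights for everyone; letting consumers supply their own weights makes the subjectivity explicit instead of hiding it behind a "scientific formula."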
I'm not holding my breath that USNWR will implement either of these, but doing so might result in a tool which would actually be useful to consumers.
More likely is that USNWR's online education program ranking system will produce a lot of misinformation, or simply encourage consumers to use a snap decision-making process which assumes that a higher program ranking is actionable information. Such decisions will be based on the equivalent of current factoids such as Washington & Lee University being the 14th ranked National Liberal Arts College, which is about as useful as knowing that Esme Squalor bills herself as the sixth most important financial advisor in her city. Unless, of course, one truly believes that she is notably superior to the 7th and 17th and 27th most important financial advisors in her city...