About a month ago, Cathy Davidson wrote about threats to higher education. While she mentioned several items, from de-funding and a narrow focus on STEM to the persistence of disciplinary silos, she left off an important one — unbundling.
Unbundling, as you may recall, is the process by which things that used to have to be purchased together can now be acquired separately. Unbundling is a major disruptor, as Clay Christensen would put it, and it forces business models to change. The newspaper, which put news, sports scores, recipes, movie reviews, and lots of classified ads in one bundle, has been disrupted to the edge of the abyss as people now go to Craigslist for ads, ESPN for scores, Epicurious for recipes, and Rotten Tomatoes for movie reviews, all on the web. The iTunes store disrupted the musical bundle of the album, allowing people to easily buy just the tracks they want.
Discussions of unbundling in higher education aren’t new. For several years, the edublogosphere has pondered what it will mean if/when teaching is unbundled from assessment, or learning is unbundled from football games and fitness centers. A kind of unbundling that gets less attention is the possibility of curricular unbundling, which might be the biggest threat of all to higher education as we currently know it.
At some point in their academic careers, many journalism majors ask (at least to themselves), “Why do I have to take biology?” Many computer science majors ponder why they are required to take art history. After all, the reasoning goes, these things won’t help on the job market, will they? There is ample evidence that a broad education contributes to employability, particularly given the likelihood of several career changes over a working lifetime. Even so, many students, focused on getting that first job in their field, would take only the classes they believe to be economically relevant but for the fact that curriculum design requires coursework in other areas.
Distribution requirements are older than this notion of workplace flexibility. They date back to a time when the point of a college degree wasn’t finding a job, but becoming a well and broadly educated person. At that early stage, most college students were well-off enough that their employment prospects didn’t depend on their degree.
This mechanism has a beneficial side effect for the institution. It allows courses with lower instructional costs (several hundred people in a lecture hall learning macroeconomics) to cross-subsidize courses with higher costs (sciences, allied health, etc.). Christopher Newfield has written in more detail about how the cross-subsidy works.
Since traditional higher education institutions are the source of most credentials, this arrangement works. For most students, there is no alternative to the traditional degree with its distribution requirements if they want to make headway in the job market. What if that weren’t the case?
What happens when, because of the growth of things like digital badges (sorry, Dr. Davidson), a computer science major can create a “credential” that employers view as valid without having to bother with art history? Here’s my near-worst-case scenario:
- Students, freed to study only what they feel is economically viable, focus narrowly in a way that makes the silos of the traditional university department look positively interdisciplinary.
- Departments that have depended on their place in general ed requirements wither and die.
- Cross-subsidy dries up, and aspiring occupational therapists, etc., have to pay tuition rates tied more closely to the actual cost of instruction in their field. Degree fields that lead to “good jobs” become more expensive, and only those already well off can enter them. The decline of subsidy sharply limits the university’s traditional research and service missions.