Unbundling Higher Ed & Assessment
The growth of alternative education and training providers continues. Companies like Udemy, Udacity, Codecademy, Fulbridge and General Assembly appear to be settling in for the long run and are expected to be a significant component of the expanding learning ecosystem for adults. (See my post on the subject from January 2014, here.)
Critics are beginning to ask how these alternative providers should fit into the regulatory and loan systems – questions raised by Andrew Kelly and Michael Horn in a very useful report, “Moving Beyond College: Rethinking Higher Education Regulation for an Unbundled World.”
Horn and Kelly point to these providers as evidence of the unbundling of higher education. Colleges and universities bring together a wide range of services under one roof: learning, research, housing, career services, social networking, credentialing and more. In contrast, the alternative providers offer a narrow, specific slice of that value – a course on Ruby on Rails, or on digital marketing techniques, or verification of skills, for example.
The authors of the report rightly stress the importance of measuring and reporting on the quality and costs of these new providers as a key step in securing federal aid for students. Reporting on value has been difficult and often political in higher education, though most now recognize the importance of improved information in the hands of prospective learners.
“. . . the logic of the market discipline – where consumers ‘vote with their feet’ by rewarding quality providers with their business – depends on consumers having sufficient information on providers’ cost and quality to make these decisions. The truth, though, is that not all colleges serve students equally well, and it is difficult for students to distinguish the worthwhile investments from the bad ones.” (p. 4)
Assessment of Learning Gains
Reporting on value in higher education has tended to focus on institutional performance as it relates to the student’s successful progression through an institution’s program of study: did the student graduate, how quickly, and did the degree ultimately lead to gainful employment?
These new providers, though, should (and likely will) place greater emphasis on learning gains rather than progress through a program. Students enrolled in narrowly defined educational experiences bring a different set of needs and expectations to the investment; they are more interested in how quickly and effectively they acquire specific skills and knowledge than students enrolled in traditional four-year programs are. Are they able, upon completion, to write code at the level promised by the educational organization? Systems that measure institutional performance are far less relevant here.
This requires a different set of metrics and analytics to measure outcomes. Learning analytics, such as those provided by Acrobatiq, focus on how well students have acquired specific skills and knowledge. This is a different and altogether more ambitious objective, as it calls for careful and rigorous course design, as well as a deep integration of curriculum and software.
Providers like General Assembly or Codecademy would be wise to seek out analytics software and services that can demonstrate the actual learning gains achieved over the relatively short duration of their courses and programs – the kind of evidence demanded by students, regulators and other stakeholders.