Thursday, May 17, 2018

What’s the difference between an identity-verified certificate and course credit?

It seems like an obvious enough question. Imagine arriving at Walmart and being presented with a pre-filled shopping cart, priced far beyond what you wanted to spend and containing plenty of items you plainly didn’t want. Alternatively, you could take your own cart, walk up and down the aisles, select the merchandise yourself, and pay 1/100th of the pre-filled price for the products you actually wanted.
[Image caption: Everyone looks up to you when you're pushing this shopping cart]

Sounds like a ridiculous choice, right? But try to imagine that the pre-filled shopping cart is the socially prized way to shop. You had to compete through a tortuous and uncertain process for the right to purchase it, and your family and community congratulated you when you got that chance. It represents belonging to something. It comes with a bumper sticker and a sweatshirt that tell everyone you’ve got this cart. And it’s so overpriced and laden with unnecessary extras that it seems like only people of a higher caliber push this kind of cart around.

If you’re the parent of a 16-18 year-old, you know what I’m talking about.

Not that à la carte shopping in higher education is anything new. Udemy was founded in 2010, Udacity in 2011, and edX, Coursera and FutureLearn in 2012. Identity-verified course certificates soon followed, along with subscription-based specializations. Technology that tracks students’ eye movements promises to improve both student engagement and identity assurance. The potential unbundling of how young people study, explore and certify is more than six years old, yet the preference for the pre-filled shopping cart stubbornly remains.

Let’s consider some prices. The University of Michigan offers a 5-course specialization in Applied Data Science with Python on Coursera. With each course fairly intensive and requiring about 4 weeks, let’s assume the equivalent of 8 “credit hours” for the whole specialization. Coursera offers the first 7 days free, then charges $49 per month to continue. So that’s just under 250 bucks. On campus in Ann Arbor as a part-time undergrad in the Computer Science Department, you’d pay $6,832 as a Michigan resident and $17,880 as a non-resident for those 8 credits (you’d save only slightly by enrolling full-time). MIT, the incubator of edX, has put a huge stock of courseware onto MIT OCW for free, but 8 credit hours physically in Cambridge, MA would cost just under $25,000. The student could also have chosen Codecademy, HackerRank, or Udemy as pathways to mastering Python and machine learning, making great strides for well under $100.
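Since the specialization in question teaches Python, it seems only fair to check the arithmetic in Python. Here is a minimal back-of-the-envelope sketch using the 2018 figures above; the assumption that Coursera bills in whole months after the free week is mine:

```python
import math

# Back-of-the-envelope comparison using the prices quoted above.
courses = 5
weeks_per_course = 4            # each course "fairly intensive", ~4 weeks
monthly_fee = 49.00             # Coursera's rate after the first 7 free days

total_days = courses * weeks_per_course * 7    # 140 days of study
paid_days = total_days - 7                     # the first week is free
months_billed = math.ceil(paid_days / 30)      # assume whole-month billing: 5
coursera_cost = months_billed * monthly_fee    # $245 -- "just under 250 bucks"

campus_cost_8_credits = {
    "U-M (Michigan resident)": 6832,
    "U-M (non-resident)": 17880,
    "MIT": 25000,               # "just under $25,000", rounded for simplicity
}

print(f"Coursera specialization: ~${coursera_cost:,.0f}")
for school, cost in campus_cost_8_credits.items():
    print(f"{school}: ${cost:,} (~{cost / coursera_cost:.0f}x the price)")
```

Run it and the MIT line comes out at roughly 100x, which is to say the 1/100th shopping cart at the top of this post is not much of an exaggeration.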

The continuing preference of families/government/large employers/schools/communities that teens be channeled toward a 4-year undergraduate degree based on competitive selection, full-time enrollment, and single-institution loyalty seems like an irrational consumer decision. But its persistence shows how powerfully social cues drive our behavior. The fact that the University of Michigan, Yale, Harvard, MIT and scores of other high-priced universities give away their course content free or nearly free indicates how strongly they expect young people and future employers to differentiate an online, identity-verified course certificate from “credit”, and that from “degree-granting credit”. High schools lay the groundwork for this distinction with Advanced Placement tracks, usually a very narrow selection of courses promising potential college “credit” if students perform well on exams. The same credibility is not extended to an identity-verified certificate from any of the thousands of online university courses a teen might take. The message is clear: those courses “don’t count”.

This view is underscored in American education policy. A 2017 Brookings study examined high-performing secondary students in a sample of states where high schools send students to community colleges for advanced classes not available on their premises. The report concludes that the additional cost is not warranted, and that “the public cost for a high school student to take a three-credit class via dual enrollment was actually higher than if the student waited to complete high school and took the same three-credit class once she got to college”. The use of the term “credit” as a presumed currency deserves greater attention. Why didn’t the researchers compare the cost of students walking down the hall to an open room with wifi, logging into edX accounts, and taking any one of thousands of university courses, or tutorials on Udemy or Udacity for that matter? The policy implications go further. Families can apply 529 educational savings distributions only to “for-credit” tuition costs (and, since January, up to $10,000 per year for K-12 school tuition). FAFSA assesses need-based financial aid only for “for-credit” enrollment. Student loans are defined, and their interest made tax-deductible, only when applied to “for-credit” study. “Credit” is more than just a word. It has become the underlying justification for the irrational, overpriced product bundle that is the “undergraduate degree”.

Bryan Caplan’s January 2018 critique of the 4-year undergraduate degree should be required reading for parents, counselors and teens. He takes on the well-guarded myth, repeated ad nauseam in college promotional materials, that college is a transformative experience that turns young people into critical thinkers or instills a lifelong love of learning. University promoters talk about some kind of alchemy in the classroom where smart young people and brilliant professors get together and “something magic happens”, as one university president put it. But Caplan presents embarrassing data on how little effort most students put in, how little is remembered, how little is applied later, and how overwhelmingly more stock is placed in the diploma paper than in the entire process.

Understanding our strange preference for bundling forces us to ask a few things about our motivations for learning in general. Do consumers find utility in the incremental process itself: the conversations, ideas, histories, methods, problems? If we saw utility in each of these things, we would want to assign value accordingly. We would be like cable customers who really want to know whether phone, data and television plans serve them better individually or all together. After all, who would sign up for a cable bundle they knew cost more than all the individual parts?

But then again, maybe if we saw value in these things, our teens would already be on Udemy and edX. We’d be inside of libraries more often. We’d be making and doing things that didn’t award “credit”.

But maybe many of us aren’t sure about the value of the incremental learning process, and that’s precisely why we continue purchasing bundles. If that’s the case, then the sweatshirt, the bumper sticker, the camaraderie of other elite shoppers, and ultimately resting our heads on our pillows at night knowing that our confused/unambitious/uncertain/wandering teen has “made it”, are really what we’re after.

[Image caption: It feels good to have a college sweatshirt in this group.]

And that’s OK.

As long as I’m not the one paying for it.
