Saturday, August 12, 2023

AI and the potential demise of higher ed

In newspapers across the state...

I teach an on-line Principles of Microeconomics course every summer. It is “asynchronous”: students work at their own pace (within my deadlines), using the text, a course-management system, short videos from me, and outside resources they find. In such courses, I interact with the students through “forums” to discuss articles; I give them feedback on assignments and tests; and they’re free to schedule Zoom appointments with me (although few do). So, there is little personal contact required—and thus, I barely know them at all. (I’ve also taught “synchronous” courses on-line—with classroom meetings through Zoom—which, with certain teaching methods, can be roughly equivalent to a classroom experience.)

In my Micro course, I use a “mini-paper” as an assignment to help them learn about demand and supply models. It’s a two-paragraph essay about a Wall Street Journal article related to changes in a market. They write a paragraph summarizing the article and a paragraph analyzing the market from the perspective of demand and supply. Then, they add the relevant demand/supply graph.

This summer, for the first time (I think), a student used AI to write the essay. It was well-crafted: too well-crafted, in fact, for 99% of undergraduates. The student's later work was poorly done, making it extremely likely that the essay had been written with assistance. Teachers have access to AI tools that detect plagiarism. But so far, we don't have tools that reliably detect help from friends or from AI.

Students can also Google my test questions or use resources like Chegg to find (attempts at) answers. I change my questions to limit the effectiveness of these efforts. I use time-constrained exams, so they cannot search much and still complete the test. I minimize the course-grade weight on assignments that can be gamed. As a result, those trying to game the system don't learn the material and usually sink themselves halfway through the course. To paraphrase C.S. Lewis: the laziest student often ends up working the hardest in the end.

So, I work hard to limit shenanigans in my courses. I believe that my strategies are effective. But the nature of such things is that they are difficult to detect. I’ve done my best, but maybe students have found work-arounds to my best efforts.

More troubling: How common is this level of diligence among other professors? Does the level of effort differ between regional teaching universities and larger research schools? Probably so. Does the ability to prevent "cheating" differ by field? Maybe English was largely unaffected before the most recent iteration of AI. But now, it has been profoundly challenged by the ability of AI to write good papers without detection.

Here’s what I do know: this is potentially a devastating problem for higher ed. Its reputation and credibility (in the eyes of many) have suffered for a handful of reasons over the last decade. Its mission and value have often been undermined by dual credit and on-line courses. But this new threat could be far more substantial.

Education serves two primary roles: "human capital" (building general and specific skills and knowledge) and "signaling" (a relatively low-cost way to distinguish between those who are likely to be more or less productive in certain fields). If students can game the system effectively, then both the human-capital and the signaling functions of higher ed will be (greatly) diminished.

One response we might expect from colleges soon: transcripts that note which courses were taken on-line. Businesses will probably pressure colleges to provide this information as they grow more concerned about the problem. Many universities will want to squelch it. But others will gain a competitive advantage by advertising how little they rely on on-line courses.

And we can expect the market for technology to evolve as well, providing teachers with AI tools to monitor students' use of AI. This will produce an "arms race" between the two sides of this fascinating market.

AI is not a universal problem for higher ed. For example, some of my accounting colleagues say on-line courses work better because they hold students to deadlines more effectively and the material can be self-taught easily enough. AI also provides efficiencies for some university functions. And of course, college education serves other purposes as well: networking opportunities, socialization, and entertainment. But AI is a significant threat. Over the next decade, it will be interesting to watch the human intelligence of administrators and professors in their dance with artificial intelligence.
