Learning GenAI Together (or Teaching an Old Dog New Tricks)

I always end up spending a few weeks every teaching season researching and drafting a fact pattern suitable for a legal memorandum (the final assignment the students have to complete for my International Business Law course). It usually contains various elements that touch upon international sale of goods, carriage of goods (by land and sea), international commercial arbitration, and payment mechanisms, all intertwined with an assortment of societal issues and current trends.

Of course, with the advances in GenAI, the temptation to rely on it to facilitate this (often laborious) task has been growing. Just last year, I lamented the burden of drafting dozens of recommendation letters each year and how ChatGPT might (not) ease my workload. In just one year, however, we’ve witnessed exponential growth, not only in GenAI’s capabilities (e.g. from ChatGPT 3.5 to 4), but also in the number of platforms offering AI-backed services. With the hype surrounding this development, I was excited to apply my limited prompting skills and ask ChatGPT 3.5 to draft a fact pattern for me, and below is what I got in return (much to my disappointment):

Now, if you’re already a “prompting guru” and a master of GenAI, you’ll tell me that my prompt was not specific enough and that I need to learn how to prompt better (and you would be absolutely right on both counts). But herein lies my central question: where does a reluctant boomer with diminishing cognitive capacity go to learn how to use these emerging technologies? The answer was somewhat obvious, given that we (supposedly) use a student-centered, problem-based learning method at our university anyway. So I decided to ask the students in my class to show this old dog some new tricks. I ended up devoting a week’s worth of tutorials to working together with (read: learning from) the students, who had been assigned to draft a legal memorandum (the facts for which I had, once again, to research and labor through on my own).

IBL students working with GenAI tools in class to find answers to their legal memorandum assignment.

During our tutorials, we played around with an assortment of platforms (e.g. ChatGPT 3.5 and 4, Copilot, Consensus, Typeset, Perplexity, Quillbot, etc.), and on each platform (some free, some not), we tried different types of prompts and discovered which platforms were useful for drafting a legal memorandum and which were not. What also became apparent were the differences in familiarity and competence among the students in dealing with GenAI. Some had never really experimented with GenAI before, while others were quite familiar with all sorts of platforms and prompts. (This confirmed the survey results we collected earlier this year from 100+ students across our Faculty of Law, School of Business and Economics, and Faculty of Arts and Social Sciences.)

The general consensus, though, seemed to be that GenAI can be really useful for language edits (e.g. Grammarly) and for some (non-legal) research (e.g. differentiating between lab-grown and natural cordyceps), but not necessarily for in-depth legal research. To be fair, some platforms were better than others. For example, ChatGPT 4 was indeed better than 3.5 (although whether it was worth the €20-a-month subscription triggered a prolonged discussion), and platforms like Consensus or Perplexity returned “more academic” results that produced more usable content for the memorandum. What was interesting to note was that some of the better answers provided by these platforms relied on sources that were already available to the students in their course material, raising the question: would the students have been better off simply reading the course material in the first place, rather than going down the GenAI rabbit hole?

In the long run, however, our reliance on GenAI feels almost inevitable (in some way, shape, or form), and those who cannot use it competently may lose out to those who can. As one student noted in our survey:

It is unavoidable. It is better to learn how to use it than to avoid it. If you don’t act fast you will fall behind. Universities should never ban it. Rather, teach how to use it responsibly.
— Joost Hamers (IBL Student)

I agree with Joost 100%. The problem, however, is that many staff members (myself included) have not been adequately trained to use GenAI, and, perhaps paradoxically, we have to learn from our students who were early adopters. In a way, though, this presents a perfect opportunity to flip the script and have some of the students lead us in the learning process (as we did in the IBL tutorials). For whatever it’s worth, I learned a lot from my students in the process and plan to keep doing this. So, in sum, I am humbled by this learning opportunity, grateful to the students for teaching this old dog new tricks, and, perhaps most importantly, sorry that the case for the memorandum this year is about zombie mushrooms and liquid cocaine. I only have myself to blame (and not GenAI).