By Craig Smith
Generative AI – technology such as ChatGPT that creates content when prompted – is affecting how solicitors, judges and barristers work. It’s also likely to change the work they are being asked to do.
This means that the way lawyers are trained needs to change, too. In education, there can be a tendency to see generative AI as a threat, including as a means for students to cheat. But if lawyers are using these technologies in practice, the training of future law graduates must reflect the demands of the profession.
Lord Justice Birss, a judge of the Court of Appeal of England and Wales specialising in intellectual property law, has described using ChatGPT to write part of a judgment – specifically, to generate a summary of an area of law. Finding the generated content acceptable, Lord Justice Birss described ChatGPT as “jolly useful” and explained that such technologies have “real potential”.
Specific generative AI technologies have been created for lawyers. Lexis+ AI can be used to draft legal advice and communications, and provides citations that link to legal authorities.
And as the use of AI grows, so too will the advice clients seek on AI-related legal issues. Areas of law already well established – such as liability or contract law – could be complicated by AI technologies.
For example, if generative AI is used to draft a contract, lawyers will need to understand how the technology works in order to address any disputes over that contract. The generated draft might, for instance, be inaccurate or lack important terminology.
It would be even more concerning if a legal professional had used generative AI to draft the contract and failed to check it, owing to an over-reliance on the accuracy of the technology.
Adapting teaching
This change in the profession means that law lecturers must also bring generative AI into their teaching. This is necessary to expose law students to the types of situations they will encounter and the tools they may end up using in the profession.
For example, mooting – conducting a mock trial debating a point of law – could incorporate generative AI in the role of a judge, providing real-time feedback to students. Generative AI could also feature in debates, challenging a student’s understanding and position on a topic.
These activities foster not only legal knowledge, but also the practical application of that knowledge and critical thinking, in an environment where generative AI is familiar.
When assessing student knowledge, I give my own law students the option of using generative AI to answer an essay question. This simulates, to a limited extent, how the legal profession might use generative AI in practice. The students must make sure that their response is accurate, so the task demands not only academic rigour and digital skills but also an understanding that generative AI is both an opportunity and a risk.
I also ask students to reflect on their use of – or decision not to use – the technology. This promotes discussion of the ethical and safe use of generative AI, and allows students to examine their use of a tool that is often viewed with suspicion in higher education.
Benefits and risks
Reflection like this is important because there are many potential pitfalls in using these tools. Guidance on artificial intelligence released for judges in England offers a reminder that generative AI is often public, meaning there is little confidentiality for any information entered.
The Bar Council has also supplied guidance for barristers on navigating the growing use of ChatGPT and other similar tools. The Bar Council says that using generative AI tools to enhance legal services isn’t wrong, but it is important that barristers thoroughly understand these tools and use them responsibly.
This reinforces the notion that being legally qualified and trained to practise law is essential, but so too are the digital skills needed to use such technologies.
The legal profession must balance the potential benefits and risks of the widespread use of generative AI, but also make sure that future lawyers have the knowledge to understand it. Law graduates and future legal professionals need to dedicate proper attention to digital skills around AI and the ethical considerations that arise from its use. They’ll need this to navigate the law in a society where the use of AI is already ubiquitous.
Craig Smith is a lecturer in Law at the University of Salford.