Julie Ahadi responds to thought-provoking feedback on the use of AI in pupillage and explains why we should all be role-playing how the Bar might be affected
I’ve received some interesting feedback on my Counsel article on AI at the Bar (‘Living up to the hype? AI in chambers’, Counsel, November 2024). One email in particular caught my attention: from Joel Semakula, a barrister at Landmark Chambers. He made me stop and think not only about the answers to his questions, but about the importance of getting into the habit of asking pointed, profession-specific questions about the impact of AI in the first place, and of having the curiosity to play out different types of scenarios. Thinking about and role-playing which areas across chambers might be affected by AI, and how, is just as important as having any solid answers at this stage (if not more so), especially as things move so rapidly.
Our own Chambers’ alumnus and Prime Minister, Sir Keir Starmer KCB KC MP, made a major speech in January on the government’s preparation for the impact of AI on all our lives. In the last few weeks there has been an explosion of predictions about AGI (artificial general intelligence) and ASI (artificial super intelligence). The pace at which these next-level AI systems are actually adopted and integrated into society will likely be much slower than their theoretical capabilities suggest, but asking probing questions, and playing out the answers among ourselves within and across our respective chambers, seems the least we should be doing as an act of collective AI-preparedness. Getting across this topic and sharing information and views is important to protecting the future of the profession at large.
So to those questions that Joel put forward – each of which (and variations of them) I believe is crucial for a chambers to be asking. Disclaimer – my answers are only as good as those of the next person who takes an active interest in this field, trials AI tools and binge-listens to AI-related podcasts. Mastery of this topic relies on immersion and is open to anyone with the desire to take an interest.
The use of generative AI in legal work is inevitable and, when used effectively, it can be undetectable. Pupils and junior tenants may not openly disclose their use of AI unless specifically asked, and clients are unlikely to object if the final product is of high quality. The critical challenge lies not in whether people will use this technology, but in how it is used. Understanding the risks and fostering a commitment to responsible usage are essential.
Chambers and their pupillage committees have a proactive role to play in this context. Overreliance on generative AI, particularly among pupils, risks bypassing the development of fundamental legal skills. To address this, chambers may wish to emphasise that generative AI should be used as a tool to augment, rather than replace, critical thinking, legal reasoning and drafting expertise. A balanced approach will not only mitigate risks but also equip future barristers to leverage this technology responsibly and effectively in their practice.
The question of whether AI use should be encouraged is a nuanced one. While AI can offer tremendous benefits, including increased efficiency and accuracy, it also presents several risks and challenges that must be carefully considered. Different chambers and practice areas will face unique issues, and decisions around AI adoption must account for varying levels of risk appetite and understanding. This technology is still in its infancy, and the full extent of potential risks is not yet clear. There are also notable barriers to access, such as cost, IT literacy and the time required for barristers to become proficient. Therefore, while curiosity and experimentation with AI should be encouraged, it is crucial to approach its use with a balanced perspective that weighs both the potential benefits and the risks.
Chambers can play an active ‘educator’ role by circulating existing industry guidance, such as the Bar Council’s Guidance on Generative AI for the Bar, which, as of now, is about 12 months old and likely due for an update. To go further, chambers should consider developing their own tailored guidelines, reflecting their specific priorities and values. By defining clear principles for safe and effective AI use, chambers can foster innovation while ensuring that members uphold professional standards and preserve core legal skills.
In the not too distant future, I predict that pupils and junior tenants will need a blend of traditional legal expertise and modern technological skills to thrive. The fundamentals of legal practice – critical thinking, advocacy, drafting and legal research – remain essential, but will likely be complemented by an understanding of how to effectively and responsibly use tools like generative AI.
Asking an advanced AI model like ChatGPT this very question can yield thoughtful insights – a testament to how accessible and transformative these tools have become. (A ChatGPT subscription giving access to GPT-4, for instance, is remarkably cost-effective.) However, the real challenge isn’t in generating answers; it’s in taking those insights and translating them into actionable strategies with clear agreement, planning and champions to drive implementation.
Clinging to tradition for tradition’s sake – continuing to teach and work as things have ‘always been done’ – risks stagnation. If AI demonstrably improves quality, accuracy and speed while maintaining client confidentiality and reducing costs, resisting its adoption could leave chambers outpaced and even irrelevant. This does not cancel out the need to approach its adoption with caution, however, and to take the time to assess its risks against its benefits. To remain competitive, chambers must strike a balance: preserve core legal skills while cautiously embracing the advantages AI offers as part of a forward-looking strategy.
The penny only really drops when people see the capabilities for themselves and work out how this could be a benefit or a threat. No amount of conceptual advice or second-hand case studies lands as well as seeing an AI tool in action for yourself and experimenting thereafter. The pace of change is going to be unsettling, so getting to grips with the basics now is a good investment, and something a chambers should put its weight behind. Having ‘humans in the loop’ to help guide this is really helpful, but barristers cannot offload or outsource this learning curve entirely.
Grassroots peer-to-peer support and training internally are key; get a few champions leading the conversation. Collaboration with other chambers, the Bar Standards Board, the Bar Council and third parties with knowledge of operating in a chambers environment is going to be important. Again, putting this question in as a prompt to an LLM (large language model) will give a good framework for consideration.
Probably. Ask an LLM. Get a first draft of ideas. LLMs are fantastic at brainstorming, but you then need a discerning evaluation of those ideas, plus an action plan and volunteers to bring them to life – this is where the human still plays a vital role.
Here are a few examples:
You need to start asking them. If your CEO or senior leadership run client listening exercises, for instance, add this question. Get clerks to ask too when the opportunity presents itself. Some of the solicitors our barristers work with have built AI into their workflows.
That’s a big question for which I could go off on several tangents, not least as generative AI is just one facet of the AI revolution. However, for a chambers over the next year, key things on my radar include:
There is much more to discuss, many questions to ask and lots of ideas from barristers and chambers professionals to contribute. If the conversation has begun and people are engaging, your chambers is making progress.
‘Living up to the hype? AI in chambers’, Julie Ahadi, Counsel, November 2024
Bar Council, Guidance on Generative AI for the Bar, January 2024
‘AI: the five biggest risks for barristers’, Sam Thomas, Counsel, October 2024
‘Pupillage special: Using AI safely’, Sally McLaren, Counsel, September 2024