How to cross-examine your Gen AI tools and interrogate the outputs? Sally McLaren’s tips for using AI safely in legal research
Unless you’ve been completely off-grid for the past 12 months or so, you’ve likely encountered the deluge of news, articles, explainers, and enthusiastic LinkedIn posts about the wonders and/or terrors of Generative AI.
If you have been offline and missed it all, then congratulations! It’s been a lot! This bit is for you. The AI savvy/weary may skip ahead:
The term ‘artificial intelligence’ (AI) has been in use since the 1950s and refers to the simulation of human intelligence in machines, enabling them to perform tasks that typically require human-like understanding, reasoning, learning, and problem-solving.
Generative AI (Gen AI) is a type of AI that can create or generate content, such as text, images, or other data, by learning from large datasets and producing novel outputs based on observed patterns. Popular examples include: ChatGPT, Claude and Gemini.
There are probably as many Gen AI evangelists as there are prophets of doom, but in between the two camps is a DMZ populated by many more wary adopters, curious sceptics and AI casuals. It is increasingly unrealistic to think that students, pupils, barristers, or indeed law librarians, won't be using Gen AI. Quite the opposite. Leveraging these new tools is fast becoming a marketable skill. However, as useful as Gen AI can be, there is a significant degree of risk attached to employing it in your studies and practice.
Here are eight tips to help minimise the risks associated with using Gen AI:
Gen AI is just another tool to be leveraged, albeit carefully. Investing time in mastering this new skill and learning more about risks and effective use is key. Explore a curated list of online courses, many of which are free, here.