From January 2013, as QASA (Crime) starts to roll out across the country, judges will be the evaluators of advocacy in the Crown Court. Here, Counsel looks at the training the judiciary is receiving and the criteria it will have to apply.
An essential element of any quality assurance scheme is confidence in the integrity of the process. When the Quality Assurance Scheme for Advocates (Crime) (‘QASA’) begins to roll out in January 2013, all Crown Court advocates who do trials will be evaluated by the judiciary. How can the judges, given the role of assessing the quality of advocacy in their courts, produce results which are consistent and based on the advocacy itself? To answer that question, and before the scheme goes ‘live’ on any circuit, every judge who has agreed to take part must attend a training seminar.
Training the assessors
There is nothing novel about this in the advocacy assessment world. Before HM Crown Prosecution Service Inspectorate (HMCPSI) launched its thematic review of prosecution advocacy in the criminal courts, those on the assessment team underwent a day’s consistency training. Those tasked with carrying out the CPS’s internal quality assurance scheme likewise underwent consistency training, which included watching taped performances of a fictional case study and discussing their evaluations against the criteria they would be using in their actual assessments.
The judges will be undergoing something similar. It is not a question of sitting them down and teaching them what ‘good’ advocacy is, though it is fair to point out that before 1989 we were brought up with the solemn belief that advocates are born, not made. No one back then was taught advocacy in terms of the elements of good cross-examination or how to construct a speech. That began with the Bar Vocational Course and then spread. In due course we adopted methods of criteria-based feedback in which students and fledgling advocates were told what was undermining their performance, why it was wrong, and were shown how to put it right. We have been doing this for so long now that one forgets that the ranks of advocates and the judiciary are still full of people who never experienced it. Unless they have helped out with training in their Inn or on their Circuit, almost no judge over the age of, say, 48 has had to think about someone’s performance in relation to a set of objective criteria.
Why is this training necessary?
Why should they do so? Transparency. Anyone who is going to be evaluated needs to know that the evaluation is based on something objective and that it is the advocacy itself which is in issue, and not the advocate—not their age, seniority, race, gender, how well or badly they have done in the past, or how well or badly they get on with the judge. That can only happen if the judge uses a stated set of competencies on which to base ‘this is why it went wrong/right, and this is the basis on which I have evaluated everyone else.’ Each trial is a snapshot in itself. Even the most junior advocate can demonstrate the ability to take on more challenging cases, and even the most senior (and QCs are included in the scheme) can reveal some aspect of their advocacy which is not as good as it should be. In the face of the competency standards, everyone is equal.
What are the criteria?
What are the criteria the judges will be asked to use? The Criminal Advocacy Evaluation Form (‘CAEF’) sets out the nine potential standards which judges are asked to consider. Five by definition will arise in every trial; four (e.g. handling expert witnesses) may or may not. For each standard there is a series of competencies, divided according to the level of the case—all Crown Court cases will fall into one of three broad levels, 2, 3 and 4. The criteria, or competency standards, are cumulative: someone at level 3 will be expected to demonstrate all the criteria at levels 1, 2 and 3. Together they add up to the accepted attributes of good advocacy.
Supporters of the scheme
The scheme itself has the backing of the Lord Chief Justice, the Senior Presiding Judge and the Council of Circuit Judges. The training has been devised by The City Law School (CLS) over a period of time, has been tested in pilots, and has received considerable input from the Judicial College, which is responsible for providing induction training and further education for the judiciary. This autumn the training is being delivered to judges on the Midland and Western Circuits. Each judge received an invitation to a seminar at his or her (or a nearby) Crown Court, from the Senior Presiding Judge, from Lady Justice Hallett as Chairman of the Judicial College, and from the City Law School, and was given a choice of dates. On those circuits respectively, 95% and 98% of the judges who will be sitting in crime replied positively. Groups of 8 or 10 will meet with a trainer for a morning. The trainers have all had years of experience in teaching and assessing advocacy, are themselves current or former practitioners, and have attended a training-the-trainers course.
Practicalities and content
The training has been devised so that the face-to-face element takes only half a day. Permission was then given to deliver it on a Monday morning, from 9.30 to 1, before the week’s work begins. This ensures that those participating are not called away to deal with matters in court. The Circuit and Crown Court administrations were involved and consulted so that they could make arrangements for the session and the necessary adjustments to the listings.
The content of the training has been designed to mirror the method generally used in Judicial College training for judges (‘tell/show/do’). There is a brief introduction to the scheme itself, to set the context in which the judiciary will take part. The bulk of the morning is devoted to achieving consistency of approach in applying the criteria. Filmed, scripted performances in a fictional case study are played; the person being evaluated is usually played by an advocate who is not a criminal practitioner. In the first performance the trainer takes on the role of the judge: he or she goes through the relevant criteria, sets out to what extent the advocate in the performance has met them, and states whether the performance is found Competent or Not in that standard. Meanwhile the trainer and the participants fill out the CAEF as would happen in practice.
Thereafter the participants do it themselves. For the second performance they identify the relevant criteria and, after watching the tape, state whether the advocate has met them and why. The group decides whether the advocate was Competent or Not. Again the CAEF is filled out as would happen in practice. For the third performance the judges do it on their own: they watch it, fill out the CAEF and then explain and justify their conclusions to the group. Every evaluation must be related to a competency standard—‘I see this in court every day and we get by’ is not good enough. At the end there is a summary of where the group has got to. There is the further advantage that it all takes place in a small group of colleagues.
A training pack, with the case study, is sent to the judges in advance. After the session they are asked to do ‘consolidation training’, that is, to watch two further performances in the case study, to fill out a CAEF as they did in the face-to-face training, and to return it to CLS for feedback.
Judges are not expected in practice to go through every criterion for every performance by every advocate who appears in front of them. The seminar teaches the importance of grounding their conclusions in objective standards. In due course they will internalise the criteria, which will become part and parcel of evaluating advocates under QASA, in the same way that advocacy trainers internalise the elements of good advocacy and use them when constructing their ‘headlines’ in Hampel or NITA feedback.
Detailed information about the scheme is available on the QASA website, www.qasa.org.uk. Judges are encouraged to feed back on how the scheme is working from January 2013.