IN THE HIGH COURT OF JUSTICE
ADMINISTRATIVE COURT
Royal Courts of Justice
Strand
London WC2A 2LL
Before:
MR JUSTICE WALKER
Between:
-------------------------
Ealing London Borough Council
Claimant
and
The Audit Commission for Local Authorities and
the National Health Service in England and Wales
Defendant
-------------------------
(Transcript of the Handed Down Judgment of
Smith Bernal Wordwave Limited, 190 Fleet Street
London EC4A 2AG
Tel No: 020 7421 4040, Fax No: 020 7831 8838
Official Shorthand Writers to the Court)
-------------------------
Andrew Arden QC and Jonathan Manning (instructed by Eversheds) for the Claimant
Richard Gordon QC and Maya Lester (instructed by R. Hamilton) for the Defendant
Solicitor for Audit Commission
-------------------------
Judgment
MR JUSTICE WALKER:
The defendant, which I shall call the “Audit Commission”, is a statutory body created by the Audit Commission Act 1998. Section 99 of the Local Government Act 2003 imposes duties on the Audit Commission as follows:
“(1) The Audit Commission must from time to time produce a report on its findings in relation to the performance of English local authorities in exercising their functions.
“(2) A report under subsection (1) must (in particular) categorise each English local authority to which the report relates according to how the authority has performed in exercising its functions.”
By section 99(7) “English local authority” is defined so as to include the claimant, which I shall call “Ealing”.
The Audit Commission notified Ealing on 13 December 2004 that it had categorised Ealing as “weak”. In these proceedings Ealing challenges that categorisation, complaining that it arose from a “pre-ordained and binding rule that an authority rated ‘zero star’ by the Commission for Social Care Inspection could not in any circumstances be placed in a higher category.”
The categorisation process
In order to perform its duties under section 99 the Audit Commission, after consultation, published a document entitled “Comprehensive Performance Assessment Framework 2004.” In this judgment I abbreviate “Comprehensive Performance Assessment” to “CPA”, and I shall call this document the “2004 CPA Framework.” Features of the 2004 CPA Framework include the following (emphasis in the quotation below is as in the original):
The CPA framework brings together judgements about:
• core service performance in education, social services, housing, environment, libraries and leisure, benefits, and use of resources; and
• the council’s ability measured through a corporate assessment.
…
Each of the individual service judgements and the use of resources judgement are awarded a score of 1 to 4, with 1 being the lowest score and 4 being the highest. These are then combined into an overall core service performance score of 1 to 4.
Each of the themes scored within the corporate assessment (ambition, prioritisation, focus, capacity, performance management, achievement of improvement, investment, learning and future plans) are also awarded a score of 1 to 4. These are then combined to reach an overall council ability score ranging from 1 to 4.
The overall CPA category (‘excellent’, ‘good’, ‘fair’, ‘weak’ and ‘poor’) is reached by combining the overall core service performance and council ability scores in the form of a matrix (see below). Where a council has not achieved a specified level of performance on education, social care or financial management (or scores a 1 on any other service), rules apply which limit a council’s overall category, see paragraphs 29 - 30.
                              CORE SERVICE PERFORMANCE
                    Scores    1        2        3            4
COUNCIL ABILITY     1         poor     poor     weak         fair
                    2         poor     weak     fair         good
                    3         weak     fair     good         excellent
                    4         fair     good     excellent    excellent
Rules
Rules limit a council’s overall CPA category where a council’s score falls below a specified level on education, social care or financial standing, or scores a 1 on any other service.
The rules are as follows:
• [Rule 1] A council must score at least 3 (2 stars) on education, social services star rating, and financial standing to achieve a category of ‘excellent’ overall;
• [Rule 2] A council must score at least 2 (1 star) on education, social services star rating, and financial standing to achieve a category of ‘fair’ or above; and
• [Rule 3] A council must score at least 2 (1 star) on all other core services to achieve a category of ‘excellent’ overall.
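The matrix and the three Rules together describe a mechanical decision procedure. It might be sketched as follows; this is an illustration only, not the Audit Commission’s own implementation, and the function and variable names (and the treatment of star ratings as scores of 1 to 4, where a score of 2 equates to 1 star) are assumptions drawn from the quoted text:

```python
# Sketch of the 2004 CPA categorisation logic quoted above (names
# illustrative). Matrix rows: council ability score; columns: core
# service performance score.
MATRIX = {
    1: {1: "poor", 2: "poor", 3: "weak", 4: "fair"},
    2: {1: "poor", 2: "weak", 3: "fair", 4: "good"},
    3: {1: "weak", 2: "fair", 3: "good", 4: "excellent"},
    4: {1: "fair", 2: "good", 3: "excellent", 4: "excellent"},
}
ORDER = ["poor", "weak", "fair", "good", "excellent"]

def cap(category, ceiling):
    """Return whichever of the two categories is lower in the order."""
    return min(category, ceiling, key=ORDER.index)

def cpa_category(council_ability, core_service,
                 education, social_services, financial, other_services):
    """All inputs are scores of 1 to 4 (a score of 2 equates to 1 star)."""
    category = MATRIX[council_ability][core_service]
    # Rule 2: below 2 on education, social services star rating or
    # financial standing caps the overall category at 'weak'.
    if min(education, social_services, financial) < 2:
        category = cap(category, "weak")
    # Rule 1: below 3 on those three areas bars 'excellent'; Rule 3:
    # below 2 on any other core service also bars 'excellent'.
    if min(education, social_services, financial) < 3 or min(other_services) < 2:
        category = cap(category, "good")
    return category
```

On this sketch, scores of 3 on each of core service performance and council ability would give “good” from the matrix, but a zero star social services rating (a score of 1) would trigger Rule 2 and cap the category at “weak”.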
The 2004 CPA Framework built on work which had been done by the Audit Commission prior to the Local Government Act 2003. The witness statement of Joanna Killian gives a helpful explanation of the Audit Commission’s role, and of the CPA scheme. It is convenient to set it out here:
“5. The AC [i.e. the Audit Commission] is an independent public body responsible for ensuring that public money is spent economically, efficiently, and effectively in the areas of local government, housing, health, criminal justice and fire and rescue services. The AC helps those responsible for public services to achieve better outcomes for citizens, with a focus on those people who need public services most.
6. The AC assesses the overall performance of every council in England, through the CPA scheme, which looks at, amongst other things, how well the council delivers services such as education, social care and housing. CPA is operated by the AC, working closely with the specialised services of inspectorates (which Ealing of course accepts that the AC is entitled to do). Its purpose is to ensure that local authorities are performing well in providing services, including by designing ways to improve services where weaknesses are identified, and have the corporate ability to deliver continuous improvement.
7. The AC has undertaken a programme of local authority inspections since 2000. The CPA scheme was first announced in 2001 in the Government’s White Paper entitled ‘Strong Local Leadership – Quality Public Services’, which incorporated a number of the AC’s recommendations for the creation of a “national framework of standards and accountability” and “regular comprehensive performance assessments for all councils, identifying how they are performing against these standards”. The White Paper was implemented by the Local Government Act 2003, which includes the duty at section 99 which is the subject of the present challenge.
8. The 2001 White Paper set out the basic features of the CPA scheme. The AC would work with specialised inspectorates to bring together evidence “gathered from a wide range of different assessments” including “Ofsted, the Social Services Inspectors, the Benefit Fraud Inspectorate, and other service based inspections and assessments … The result will be a balanced scorecard compiled by the AC with assistance from other inspectorates and bodies with an assessment role, and working with councils themselves”. The information gleaned from CPA would lead to targeting of councils’ weaknesses and needs, public information about councils’ performance, and early action to tackle it.
9. As for the performance rating for social services, the White Paper said “Comprehensive performance assessment builds on the development of social services performance ratings, to be published for the first time in Spring 2002-03. The social services performance stars will provide judgements of performance for social services in a way that is understandable for the service users and the general public. Social services performance assessment brings together evidence from indicators, inspections and in-year monitoring … As well as the single star rating for overall social services performance, judgements on services for children and services for adults will be presented. Judgements will be made on the basis of current performance but will also include prospects for improvement … The social services performance ratings will feed into the comprehensive performance assessment for all local authority services”.
10. The Government implementation plan for the White Paper made it clear that it was for the AC, working with the other inspectorates, to develop and implement CPA. During 2001-2002 the AC embarked on an extensive series of consultations with local authorities as to the methodology that should be used to undertake CPA. The Commission held a number of conferences and seminars for local government representatives. It also brought together leading experts in the field of local government (Chief Executives, leading politicians, academics and professionals) to formulate and debate aspects of the emerging policy. The focus of this activity was the consultation document entitled ‘Delivering Comprehensive Performance Assessment’.
11. This comprehensive and exhaustive consultation resulted in a framework entitled “The final CPA assessment framework for single tier and county councils 2002”. It was clear to the AC that in order to fulfil its remit to report on local authorities’ functions and performance it would have to gather together assessments made by expert bodies (such as Ofsted, CSCI and so on). This was fully in line with Government policy in the White Paper as set out above. Moreover, it would have to make a number of carefully considered policy decisions (after consultation with relevant parties) as to how best to make categorization decisions, and which functions of local authorities should have most weight in those decisions.
Paragraphs 3.18 and 3.19 of the White Paper were as follows:
Evidence on councils’ performance is currently gathered from a wide range of different assessments. The comprehensive performance assessments will draw these together. Each council’s performance and capacity to improve will be assessed, taking into account local circumstances, bringing together:
performance indicator data (on current performance and past trends);
Ofsted, the Social Services Inspectorate, the Benefit Fraud Inspectorate and other service based inspections and assessments together with audit reports; and
a corporate governance assessment of the authority as a whole, undertaken in dialogue with the authority and incorporating an element of peer review.
The result will be a balanced scorecard compiled by the Audit Commission with assistance from other inspectorates and bodies with an assessment role, and working with councils themselves. This will identify each council as either:
high-performing – near the top of the performance spectrum, with high performance in priority service areas, no poorly performing services and with proven capacity to improve;
striving – not necessarily at the top of the performance spectrum but with proven capacity to improve;
coasting – not at the top of the performance spectrum and with limited or no proven capacity to improve; or
poor-performing – consistently near the bottom of the performance spectrum and with limited or no proven capacity to improve.
The social services performance rating
Comprehensive performance assessment builds on the development of social services performance ratings, to be published for the first time in Spring 2002-03. The social services performance stars will provide judgements of performance for social services in a way that is understandable for the service users and the general public. Social services performance assessment brings together evidence from indicators, inspections and in-year monitoring. Each year, the Social Services Inspectorate meets with each council to review performance and identify key improvements for the year ahead. As well as the single star rating for overall social services performance, judgements on services for children and services for adults will be presented. Judgements will be made on the basis of current performance but will also include prospects for improvement. A range of freedoms will be available for the best performers. Three-star councils will have access to their share of the social services performance fund by right, for example. This approach will be extended to other grants and the Government is considering how, for those performing well, planning requirements could be reduced and a lighter touch inspection regime introduced. The social services performance ratings will feed into the comprehensive performance assessment for all local authority services.
I return to the position after the Local Government Act 2003. In the event Ealing achieved for 2004 scores of 3 on each of core service performance and council ability. Applying the matrix at paragraph 14 of the 2004 CPA Framework, it would, but for the application of Rule 2, be categorised as “good.” However, the Audit Commission has advised Ealing that because it received a zero star rating from the Commission for Social Care Inspection (“CSCI”) Rule 2 comes into play. This means that Ealing drops two categories, and is categorised as “weak”. Neither the matrix nor the Rules expressly refer to CSCI. However, paragraphs 108 to 112 of the 2004 CPA Framework explain the use made of CSCI’s judgments and star ratings:
Social Care – CPA Assessment 2004
The core service score for social care is included in the overall score for core service performance in CPA.
The social care assessment is the responsibility of the Commission for Social Care Inspection (CSCI) and is given as two separate judgements, for adults’ and for children’s services. In both cases, a judgement for both current performance and capacity for improvement is given. Only the current performance judgement will be used in the social care core service score for CPA. The categories for judging current performance (serving people well?) are no, some, most, and yes.
For the purposes of the CPA assessment framework the current performance judgement is converted into a score of 1 – 4 as follows:
CSCI current performance judgement      CPA social care
(serving people well?)                  core service score
No                                      1
Some                                    2
Most                                    3
Yes                                     4
The judgements for adults and children are treated separately in the assessment framework.
CSCI also produces overall star ratings, given as 0* to 3*, with 3* being the highest. The overall star ratings are not used in the CPA social care core service score but will be used to determine whether a council will be held back by a rule limiting the CPA category it can achieve (see paragraphs 29 – 30 in section one of this document for further details on rules).
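The passage above distinguishes two separate uses of CSCI’s conclusions: the “serving people well?” judgements feed the core service score, while the star rating operates only through the limiting rules. That separation might be sketched as follows (an illustration only, with hypothetical function names):

```python
# Illustrative sketch: the "serving people well?" judgements convert
# to CPA core service scores, while the separate 0*-3* star rating is
# used only to decide whether a limiting rule holds the council back.
JUDGEMENT_TO_SCORE = {"no": 1, "some": 2, "most": 3, "yes": 4}

def social_care_scores(adults, children):
    """Adults' and children's services are treated separately."""
    return JUDGEMENT_TO_SCORE[adults], JUDGEMENT_TO_SCORE[children]

def rule_2_cap_applies(star_rating):
    # A zero star rating fails Rule 2's requirement of at least 1
    # star, capping the overall CPA category at 'weak'.
    return star_rating < 1
```

On Ealing’s 2004 judgements (adults “some”, children “most”), the scores would be 2 and 3; it is the separate zero star rating that brings Rule 2 into play.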
The statutory functions of CSCI
CSCI is a statutory body established by section 42 of the Health and Social Care (Community Health and Standards) Act 2003. Chapter 5 of that Act deals with the functions of CSCI. By section 76 it has the general function of encouraging improvement in the provision of English local authority social services, and by section 79(1) in each financial year it must conduct a review of those services provided by each local authority in England. Section 79(2) provides that after conducting that review CSCI must award a performance rating to the authority in question.
I have been much assisted by the statement of David Behan, the Chief Inspector of CSCI. This explains that CSCI took over the functions of the National Care Standards Commission (in part), the Social Services Inspectorate (SSI) and the SSI/Audit Commission Joint Review Team. However, CSCI has a wider remit than its predecessor bodies. CSCI registers and inspects adult and older people’s care homes, home care services, nurses’ agencies, adult placement schemes, children’s homes, fostering and adoption agencies and residential family centres in England. CSCI inspects, reviews and assesses the performance of all English local authority social services and awards an annual star rating in accordance with section 79(2) of the Act. It has the statutory role of advising the government on any matter connected with social care services in England. In accordance with section 83(2) of the Act, “CSCI and the Audit Commission must co-operate with each other with respect to the exercise of their respective functions under section 82 [of the Act, this confers on CSCI functions as to the promoting or undertaking of studies] and sections 33 and 34 of the Audit Commission Act 1998 [which require the Audit Commission to undertake or promote comparative and other studies designed to enable it to make recommendations for improving economy, efficiency and effectiveness and for improving the financial or other management of bodies subject to audit, and to enable it to prepare reports on certain aspects of these matters].”
Relevant provisions of Chapter 5 came into force on 1 April 2004. Prior to this on 13 January 2004 the Department of Health Social Services Inspectorate (“SSI”) published a document entitled “SSI Performance Assessment 2004 Operating Policies.” I shall call this document the “CSCI Policies.” Relevant passages will be found in the Appendix to this judgment.
I draw particular attention to Annex 1 of the CSCI Policies. This comprises a table to be used in determining CSCI’s star ratings. It will be found at the end of the Appendix. In order to understand Annex 1, it is necessary to explain that each row and column identifies two judgments: the first is as to whether the authority is serving people well (no/some/most/yes), and the second is as to capacity for improvement (poor/uncertain/promising/excellent). It will be seen that an authority’s capacity for improvement has an important effect on the star rating. It will also be seen from paragraph 8.2 of the CSCI Policies that by giving a zero rating CSCI is able to make a council subject to special measures.
Ealing was informed of CSCI’s conclusions relevant to its social services CPA component by a faxed letter dated 15 November 2004 from Mr Mike Rourke, regional director of CSCI. This included the following:
“I am writing to inform you of the latest performance ratings for your Council’s Social Services. This rating constitutes the social services component of the comprehensive performance assessment for all local government services.
The judgments and rating for your council are as follows:
Services for Children: Serving people well: most. Capacity for improvement: uncertain.
Services for adults: Serving people well: some. Capacity for improvement: poor.
Star ratings: your social services performance rating is Zero star.
…
The final decisions on star ratings were made by the Chief Inspector. …”
The second sentence of Mr Rourke’s letter may have involved an over-simplification. The CSCI judgments on “Serving people well” were included in the determination of Ealing’s “Current Service Performance” scores. For reasons which I need not elaborate, CSCI judgments on “Capacity for improvement” did not feature in the “Council Ability” CPA scores.
Ealing’s star rating of zero notified by Mr Rourke on 15 November 2004 was consistent with the table set out at Annex 1 of the CSCI Policies.
Ealing’s complaint and the Audit Commission response
When Ealing realised that the effect of CSCI’s zero star rating under Rule 2 would be to prevent Ealing being categorised as higher than “weak” it took legal advice. This led to a letter from its solicitors to the Audit Commission dated 2 December 2004. Among other things, the letter described Rule 2 and stated:
“The CSCI rated the Council a “zero star” authority. The Council (whilst not accepting the justification of this rating) have decided not to challenge it for reasons which it is neither necessary nor relevant to set out here. Should the issue become material between the Council and the Commission, then those reasons can be elaborated upon at that stage. For the moment, it is sufficient to state that the Council consider it inappropriate to challenge that rating.
The effect of the rating, however, is the automatic – and indeed dramatic – relegation of the Council’s overall CPA categorisation from what would otherwise be a very high “Fair” (or even low “Good”) (depending on the outcome of the other pending service scores) to “Weak”, exclusively and directly in consequence of the application of a rule that no authority can be rated higher than “Weak” where it has received a zero star rating from CSCI. That is an effect which the Council is unable and unwilling to ignore, particularly given the consequences of categorisation as weak (which we do not need to spell out) and the work in which the Council has engaged over the last two years to advance its position from (originally) “Weak” to the edge of “Good”.
We consider the rule referred to above to be unlawful. Section 99 of the Local Government Act 2003 requires the Commission from time to time to produce a report on “its findings” in relation to the performance of English authorities. It is trite law that a body whose duty it is to exercise a statutory function cannot fetter itself by allowing an outside body to dictate to it, but must exercise the function itself (see e.g. Lavender & Sons v Minister for Housing and Local Government [1970] 1 WLR 1231, DC).
While we do not question the right of the Commission in making its “findings” under CPA to take full account of the views of and scores awarded by other inspectorates, the rule it has imposed is legally offensive because it binds the Commission (both in advance and subsequently), regardless of the circumstances or its own judgment, to categorise authorities as “Weak”, with all of the intended disadvantages and restrictions which such a classification imposes, purely because of the judgment of an outside body which is not the body charged by Parliament to perform the CPA function. It is in this way that the Commission has fettered its judgment and allowed itself to be dictated to by an outside body.”
The Audit Commission replied on 13 December 2004, stating (among other things):
“As you acknowledge, the CSCI as the statutory regulator for local authority social services has awarded Ealing a zero star rating this year. According to CSCI’s Operating Policies this is a serious matter which means that the Council is now subject to special measures, the terms of which I understand will be determined in due course. The level and form of engagement or intervention will be approved by relevant Ministers on a recommendation from CSCI. In our view it would be odd indeed for the Commission to award the Council a “fair” or “good” overall CPA rating when it has performed so very badly in its social services as to attract special measures from government. Such a result would tend to bring the system of performance assessment for local government into disrepute.
As a separate but equally compelling point, by demanding the withdrawal of the rule in question you appear to be going so far as to say that the zero rating awarded by CSCI should have no impact whatsoever on the Council’s overall CPA categorisation due to be announced this week. Once again, in the Commission’s view this would be quite wrong.
Your contentions about delegation and fettering miss the point. In particular, you appear to misunderstand the way in which the Commission discharges its obligations pursuant to Section 99 of the Local Government Act 2003. I quite accept that the Act requires the Commission to make findings in relation to the performance of English local authorities. It is, however, a fallacy to imply any requirement for separate evaluation by the Commission of matters that are the subject of a comprehensive scheme of complementary regulation.
CSCI is required to award an annual performance rating to local authority social services, pursuant to Section 79 of the Health and Social Care (Community Health and Standards) Act 2003. That provision requires CSCI to exercise its functions according to criteria it has devised which are approved by the Secretary of State for Health. CSCI has developed a comprehensive methodology, set of procedures and criteria which have been approved by the Secretary of State for Health, and which are all transparently set out in its Operating Policies. CSCI was fully cognisant of the role that the star rating system would play in CPA, and we believe has devised its system appropriately. Assuming, of course, that it is not suggested that CSCI has performed its functions unlawfully, the Commission may take into account the CSCI rating in its CPA categorisation and attach significance to that rating in the way that it has. In the result the Commission has found that your client’s social services have performed so badly as to attract a zero star rating this year from the statutory regulator, CSCI. This is as much a finding for the purpose of section 99 as any other, and nothing in the rest of the Commission’s evaluation of the Council for the purposes of CPA tends to cast doubt on it.
It is of course, the Commission, not CSCI, which has evaluated and determined the role which CSCI’s star ratings should play in the matrix of scores and judgments which go to make up a local authority’s overall CPA category. In this way it is the Commission, not CSCI, which discharges the duty to categorise each local authority according to how it has performed pursuant to Section 99(2) of the Act. There is no question of CSCI being able to dictate a result to the Commission.
As regards the rationale for the rule in question, the Commission consulted on the role of other inspectorates’ judgments in its CPA framework. It reached the considered view that the rule served an important purpose, in that it gave an incentive to authorities to take corporate responsibility for and ownership of failure in key service areas that serve the most vulnerable client groups.
You have plainly indicated in your letter that you do not seek to challenge the CSCI rating. In those circumstances we cannot see how you can complain if the Commission regards that unchallenged CSCI rating as a relevant factor to take into consideration in the manner that it has. In any event, it is difficult to understand how you can base your argument on a criticism of CSCI’s methodology if in fact you accept its conclusion.”
Ealing having been notified accordingly that it was categorised as “weak”, these proceedings were issued as a matter of urgency. I have been told that an order giving effect to the categorisation is proposed to be laid before Parliament on 25 February.
Submissions of the parties
Mr Arden QC on behalf of Ealing said that the Audit Commission intended to subordinate their decision to the categorisation by CSCI. Indeed, in their Skeleton Argument they had gone to the extreme of saying that it would be irrational to do otherwise. He explained that Ealing primarily made what he described as a “dictation” challenge, rather than an “inflexibility” challenge. Ealing accepted that there could be rules for the CPA, but they had to be the Audit Commission’s rules. “Dictation” in this sense did not prevent having regard to the decision of another body, but was objectionable because it went beyond this. By saying, “our decision will be bound by the outcome of theirs”, the Audit Commission had subordinated their own decision making function and fettered themselves by dictation.
The Audit Commission’s categorisation function was described by Mr Arden as raising constitutional issues in the sense that it dominated all aspects of local government. There were of course political implications. As to the strictly legal implications, categorisation of authorities affects the amount of external inspection to which they will be subject, and is used in order to determine the extent to which an authority is trammelled in its power to trade and the extent to which it is required to draw up a range of plans. Under section 99(5) of the 2003 Act the categorisation is to be embodied in an order which the Secretary of State is to lay before Parliament; the order may depart from the Audit Commission’s categorisation only in order to correct a clerical or typographical error notified by the Audit Commission. Mr Arden suggested that it was therefore critical to the parliamentary intention to maintain the integrity and independence of the categorisation exercise. This meant that the statutory language should be construed cautiously, without the implication of additional powers – or freedom – on the part of the Audit Commission that Parliament cannot be said necessarily to have intended to confer on it.
The CPA exercise for 2004 could, Mr Arden submitted, be analysed in two stages. The process of determining the scores – in Ealing’s case, 3 out of 4 on each of current service provision and the corporate assessment – amounted to the making of findings for the purpose of Section 99(1). The conversion of scores into a category was the performance of the duty under Section 99(2). Accordingly, both the matrix and the rules were not to be regarded as findings within Section 99(1).
As regards “findings” in this sense, Ealing’s statement of facts and detailed grounds for review at paragraph 50 said that as CSCI was a body with specialised skills, the Audit Commission – in this sense “a generalist body” – might treat their conclusions against certain criteria as “findings” in its own hands. Put another way, those conclusions were based on an objective process the validity of which the Audit Commission could assess for itself and which it might decide not to treat as findings in an exceptional case, e.g. if convinced by the authority that CSCI had erred. In oral submissions Mr Arden added that the use of the matrix without any opportunity for the exercise of discretion to depart from it was lawful; otherwise there might be eighty applications for reconsideration. (I shall return to this concession later.)
In response to a question from me Mr Arden said that he did not pursue any challenge to the fairness or proportionality of the Rules. Ealing did not have to deal with the question of what it might say if Rule 2 had come into operation on the basis of a star rating devised by the Audit Commission itself – it was enough for the purposes of the present challenge that the Audit Commission had subordinated its own decision to that of CSCI.
Returning to the suggested distinction between “findings” and “categorisation”, Mr Arden said that the words “in particular” found in Section 99(2) might refer to the structure of the report, rather than the findings. On the facts in the present case, the CSCI star ratings used in Rule 2 were very different from the other CSCI conclusions which were used in order to determine the scores, and in particular those used under the “current service performance” part of the CPA framework. The star rating involved a discretion and value judgment. Mr Arden pointed to a number of statements suggesting that the ultimate discretion in this regard lay with the Chief Inspector. Mr Arden acknowledged that Mr Behan’s statement denied this, but on any view, submitted Mr Arden, the process involved a strong element of value judgment.
In support of the distinction between a “finding” and “categorisation”, Mr Arden relied on dictionary definitions. These suggested that “findings” were conclusions based on “evidence” or “information” (“a conclusion reached as a result of an inquiry, investigation or trial”). By contrast “to categorise” is not to draw a conclusion from evidence or information but “to place in a particular category; classify”. How one categorised involved choices; this was where values came in. The choices were subjective in the sense that reasonable persons might take different views. Thus Ealing’s position was that the Audit Commission could adopt the “findings” of CSCI because they were objective findings, but the CSCI star rating was different in that it was a value judgment. Ealing did not say that CSCI could not assist the Audit Commission: the latter could use the product of CSCI’s investigative work in the same way as it could use the product of its own employees’ investigative work. Anticipating the Audit Commission’s arguments about the importance of the CSCI star ratings, Mr Arden submitted there was nothing irrational in an authority being categorised overall as “good” even though one department was poor. Ealing’s stance was that if the Audit Commission wished to have regard to CSCI’s star rating, then it had to identify a legitimate use for that star rating in the categorisation process, additional to the use already made in the CPA Framework of the findings on which it was based. The problem with the current approach was inflexibility. If the Audit Commission had simply said that they reserved the right to consider holding back councils from a categorisation higher than “weak” if CSCI awarded no star, that could properly be part of the methodology for determining categorisation.
What the Audit Commission had done amounted to an absolute rule that if the Audit Commission reached one conclusion and CSCI reached another, then the Audit Commission would subordinate its conclusion to that of CSCI.
Turning to legal principle, Mr Arden relied on cases about the fettering of discretion. In Lavender v. Minister of Housing and Local Government [1970] 1 W.L.R. 1231, DC, Willis J said (at 1240-1241):
“…[the Minister] has said in language which admits of no doubt that his decision to refuse permission was solely in pursuance of a policy not to permit minerals…to be worked unless the Minister of Agriculture was not opposed to their working… It seems to me that by adopting and applying his stated policy he has in effect inhibited himself from exercising a proper discretion (which would of course be guided by policy considerations) in any case where the Minister of Agriculture has made and maintained an objection… Everything else might point to the desirability of granting permission, but by applying and acting on his stated policy, I think the Minister has fettered himself in such a way that in this case it was not he who made the decision for which Parliament has made him responsible. It was the decision of the Minister of Agriculture not to waive his objection which was decisive in this case, and while that might properly prove to be the decisive factor for the Minister when taking into account all material considerations, it seems to me quite wrong for a policy to be applied which in reality eliminates all the material considerations save only the consideration, when that is the case, that the Minister of Agriculture objects. This means, as I think, that the Minister has, by his stated policy, delegated to the Minister of Agriculture the effective decision…where the latter objects…”
Applying this passage, because the CSCI star-rating overrides what would otherwise have been the Audit Commission’s categorisation, it “eliminates” the considerations that led to that categorisation, i.e. it prevents them having any effect. Wade & Forsyth, Administrative Law, 9th edn, chapter 10 at p.326, describes the case as one of putting “the decisive power into the hands of the wrong minister”. The principle has been applied in other cases, including Gardner v. London, Chatham and Dover Railway (1867) L.R. 2 Ch. App. 201 and Parker v. Camden L.B.C. [1986] 1 Ch 162, CA, in which the courts have held that where Parliament has expressly conferred powers and imposed duties and responsibilities on a particular body, it would be improper for the court to assume those powers and duties itself, by means of the appointment by the court of a manager (who would be an agent of the court). Citing Gardner, Sir John Donaldson M.R. in Parker, at 173/B-D, explained:
“Cairns L.J. ruled...that when Parliament expressly confers powers and imposes duties and responsibilities of an important kind upon a particular body, it is, as he put it, improper for the court by the appointment of a manager...itself to assume those powers and duties. That ruling clearly could be looked at again and, if necessary, overruled by the House of Lords, but its reasoning does not depend upon pre-1873 Chancery practices but on a clear view that parliamentary intentions so expressed should be respected.”
Also relevant was Wade & Forsyth, chapter 10, “Retention of Discretion”, which, under the heading “Surrender, Abdication, Dictation” and the sub-heading “Power in the wrong hands”, records (at p.322):
“Closely akin to delegation, and scarcely distinguishable from it in some cases, is any arrangement by which a power is in substance exercised by another. The proper authority may share its power with someone else, or may allow someone else to dictate to it by declining to act without their consent or by submitting to their wishes or instructions. The effect then is that the discretion conferred by Parliament is exercised, at least in part, by the wrong authority, and the resulting decision is ultra vires and void. So strict are the courts in applying this principle that they condemn some administrative arrangements which must seem quite natural and proper to those who make them. ...”
Mr Arden concluded this part of his submissions by saying that unless the purpose was to bring the Audit Commission’s categorisation into line with the CSCI star rating, which would be an act of fettering (or subordination), there was no conceivable purpose served by (on the one hand) taking the CSCI scores into the Audit Commission’s findings, and categorising in accordance with those findings, and (on the other hand) “capping” its own categorisation by reference to CSCI’s star rating.
This led into an allied argument that the Audit Commission had misconstrued the statutory purpose. Its aim had been to achieve consistency between its categorisation and that of CSCI – as was evidenced by its letter of 13th December 2004, asserting that any other result would tend to bring the system of performance assessment for local government into disrepute. However the Audit Commission had no brief to uphold “the system of performance assessment for local government” as a whole, only to discharge its statutory duty to produce its findings and categorise accordingly. The Audit Commission could have been charged with bringing together the performance assessment results of all the inspectorates, or with reporting on or categorising local authorities generally without limiting the exercise to its findings. It was, however, not absurd for the Audit Commission to be charged to make findings about and/or categorise local authorities as a whole, i.e. from an overall perspective (“holistic”), while other bodies are charged to deal with specific functions; to the contrary, this would seem to serve a more obvious purpose than statutorily to require authorities to be assessed and categorised as a whole, while allowing the result to be dominated by one or two service functions (assessed and categorised under different legislation). Put another way, it was more odd to set up two statutory regimes - one for an authority as a whole, the other for a specific function - and allow either one to dominate the other, than for an overall assessment to be different from the assessment of a specific function: those to whom authorities are accountable - their local taxpayers and service users - are not so simplistic that they cannot cope with such a straightforward difference.
Mr Gordon QC on behalf of the Audit Commission began by drawing attention to paragraph 3.18 of the White Paper as showing that it had always been the intention that the Audit Commission would draw together assessments of specialised inspectorates. He submitted that in enacting Section 99 of the Local Government Act 2003 it was not Parliament’s intention to change the essential nature of CPA. In any event, the Audit Commission had made its own judgment that local authorities’ functions covered by the CSCI star rating were so important that poor performance should result in a local authority being given a specific categorisation for a given year. Parliament had not envisaged that the Audit Commission would itself conduct reviews in areas where specialist bodies had a role: the Audit Commission’s role was to categorise on the basis of all the information before it.
As a matter of legal principle, Mr Gordon said that Ealing had confused exercises of discretionary power on the one hand and a duty to reach a judgment on the other. Its argument led to no clear conclusion as to what the Audit Commission could or could not do. It was inconsistent to say that the Audit Commission could accept the judgment of CSCI on current performance, but not the star rating: judgments on current performance included value judgments.
Mr Gordon distinguished the Lavender case on the footing that the Audit Commission was not exercising a discretion of the kind involved in that case. Its approach was that the local authority had every opportunity to raise matters with the specialist inspectorate, and there was therefore no need to have an opportunity to raise them with the Audit Commission. If having gone through that process the local authority complained that there had been improper or unlawful action by the specialist body, then (although this was not explicitly stated in the 2004 CPA Framework) the Audit Commission would think again.
I expressed a concern that Mr Arden’s concession may have gone too far. I could see no problem with the Audit Commission adopting the finding of another body, provided there was an opportunity for the local authority to draw attention to factors which might make it inappropriate to adopt that finding in a particular case. Mr Gordon referred me to the well known passage from the speech of Lord Reid in British Oxygen v Board of Trade [1971] AC 610 at page 625D to F: there is no objection to a rule if one listens to someone who has something new to say. Ealing had nothing new to say; it did not seek to challenge the CSCI rating, and the Audit Commission had already considered whether Rule 2 should stand in its current form.
In support of his proposition that this case involved judgment rather than discretion, Mr Gordon referred to an article by Lord Bingham “Should public law remedies be discretionary?” [1991] Public Law 64. At page 67 the article quoted from an earlier lecture by Lord Bingham as follows:
“…an issue falls within a judge’s discretion if, being governed by no rule of law, its resolution depends on the individual judge’s assessment (within such boundaries as have been laid down) of what it is fair and just to do in the particular case. He has no discretion in making his findings of fact. He has no discretion in rulings on the law. But when, having made any necessary finding of fact and any necessary ruling of law, he has to choose between different courses of action, orders, penalties or remedies he then exercises a discretion. It is only when he reaches the stage of asking himself what is the fair and just thing to do or order in the instant case that he embarks on the exercise of a discretion”.
As to Ealing’s complaint of misconstruction, Mr Gordon submitted that Ealing’s approach misunderstood the statutory scheme. The intention was that there should be complementary regimes, there was nothing to constrain the Audit Commission in attaching weight to CSCI findings, and the White Paper at paragraphs 3.18 and 3.19 showed how the CPA worked prior to the Local Government Act 2003.
Mr Gordon continued that the Audit Commission had to make judgments as to how to deal with information. Where an authority was really failing on core services, it could make a rule. The Commission had treated both the judgments of the CSCI on current service performance, and the star rating, as final, and under section 99 Ealing could not take one and not the other – it was “all or nothing”.
Mr Gordon relied upon a quotation from the judgment of Laws J in R v Secretary of State for the Home Department ex parte Hepworth [1998] COD 146, as set out in Fordham, Judicial Review Handbook 4th Edition 2004 para 50.4.8:
“As regards the question whether there is an unlawful fetter of discretion, I cannot think that a clear system for incentives within the prison can sensibly be expected to operate if its administrators have to consider whether in any individual case the scheme’s established criteria ought to be disapplied, or if this court were to hold such criteria legally bad in the first place on the ground that there should be room for discretion in individual cases. There is no principle of administrative law which says, in a milieu such as this, that there cannot be black-and-white rules.”
Mr Gordon stressed the degree of concern felt by the Audit Commission in relation to failing authorities. A zero star rating from CSCI indicated something very serious. Miss Killian explained in paragraph 34 of her witness statement that CSCI had assessed that Ealing’s prospects of improvement had declined from “uncertain” to “poor” for adults and from “promising” to “uncertain” for children. Only one other council in the country had merited such a low prospects rating for adults. Only five other authorities in the country had dropped both of their prospects ratings and two of them had dropped a star rating accordingly. Miss Killian described Ealing’s social services performance as “seriously poor” by reference to CSCI’s improvement recommendations.
After an overnight adjournment Mr Gordon put forward a tripartite classification. A first class concerned representations about the proposed framework, or, after the framework had been issued, representations to the effect that there should not be a rule. These were governed by British Oxygen. A second class concerned representations about the application of an existing rule in an individual case. If the very basis of the rule, as here, was that it covered all cases, then such a challenge was a challenge to the rule. The only basis for such a challenge could be irrationality. A third class would be a representation about the basis of the CSCI star rating. Ealing was not shut out from making such representations to CSCI. The Audit Commission however as a general body did not have the expertise, capacity or statutory function of going into the basis for each of the conclusions of the specialist body.
In response to a question from me about Lord Bingham’s analysis, Mr Gordon said that the Audit Commission had a choice as to how to perform its duty, and that the matter should be regulated by reference to rationality. I drew Mr Gordon’s attention to the decision of the Divisional Court in R v Secretary of State for the Environment ex parte North Tyneside Borough Council [1990] COD 195. This was a case where the Local Government, Planning and Land Act 1980 required the Secretary of State to assess Grant Related Expenditure (“GRE”) for local authorities. The statute required that the Secretary of State determine GRE in accordance with principles to be applied to all local authorities, and that those principles be specified in the rate support grant report. The rate support grant report in question required that the Secretary of State use population figures certified by the Registrar General, and it was common ground between the parties that this was one of the principles applicable to all local authorities under the statute. North Tyneside complained that the certified figures were inaccurate. The Divisional Court held that to apply a principle that numbers should be certified by the national authority responsible for population estimates furthered the objects of the Act, and rejected a challenge by North Tyneside. Mr Gordon accepted that the present case did not involve a statutory requirement to proceed by reference to principles applicable to all authorities.
In reply, Mr Arden said there had been no concession by the Audit Commission that authorities could put in submissions as to why Rule 2 should not, on this occasion, apply to them. Had that been available, Ealing would have wished to do so. As to bringing together the results from all the inspectorates, that was not what section 99 said. Moreover, Rule 2 did not bring together the results of all the inspectorates – rather it played them off against each other, or against the result which had been based on bringing them together. The White Paper had said nothing about rules. There was nothing to suggest that Parliament was aware of the rules when the Local Government Act 2003 was enacted. Mr Arden adhered to the distinction previously drawn between findings and value judgments – the purpose of the findings was to get as close as possible to what may be called the facts. The suggestion by the Audit Commission that one could distinguish a judgment as something which could involve a rule, and discretion which could not be fettered, was unfounded. Mr Arden said that the passage taken from the judgment of Laws J in Hepworth was in a completely different context – it had nothing to do with abdicating responsibility.
At the end of last week I received from Mr Gordon written submissions on the North Tyneside case, and shortly afterwards written submissions in reply from Mr Arden. I need only set out here the points that were made by Mr Gordon. The reasoning of the Divisional Court could, he submitted, be applied to the present case. First, the Audit Commission has, properly, formulated principles (which must apply consistently and without discrimination to local authorities) as to the basis on which it will categorise local authorities. The absence of an express statutory provision stating that the Audit Commission must formulate principles that apply to all authorities makes no difference in this regard since it is self-evident that the Audit Commission as a public body must carry out its statutory duty of categorisation by applying transparent principles in a fair and consistent manner.
Second, there was no unlawful fettering of discretion or inflexible adherence to the principle by refusing to depart from the Registrar General’s figures because the Secretary of State considered the representations made and concluded that he should adhere to the principle. The exact nature of the representations made and the Secretary of State’s response is unclear from the judgment, but the response quoted in the judgment strongly indicates that the Secretary of State’s response was that the principle prevented him from looking into the underlying basis for the Registrar General’s estimate. The Audit Commission had taken representations on its CPA into account in precisely the same way, and had responded to Ealing by pointing to the existence of the rule which prevents the Audit Commission from examining the underlying CSCI calculations at the categorisation stage.
Third, there was no failure to exercise a discretion to depart from the policy because the Secretary of State had acted in accordance with the principles he had set down for calculating grants.
Analysis
The crucial feature of the Audit Commission’s approach is that because CSCI’s star rating for Ealing brought Rule 2 into play, the Audit Commission downgraded Ealing and in determining to do so refused to apply its own mind to the reasons why CSCI gave that zero star rating to Ealing and whether those reasons warranted such a dramatic downgrading. The operation of the Rule was automatic. I have no doubt that the Audit Commission gave careful thought to the question whether it should adopt such an approach. Whether it was a lawful approach depends upon the true construction of s 99 of the Local Government Act 2003.
Where a statute permits a body to make a finding, judgment or discretionary decision, ordinary principles of statutory construction suggest that in the normal course this must be done by the body itself applying its own mind to determinative questions. The same is true where a statute requires a body to make a finding or judgment, or to reach a decision which contains an element of discretion. Parliament has chosen the particular body to carry out the function in question. It is to be inferred that Parliament wished that body to make up its own mind on the point. In this way it will ordinarily be unlawful for the body to refuse to apply its own mind to such questions. Sometimes, as in the Lavender case, the statutory purpose will be infringed because the body has decided on a policy of doing what X wants. It seems to me that there is no difference in principle if the body decides on a policy of altering what it would otherwise do whenever X has reached a particular conclusion. The vitiating element in each case is that the body has refused to apply its own mind to a determinative question.
I reject the submission that the principle of the Lavender case is confined to the exercise of discretion. In the ordinary case the principle I have identified above will apply without distinction between the conferral of a discretionary power and the imposition of a duty to make a judgment, for in both instances the statutory intention will be that the body should make up its own mind. It has to be willing to consider for itself whether, having regard to relevant circumstances, the matters which led X to a particular conclusion should lead it to exercise its functions in a particular way. I do not read the observations of Lord Bingham as casting any doubt upon such a proposition.
It is for consideration as a matter of statutory interpretation whether Parliament intends something different from what would normally be expected. An example of such a case is North Tyneside. The statutory framework required principles to be determined and approved by resolution of the House of Commons. Thereafter the Secretary of State was required to apply the principles. Thus the determination of the principles was in many ways akin to a legislative function. This was part of a process for determining precise figures (rather than categorisation) in the context of complex and detailed statutory provisions concerning local government finance. The statutory context in North Tyneside is very different from the present case.
The context of the Hepworth case is also very different. Laws J was there dealing with a scheme for incentives. The scheme did not operate to disadvantage prisoners, simply to confer on them advantages which they would not otherwise have had.
Has Parliament intended something different from the norm in s 99(1) and (2) of the Local Government Act 2003? That section bears little resemblance to the complex structure which was enacted for local government finance. At the time of the White Paper the Audit Commission did not have a policy of automatically altering what it would otherwise do whenever a specialist inspectorate had reached a particular conclusion. I have not been shown anything in the White Paper to indicate that such a policy was envisaged.
Paragraph 50 of the Grounds for Review does not amount to a concession that the Audit Commission can lawfully decide that whenever a specialist body makes a finding it will adopt it, and thereafter refuse to hear representations about that finding and its relevance to categorisation. The last two sentences of that paragraph make it clear that no such concession was intended. While Mr Arden appeared to make such a concession orally, I do not accept that such an approach would be permissible in a case where the representations might lead the authority not to be downgraded.
It does not follow that the Audit Commission’s function of making findings and classifying into categories becomes unworkable. The law does not inhibit the Audit Commission from adopting the findings of specialist bodies in cases where local authorities make no representations to the contrary. Thus there is no requirement to take an “all or nothing” approach. In the vast majority of cases any concerns about the conclusions of specialist bodies will not be determinative of the Audit Commission’s own findings nor of categorisation, and in such cases any representations can be given short shrift. What the law requires is that in a case where a finding may be determinative the Audit Commission must be prepared to apply its own mind to the matter. This does not mean it has to start from scratch or develop its own expertise. It can require any authority that is minded to object to provide written representations within a matter of days explaining the discussion it has had with the specialist body on the point and showing good reason why the specialist body’s conclusion should not be adopted, or if adopted should not lead to downgrading of the finding or categorisation which the Audit Commission would otherwise make. On receipt of any such representation the Audit Commission should be able to decide speedily whether it has merit or not.
As to the points made at paragraph 34 of Miss Killian’s statement, these are points that might be highly relevant if the Audit Commission were prepared to hear representations from Ealing rather than automatically downgrade in response to CSCI’s star rating, and no doubt Ealing would have something to say about them. As it was, they simply do not arise so far as the present challenge is concerned.
Conclusion
In the light of my analysis of the arguments and evidence, I conclude that section 99 does not depart from the norm. It does not permit the Audit Commission to adopt a rule automatically downgrading an authority which received a zero rating from CSCI, and entailing a refusal to apply its own mind to the reasons why CSCI gave that star rating and whether those reasons warranted downgrading. This means that the approach taken by the Audit Commission in the present case was unlawful. I will hear counsel as to the appropriate relief.
Before leaving this judgment, there are some matters I wish to stress:
This claim for judicial review succeeds because the Audit Commission’s downgrading of Ealing was an automatic consequence of someone else’s decision. In this case the someone else was CSCI, a body with specialist expertise in its area, but it was unlawful for the Audit Commission to refuse to consider whether there were circumstances which made an automatic downgrading inappropriate.
Nothing in this judgment inhibits the Audit Commission from adopting the findings of specialist bodies where local authorities do not object. It is entitled to limit objections to those which would be determinative of the Audit Commission’s own findings or categorisation. The Audit Commission can set short time limits for any such objection and need only consider whether there is good reason why the specialist body’s conclusion should not be adopted, or if adopted should not lead to downgrading of the finding or categorisation which the Audit Commission would otherwise make.
Ealing confined its challenge to points which it considered were obviously sound and could be dealt with speedily. That was commendable. I have not been asked to consider any argument that Rule 2 was unreasonable or disproportionate or prevented authorities from making representations which they were entitled to make under the rules of natural justice, nor any argument that the Audit Commission misdirected itself in law either in the way it had regard to capacity for improvement or in adhering to Rule 2 in order to give “an incentive to authorities to take corporate responsibility for and ownership of failure in key service areas that serve the most vulnerable client groups.” I would not wish it to be thought that I had considered and rejected such arguments.
Appendix to Judgment of Walker J in R (Ealing LBC) v Audit Commission
Extracts from the CSCI Policies:
INTRODUCTION
The following operating policies are intended to guide Social Services Inspectorate (SSI) Inspectors/Commission for Social Care Inspection (CSCI) business relationship managers and other staff through the annual cycle of assessment of councils with social services responsibilities, leading to the judgements that underpin published performance (star) ratings. They have been agreed between SSI and the shadow CSCI for the 2004 performance assessment cycle.
THE PERFORMANCE ASSESSMENT CYCLE 2004
OVERVIEW
Purpose
The purposes of performance assessment are to:
• promote improvement in the quality of care to service users;
• support effective performance management of social services in local councils;
• provide annual independent judgements of the performance of local councils with social services responsibilities;
• establish what action each council needs to take to improve the quality of their social services;
• provide information to service users and the general public about the performance of their local council in providing social services;
• assess progress in implementing the Government’s policies for social care.
Process
The assessment process comprises two main parts. Firstly, a continuous element, co-ordinated by the SSI inspector/CSCI business relationship manager who links with a small number of councils with social services responsibilities. This entails the assessment of agreed performance evidence to form an overall picture of performance over time, covering both qualitative and quantitative aspects of performance, and forms the basis of dialogue with the council about improvement. It includes an annual review. The second part comprises a performance review report and a published judgement about the standard of service over the preceding year, and the capacity for improvement in the coming year.
Summary
SSI/CSCI social care ratings inform each council’s comprehensive performance assessment (CPA) rating and their improvement planning for the council as a whole and for their social care provision. By the time of the CPA ratings for 2004, SSI/CSCI aims to have:
• agreed with each council their priorities for improvement leading to the production of delivery and improvement statements (DIS) by June;
• ensured that the SSI link inspector/CSCI business relationship manager has reassessed the evidence and that regional SSI/CSCI teams have validated the approach to improvement they are pursuing with the council;
• shared the evidence set with senior council managers and agreed it factually;
• met with senior council figures to review the latest performance position against improvement plans and new evidence;
• written up a performance review report and checked its factual accuracy with councils;
• refreshed records of assessment as necessary with each council, taking into account the 2003-04 performance indicators (PIs);
• made rating judgements, and carried out consistency checks.
…
THE PERFORMANCE ASSESSMENT CYCLE
The Overall Performance Assessment Cycle
The personal social services (PSS) performance assessment process has developed to fit with the timing and improvement emphasis of the CPA process. The internal analysis and assessment process, annual review meeting and performance reports are being developed and strengthened by a process of systematic and continuous contact between inspectors/business relationship managers and social services managers.
The “new year” begins in late 2003 with discussion between SSI link inspectors and council managers about the priorities for local improvement in the coming 12 months, stemming from the CPA process. It leads to the production of a set of (proportional) DISs, which are tracked (again, proportionally) during the year: through the judgement of evidence to the publication of star ratings in the autumn, prior to the CPA ratings for the council overall.
Sequence and timetable: December 2003 – December 2004
• following the publication of the PSS ratings and CPA in November 2003, SSI engages with social services managers from December 03 to agree priorities and strategies for improvement, linked to the corporate agreement on the inspection and audit plan for each council;
• the spring DIS is completed by the council to set out local improvements for 2004/05, and evaluated during June–July;
• in early June CSCI produces evidence maps for each council, collating and analysing the admissible performance evidence against a set of standards;
• in the period June – August 04, CSCI carries out a review of performance in 2003/04, working with key partners. Drawing on inspection results, the council’s DIS and provisional statistical returns to the Department of Health (DH), CSCI meets formally with the council’s managers and other senior figures at a review meeting to confirm an understanding of past performance and to discuss the council’s stated improvements;
• a report is sent to the council (by the end of August 04) with a profile of the previous year’s performance, and highlighting the improvement priorities for the remainder of the current year. CSCI forms provisional judgements for the overall assessment ratings;
• provisional judgements are reached, and star ratings internally validated, taking account of new evidence (and meeting with councils to follow up as necessary). This includes the final set of PI data for 2003/04. Finally, ratings are determined by the chief inspector in October, prior to publication and feeding into the CPA;
• following publication of CPA and PSS overall assessment ratings, improvement-planning discussions continue in the new round and feed councils’ internal processes.
THE ASSESSMENT PROCESS - MAIN COMPONENTS
Councils’ Improvement Plans and Social Care Delivery and Improvement Statements
SSI/CSCI policy is to make judgements against nationally applied standards and criteria for performance assessment, covering both current performance and capacity for improvement (see below for details); and to ensure that the assessment of social services dovetails with the CPA. At local level, this process takes into account, and begins with, the agreement of overall council improvement priorities through discussion between external Inspectorates and each council on an annual basis. These improvements are then incorporated into the council’s own local plans and strategies and are published in summary in the best value performance plan in June.
…
PERFORMANCE (STAR) RATINGS
The ratings summarise the CSCI’s independent judgements of performance across all social services, on a scale of zero to three stars. Supporting this, separate judgements for services for children and services for adults are also given. The ratings are made individually against published standards and criteria (ie – they are not allocated against a pre-determined quota of performance bands).
Informing the Public
The ratings are intended to improve public information about the current performance of services, and the capacity for improvement at local, regional, and national levels. Social services have wide responsibilities for the care and support of families in difficulty, and the protection of children at risk of harm: for helping older people to live as independently as possible, and for supporting people with disabilities. People have a right to know how well their councils are performing in meeting these responsibilities, whether they are receiving such services themselves, have a family member receiving such services, or are a council tax payer. Central government needs to know how well each council is meeting the aims and objectives for improvement it has set for social services.
Relationship with Comprehensive Performance Assessment
The ratings contribute the most recent social care evidence available to the CPA of local councils, led by the Audit Commission. This is part of a wider strategy to align the programmes of public service inspectorates, and has led to a better co-ordinated assessment process.
What the Ratings mean for Councils
The ratings provide an objective starting point for planning, carrying out and reviewing improvements to services. This is important for all councils, whether their performance is good or poor. The best performing councils have an increased level of freedom in the way they use centrally provided grant funds. They also have a proportionate programme of inspection and monitoring, and reduced requirements for planning and monitoring information.
How the Ratings are Presented
As well as the overall star, judgements for children and adults services are given, and these carry equal weighting. In both cases, a judgement for both current performance and capacity for improvement is also shown. The categories for judging current performance (serving people well?) are – no, some, most and yes. The categories for judging capacity for improvement are - poor, uncertain, promising, and excellent. Current performance is weighted more heavily than capacity for improvement.
This results in a total of four judgements underpinning the overall rating, … Once the judgements have been reached, a set of rules is used to combine them with the weightings to produce a final star rating. …
…
How the Ratings are Produced
Performance ratings are a product of a wider performance assessment process bringing SSI/CSCI and the councils into continuous contact throughout the year. Assessment includes evidence from inspections and joint reviews, monitoring, performance indicators and other admissible evidence, to form an overall picture of performance over time on both qualitative and quantitative aspects of performance. The assessment culminates in an annual review meeting with each council, which focuses on the high-level challenges to the council in improving the quality of care to service users. Following the annual review, provisional judgements of performance are formed and then subjected to a series of consistency checks before final determination is made by the chief inspector of CSCI.
The ratings are produced annually for all councils with social services responsibilities, and are linked to CPA.
Ratings are built up from judgements against standards and criteria, placed on a summary matrix and then translated into a star rating. … Details of the rules for translating judgements to star ratings are at Annex 1.
CONSISTENCY AND VALIDATION OF PERFORMANCE RATINGS
Internal Procedures
In order to achieve fair and consistent judgements of performance, CSCI carries out a series of checks on the provisional rating judgements reached after annual review meetings have been completed. There are several stages:
• CSCI business relationship managers, inspectors and directors in regional teams review provisional ratings at regional level to check that the evidence against the standards and criteria supports the provisional judgements reached;
• provisional judgements are entered onto CSCI’s evidence database, performance assessment data & information (PADI). An analysis of the main evidence components is carried out by a central team, to check the range of judgements from inspections, joint reviews, PIs and monitoring for consistency; average band scores for the PIs for adults and children are calculated for comparative purposes;
• any judgements that appear as inconsistent with other evidence are then reviewed by the regional director and business relationship manager: where appropriate, judgements are amended;
• CSCI directors in the regions are paired to scrutinise a selection of provisional judgements: this may lead to further amendments;
• provisional judgements are then re-charted prior to a final validation meeting. This meeting leads to a set of recommendations for the chief inspector, who takes the final decisions on ratings, independently of ministers.
…
In arriving at his final determination of judgements the chief inspector will take a holistic view of all the admissible evidence. The key and other PIs are taken into account within the rounded view of all the admissible evidence available to him.
…
STANDARDS AND CRITERIA
The standards and criteria in use for 2004 are a development of the version used by the SSI in 2003. They provide a guiding framework within which SSI/CSCI judges the overall performance of a council in delivering its social care functions.
SSI/CSCI standards are high-level statements describing overall performance for all service user groups that characterises the delivery of “good” social care. They:
• describe the quality of service expected;
• reflect the Government’s strategic priorities for modernisation;
• are derived from legislation and regulation, government policy and guidance (including national objectives and national service frameworks), and understandings of good practice; and
• are structured within the domains of the performance assessment framework; and elaborated by more detailed statements, referred to as criteria.
The six overall standards in the model reflect the existing five domains used in the performance assessment framework (PAF), (and for best value) plus a standard describing capacity for improvement or the potential to sustain good performance. The term “capacity for improvement” has been adopted since 2003 for the sixth standard, and this focuses on the effectiveness of the social services infrastructure and the council’s partnership with others. This replaces the former “prospects” category used in 2002.
Criteria
Criteria are given to define each standard in more detail, and are intended to apply generically to all user groups and services. Each criterion is graded with descriptors designed to help refine judgements across the four bands. This is consistent with the scaling applied to the overall judgements of performance, prior to the calculation of star ratings. SSI link inspectors/CSCI business relationship managers link performance with the standards and criteria in order to demonstrate the basis for rating judgements.
…
The overall standards and criteria describe – except where it would be inappropriate to do so – four levels of performance to assist in this. But judgements will not be arrived at by a numerical summation or formal weighting. The essence of the approach is that it includes an element of professional judgement. The important thing is that we arrive at judgements between councils and across the regions which are consistent with each other, so that councils are treated equitably. The consistency checking process is crucial to this.
…
FAILING PERFORMANCE
Note: These policies reflect current practice, but will be subject to change as both DH and Department for Education and Skills (DfES) policy on intervention and engagement in adults’ and children’s social care is further refined: and when the relationship between both departments and CSCI is defined.
SSI/CSCI works with failing councils to deliver effective improvement on behalf of government ministers. The nature and scale of engagement depends on the degree of service deficit and assessment of a council’s capacity to change. The SSI/CSCI role is to ensure that the council understands its failings and is committed to addressing them, has a realistic improvement plan and sufficient capacity to deliver it, to sign-post the council to external support where appropriate, and to monitor both the rate and extent of effective, sustained change. Where it is appropriate within an overall improvement plan, SSI/CSCI can negotiate additional support either through internal resources, or by recommending that Ministers appoint a performance action team to work alongside the council.
Councils to Whom the Policy Applies
All zero star rated councils are subject to special measures. The decision that a council should be so rated will normally be taken as part of the chief inspector’s annual performance rating determination. In exceptional circumstances – usually where the seriousness of the situation requires urgent remedial action - a council can be reduced to a zero star rating in-year.
…
Making Judgements about Performance and Capacity
As with all councils, the performance assessment process, including the overall performance standards and criteria, provides the framework for reaching a judgement on a council’s service performance and capacity.
SSI/CSCI aims to be specific about the nature and extent of service failings, because this will become the benchmark against which improvement – or further deterioration – will be measured. A judgement about a council’s capacity for change can be more difficult to describe in specific and measurable terms than service failings, but is essential because it impacts on:
• the scope of engagement;
• the targeting of improvement support;
• expectations of the rate of change over time – the change “trajectory”;
• the quality of advice on progress to ministers; and
• the consideration of statutory intervention, in which it will be a key factor.
Timing and Action when a Council is Placed on Special Measures
A council is subject to special measures from the point at which their zero star rating is formally announced. This will usually be when the chief inspector's annual performance rating determination is publicly released or, for in-year changes, when a council’s reduction to a zero star rating is made public.
Subject to ministerial agreement, at the same time as the above announcement the chief inspector will write to the chief executive of the council notifying them of their zero star rating and confirming that the council is now on special measures. The letter will conclude with an invitation for the leader of the council, lead councillor for social services, chief executive and director of social services to meet with the chief inspector.
The purpose of this meeting is to assess whether the council understands the seriousness of the situation, is prepared to take action and has the capacity to do so. This meeting will inform a later submission from the chief inspector to the relevant minister on what action should be taken to improve services within the council in a timely way.