Is the government drive to increase the numbers of children adopted from care (as opposed to the speed of decision making and the journey time to an adoptive placement once a decision is made) wrongly affecting individual placement choice for children? (Part One)

This post is an update to earlier TP posts on this subject (see here and here).

Some campaigners claim the government (and some individual authorities) have adoption targets or even ‘targets for family destruction’. The Transparency Project has been trying to establish the facts.

With comments in the Lords last week about how wrong it is for government policy to over-focus on one permanency option for children and to measure local authority success by reference to numbers of adoptions, this seems a good moment to publish some of the information we have been pulling together.

The post explains the picture emerging from local authorities’ responses to our Freedom of Information (FOI) requests so far (and related research) about central government measures that seem to be in place to raise absolute numbers of adoptions. It sets out where these measures are to be found and what may be of concern about them. (We are specifically referring to central government measures that seem to promote increases in average absolute numbers of children adopted from care, as distinct from measures about recruitment, assessment, approval, matching, support, ‘early placement’ and regionalisation of adoption agencies aimed at speeding up adoptions and improving efficiency.)

 We have asked similar questions of Welsh local authorities, for completeness, but adoption policy and recording requirements differ between England (the Department for Education) and Wales (Welsh Government). The observations on policy below refer to England only.

 The FOI questions

The information here has informed our efforts to refine and improve our FOI questions to local authorities.

 Future posts

Once we have the further responses to our refined FOI questions from authorities, we intend to look more at the second part of the question – whether and how these central government measures are affecting decisions to have individual children adopted, through Key Performance Indicators (KPIs) and other local or regional performance management measures on absolute numbers of children adopted; and whether and how these may impact on the individual decisions of Agency Decision Makers, children’s social workers and their team managers about placement choice for specific children.

 Part One – Relevant Central Government Measures

  • Published Scorecards for Children in Care Including Adoption have for some time ranked LAs nationally on their performance on the percentage of children adopted from care, as well as on the child’s journey (average time from care entry to placement; average time from placement order to matching; percentage achieving 18 months or less from care entry to placement) and on the percentages of children leaving care by special guardianship orders (SGOs), based on the figures supplied through the mandatory statutory returns process.
  • (The published, so-called ‘Adoption Scorecards and Thresholds’, by comparison, simply measure and compare local authorities against their closest statistical neighbours on journey times and the percentage of looked after children adopted from care, without actually ranking their performance.)
  • With respect to both sets of figures, ‘thresholds’ (or targets) are set for journey times but not for percentages of children leaving care by adoption. (Current threshold targets for 2013-2016 are: A1 threshold (time from care entry to placement for adoption) – 14 months; A2 threshold (time from placement order to matching the child with particular approved adopters) – 4 months; A3 threshold (percentage achieving care entry to placement in under 14 months) – no threshold or target yet.)
  • However, the Methodology and Guidance document 2014 that accompanies the published Adoption Indicators and Rankings (Children in Care Including Adoption: Performance Tables) effectively sets a national target of raising numbers of adoptions from care as high as possible, short of 100%, apparently irrespective of the actual permanence needs of individual children who might have, for example, suitable relatives: “All tables include a rank to show the local authorities in order of performance (local areas with the best performance are ranked as number 1)” (page 3); and, with respect to the “Adoption Indicators on percentage of looked after children who ceased to be looked after who were adopted”, “A higher percentage represents good performance, but percentages should not reach 100%” (page 8, corrected from page 9 – amended by AT on 22/11/16).
  • Whilst numbers of children leaving care by way of SGOs or rehabilitation home are collected through the statutory returns process and then published, there are no published national rankings, nor any national target figure, for these. (See also related comments in the House of Lords on the Education and Adoption Bill, 17 November 2015, Hansard, Column GC43.) Moreover, on 2 November the government announced intended new statutory measures to make it harder for children to be placed with relatives under SGOs, alongside new measures to increase adoption numbers.
  • The overall picture can be hard to grasp quickly because of the way the statistics are published: they are held in many different online locations without any overview. Looking at the adoption scorecards alone, for example, gives a misleading impression compared with looking at the Adoption Indicators buried deep within the published figures for Children in Care – labelled not as key adoption statistics but as Children in Care (including adoption) figures, within the Looked After Children statistical collections.
  • In December 2014 the DfE announced, through an Adoption Reform Update, a plan to procure specialist adoption advisors to work with a small number of local authorities that require support to improve their adoption services, via the Redimo e-procurement system.
  • In June 2015 Anthony Douglas (Cafcass) told the London Adoption Board Conference (The Changing Landscape in Adoption and Permanence), in a presentation called ‘Adoption is Everyone’s Business’, that actions so far to ‘ensure all children with a robust and well evidenced care plan are adopted’ and ‘to remove unnecessary delay from the system’ have included “scorecards, thresholds [and] interventions in 45 authorities, including brokering improvement support”. This is in marked contrast to just one local authority FOI response disclosing an intervention by central government on their adoption performance, together with their reports on how they had addressed it. The vast majority of authorities have not replied at all to this aspect of the FOI question. This is not helpful to anyone’s understanding of how policy is being applied, and allows misunderstanding and misinformation to flourish.
  • It’s also hard to understand quickly the possible effect of the Adoption Reform Grant (ARG) allocated for 2013/2014 and 2014/2015:
    • The Grant Determinations sent to each LA in February 2013 and January 2014 confirm the amounts awarded to each authority for years 2013/2014 and 2014/2015 and are published.
    • The 2013/14 Grant Determination is divided between Part A (£150 million whole system money) and Part B (£50 million ring-fenced solely for ‘Adoption Services’).
    • Part B tells LAs that “the Government will be using existing data collections to understand the impact of this grant alongside its other adoption reforms” and also that, “as a condition of this grant, local authorities are required to write to the Department providing details of what the grant was spent on and the impact that this expenditure had” by way of a “Grant Usage Statement”, certified by the Chief Executive and auditors, to be lodged by 31 May 2014, with an interim Progress Report by 30 October 2013. Templates for these reports are provided by way of Annex C.
    • The Part B Grant Determination also makes clear that the government’s adoption priorities (in February 2013 at least) were increasing the number of adopters and reducing the backlog of children waiting for adoptive placements (rather than, for example, increasing numbers of adoptions).
    • A further (smaller) Adoption Reform Grant was allocated in January 2014.
    • Individual payments varied overall between £2,538,519 (Birmingham, Part A, 2013) and about £30,000 (City of London, Part B), save for £0 for the Isles of Scilly.
    • Published information doesn’t explain how the government allocated the individual amounts: whether they related, say, to under-performance showing a need for ‘support’; to high performance; or just to the size and economic features of each authority, irrespective of adoption performance; or whether, for example, authorities bid for an amount based on how it would be used.
    • It’s difficult to work out because there are a lot of variables potentially in play. Effectively three different payment amounts are recorded for each authority. LA performance rankings are made annually, on several different criteria, in both the Children in Care & Adoption Performance Tables and the Adoption Scorecards, with no overall numerical ranking reflecting overall performance. And authorities differ widely in size and in other variables such as population density or poverty indicators, which is why the government uses statistical neighbours for some aspects of its evaluation of performance.
    • Nor is it clear whether ‘interventions including brokering improvement support’ might have included any element of discussion around the allocation of individual Adoption Reform Grant awards. No authority has suggested so to us, but many have not answered the question at all.
    • Wolverhampton, for example, disclosed to us that they were subject to ‘interventions’ from central government about their adoption scorecard performance for 2012/2013 and then 2013/2014. They also shared with us the reports they prepared for the ‘challenge’ meetings (the DfE thereafter being satisfied by their responses).
    • But Wolverhampton might be thought to have received a surprisingly low Adoption Reform Grant allocation, bearing in mind they are a large urban authority that the government apparently perceived as requiring ‘support’ with their adoption performance. (The stated purpose of the ARG, at least within Part B of the Allocation Determination 2013, is to “provide support”.) In February 2013 Wolverhampton received £499,722 and £528,100, while Birmingham got £2,538,519 and then over £1 million. In January 2014 Wolverhampton got £248,581 while Birmingham got £1,266,536, and six or so other authorities received well over half a million pounds.
    • There may be perfectly good reasons but again we think it’s important that the government (and local authorities) are transparent about how these amounts were fixed. Anything else inevitably creates unhelpful mistrust and makes it difficult for commentators to publish clear, balanced information to set the record straight, where appropriate.
  • The quarterly Adoption Leadership Board survey and quarterly data reports on adoption don’t seem to contain measurements of, or targets for, absolute numbers or percentages of adoptions from care.
  • Ofsted fostering, adoption and children looked after data sets were collected under the new (April 2012) inspection regime for local authority and voluntary adoption agencies, and were due to be integrated into a new local authority Children Looked After inspection from April 2013. One LA told us that ‘from quarter 4 2014-15 the Ofsted data collection became part of the ALB quarterly survey’.
  • Regional Adoption Boards, like the London Adoption Board (LAB): see the Regionalising Adoption consultation, DfE, June 2015; and the House of Lords discussion of the new power proposed in the Education and Adoption Bill 2015 to allow the Secretary of State to force poorly performing authorities to delegate certain adoption functions (Hansard, 17 November 2015). The LAB’s stated aims include to “ensure that children for whom adoption is the best way of achieving permanence are adopted without unnecessary delay”, but also the less transparent aim to “develop a strategic approach to adoption across London to benefit the maximum number of children in achieving the best outcomes through adoption”. We’ve not seen any LAB or other regional statistics or targets for absolute numbers or percentages of adoptions from care, though it is possible they exist and are not published, or that we haven’t found them.

 Conclusion

Para 102 of the Adoption Action Plan 2012 says the Government doesn’t want scorecards and thresholds “to distort local authority decisions about whether adoption is the best option for children” (though it references this only in relation to possible disincentives to trying to place harder-to-place children for adoption).

It’s hard to see how the government policy of increasing absolute numbers of children adopted, as implemented through the measures above, wouldn’t have a significant effect on decision making for individual children, albeit indirectly. If not, it seems reasonable to ask what their purpose is and whether they are any longer useful or necessary.

At the very least, the apparent policy and measures are creating unhelpful mistrust, alienating families and some professionals within the family justice system alike. The question also arises as to how such a policy sits with the Article 8 obligations of the State not to sanction adoption save as a last resort in an individual case (though it is difficult to see exactly how any legal challenge might be constructed so as actually to find its way into the court arena).