Braam v. State of Washington: Recently Released Data Disappointing
April marked the first release of long-awaited benchmark data in Braam v. State of Washington. The benchmarks were designed to measure whether the reforms in the Braam settlement agreement actually improve the lives of children in foster care. The data revealed that the state has failed to improve adequately in any of the measured areas, including placement stability – a focus of the Braam case. Indeed, in one area of placement stability, the lives of foster children have gotten worse since the case settled.
State of Washington Fails to Meet Benchmarks Set for Placement Stability
The benchmarks measure many aspects of the foster care system by comparing baseline numbers – primarily from data gathered in fiscal year 2005 – to data gathered in later years. The use of benchmark data in the Braam case is a unique feature of the settlement agreement. It measures whether the actions of the Department of Social and Health Services (the Department) actually translate into concrete improvements in the lives of foster children in Washington State.
Advocates eagerly awaited the benchmark data on placement stability – the core issue in the case. Jessica Braam, for whom the case is named, experienced more than 34 placements after being removed from her parents and placed in foster care at age 2.
Of all the benchmarks related to placement stability, the Department provided data on only one – the percentage of children in foster care who have experienced three or more placements during their time in foster care. Even on that measure, the Department failed to meet settlement agreement goals, and for one group of children, the percentage experiencing multiple placements actually increased over the baseline.
The placement stability benchmark measured different groups of children based on how long they had been in care.
- Children in Foster Care Less than One Year. In FY 2005, 11 percent of this group had experienced three or more placements. The benchmarks required the Department to reduce this to 9.9 percent for FY 2006. Instead, the FY 2006 figure increased to 12.3 percent – meaning that an even higher percentage of children in this group experienced three or more placements than in 2005.
- Children in Foster Care More than Four Years. Well over half of this group experienced three or more placements. Even though the percentage of children in this group experiencing three or more placements decreased (from 62.3 percent in FY 2005 to 60.9 percent in FY 2006), the Department still failed to meet the required benchmark of 56.1 percent.
The Department Failed to Meet All Other Measured Benchmarks
The Department failed to meet settlement goals on all 10 of the other benchmarks for which it provided data.
Mental Health Benchmarks
In the area of mental health, the Department failed to meet five benchmark requirements related to the screening, assessment, and provision of mental health services to children at the time they enter foster care and during their time in care. The specific benchmarks required the Department to increase the percentage of children who:
- receive health and education screens and Early Periodic Screening, Diagnosis, and Treatment Program (EPSDT) exams within 30 days of entering care,
- receive a mental health assessment within 45 days of entering care if the child needs such an assessment,
- are screened for mental health needs every 12 months, and
- receive recommended mental health services within 30 days of an assessment recommending services.
For FY 2006, the Panel required the Department to provide the above services to 90 percent of all children in foster care, but its performance fell far short of that goal. In all but one area, the Department missed the benchmark by more than 50 percentage points.
Adolescent Runaway Benchmarks
In the only other area in which the Department provided data — that of adolescent runaways — the Department failed to meet five benchmarks. These benchmarks required the Department to:
- decrease the percentage of children who ran away from foster care homes or facilities one time,
- decrease the percentage of children who ran away from foster care homes or facilities multiple times, and
- decrease the average and median number of days that runaways were “on the run.”
In these areas, the numbers changed very little, if at all, from FY 2005 to FY 2006.
The Department Failed to Provide Data for Most Benchmarks
The Department did not provide data for 17 of the 38 benchmarks. The data for the benchmarks comes from three sources: (1) the Department’s statewide database, which contains administrative data; (2) the foster parent survey, to be released in the fall of 2007; and (3) case reviews, which involve a more intensive review of a sample of actual case files. According to the Department, it does not collect the required data for at least seven of the benchmarks in its statewide database and is still working out how to gather that information through case reviews. An example is the benchmark requiring the Department to update health and education plans for foster children every six months.
Of the remaining 10 benchmarks, the Department did not provide data because the Panel itself is in the process of revising, modifying, or clarifying those benchmarks. The need to revisit these benchmarks became apparent as the Panel, parties, and Department rolled up their sleeves and got into the nitty-gritty details of data measurement. For example, one of the measures requires the Department to increase the average monthly number of beds in active foster homes each year. Recently, the Panel questioned whether this statistic will actually show whether there are enough appropriate homes for all children in foster care. Discussions of other, more accurate measures – such as using a ratio requiring two available beds for every one child in foster care, or measuring the number of beds by subgroup (e.g., infants, teenagers, children with special needs) – are underway.
For the 10 benchmarks under Panel review, the parties have provided input and the public has had an opportunity to comment at the public Panel meetings. The Panel is expected to make decisions on these benchmarks by the release of the next monitoring report, or sooner.
The Department’s Failure to Meet Benchmark Goals: What Happens Next?
The settlement agreement outlines two courses of action if the Department fails to meet benchmark standards: (1) development of new compliance plans, and (2) enforcement of the settlement agreement in court.
The Department must now develop compliance plans for all of the benchmarks that it failed to meet. For the 17 benchmarks for which the Department failed to provide adequate data, the Department must provide detailed information about how it will gather that data. The last two public Panel meetings have explored this topic, and it has been the subject of ongoing discussions between the Panel and the parties. For the 11 benchmarks for which the Department provided data but failed to meet the required goals, the Department must devise plans and strategies to improve results for children in foster care. Once the Department submits its proposed compliance plans, the plaintiffs will have an opportunity to provide comments. The Panel will have final approval over the plans.
Additionally, if the Department fails to comply with the requirements of the settlement agreement, the plaintiffs may seek enforcement in court. If plaintiffs choose to go to court, they may seek enforcement of benchmarks not met by the Department as well as action steps that the Department has failed to take.
The fourth six-month monitoring report is due in September 2007. If the Department complies with expectations and requirements, this fourth report should contain information on the Department’s performance in all areas in which it failed to provide data for the third monitoring report. This will provide a more comprehensive view of the quality of life for children in Washington State’s foster care system, and of the specific areas where the greatest improvement is needed.
Foster Youth Survey
One of the more exciting developments in the reform process is the recent agreement by the monitoring panel for Braam v. State of Washington to conduct a foster youth survey. The survey will be a valuable way to gather information about the day-to-day experiences of foster youth by asking the youth themselves.
The survey is designed to measure Washington’s compliance with two of the benchmarks: (1) the percentage of youth receiving appropriate independent living preparation services, and (2) the percentage of youth receiving a six-month pre-exit staffing.
The survey may also include questions covered by a previous survey of foster parents, such as frequency of visits, availability of mental health care, child safety, and quality of foster care services. The survey provides an excellent opportunity to compare data received from foster youth with that from foster parents. Not only will the foster youth information supplement the earlier data, but it might also illuminate important differences in the measurement methods.
The Panel will form a working group — composed of Plaintiffs’ counsel representatives, Children’s Administration representatives, foster youth, and other stakeholders — to work out the logistics of the survey, including the number of youth to be surveyed, their age range, and the survey time frame.
The Panel has suggested that the survey be conducted from January through June 2008, collecting information about 2007.
Braam v. Washington Benchmarks
Listed below are several of the key benchmarks for implementation of the Braam settlement. In some cases, no data has been provided by the state Department of Social and Health Services, while other benchmarks are under review by the Braam implementation panel. The status of each benchmark listed is noted accordingly.
Notes
- See Case Overview for more background on the case.
- The Jan-Mar 2007 Youth Law News provides an update on other developments in the Braam reform process.
- Heath Foster, Landmark Battle Over Foster Care is Settled, Seattle Post-Intelligencer, Aug. 12, 2004.
- Braam Settlement Monitoring Report #3, April 17, 2007, pp. 45-46.
- Id., pp. 47-48.
- Id., pp. 56-57.
- The monitoring report counts these 17 benchmarks among those for which the Department has failed to reach its goal. Braam Settlement Monitoring Report #3, April 17, 2007, p. 43. Available at www.braampanel.org.
- See Braam v. State of Washington: New Developments Under the Reform Process, Youth Law News, Jan-Mar 2007, for more details on the foster parent survey.
- According to the first two monitoring reports, the Department has not complied with all of the required action steps. Available at www.braampanel.org.