Surveillance Bias in Child Maltreatment: A Tempest in a Teapot

Background: Children are believed to be more likely to be reported for maltreatment while they are working with mental health or social service professionals. This “surveillance bias” has been claimed to inflate reporting by fifty percent or more, and has been used to explain why interventions such as home visiting fail to reduce official maltreatment reporting rates. Methods: We use national child abuse reporting data (n = 825,763), supplemented by more detailed regional data from a multi-agency administrative data study (n = 7,185). We determine the percentage of all re-reports made uniquely by mental health and social service providers within and across generations, these being the report sources that could be subject to surveillance bias. Results: At three years after the initial Child Protective Services (CPS) report, the total percentage of national re-reports uniquely made by mental health or social service providers is less than 10%, making it impossible for surveillance bias to massively inflate CPS reporting in this sample. Analysis of national data finds evidence of a very small (+4.54%) initial surveillance bias “bump” among served cases, which decays to +1.84% within three years. Our analysis of regional data showed similar or weaker effects. Conclusions: Surveillance bias effects appear to exist, but are very small.


National Data Description
The national data were drawn from the 2004-2015 NCANDS child files, available at http://www.acf.hhs.gov/cb/research-data-technology/reporting-systems/ncands. These yearly child files were combined into a single longitudinal database, which is possible because a consistent child identifier is present in all the files. Detailed reporter types in NCANDS data were combined into "MHSS" ("Mental Health" and "Social Services" only), "Other Professional" (all other professional reporters), "Non-Professional" (all other specified reporters), and "Other/Unknown" ("Other," "Unknown," or "Missing"). It should be noted that because we excluded the (relatively few) children who entered foster care, post-investigation services refer to any form of case management, family support, or family preservation services provided directly by child welfare. It is, unfortunately, not possible to separate out specific in-home service types, nor are dates or length of service available. The very few child fatalities in the dataset (fewer than two thousand per year) were also excluded because NCANDS does not provide identifiers in such cases.
We followed a single-year age cohort of children estimated to have been born in 2005. Dates of birth are not available in NCANDS, but child age in years is recorded at the time the report was made. In constructing our single-year age cohort, we therefore selected children with a first report at age "0" in 2005, a first report at age "1" in 2006, and so on through a first report at age "6" in 2011. This covers children during the first seven years of life (0-6.99 years old). Children aged "0" in 2005 could also have had a report in 2004; the 2004 data were therefore checked and such children were excluded, in order to make sure our index reports were true first reports. In simple terms, the 2005-2011 files were consulted (and checked against the 2004 file) to create a list of all children estimated to have been born in 2005 who had a first child abuse and neglect report. The child's first report could occur at any time from age 0 up until (but not including) their seventh birthday.
We then checked to see which children in this sample had re-reports within three years, using reports from Federal years through 2014. We followed all children for exactly three years to avoid variable re-report timeframes based on the age at which the child received their first report. If we had simply tracked all subsequent reports, then young children would have been followed for much longer than older children.
One persistent problem in NCANDS data is that data from a given federal year (say, 2014) are not completely present in that year's (2014) submission, because open or unprocessed cases are often delayed and are commonly sent the following year. For this reason, we used the 2015 child file to supplement our data to the extent that it included "delayed" cases which were actually reported in federal year 2014.
In summary, we included all children estimated to have been born in 2005 who had a first report of maltreatment prior to their seventh birthday. We then followed each child for three years after that initial report and recorded any re-reports.
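The cohort-selection and follow-up rules described above can be sketched in code. This is an illustrative sketch only; the function names, field names, and the approximate month length are our own assumptions and do not correspond to actual NCANDS variables.

```python
from datetime import date

def in_2005_birth_cohort(first_report_year, age_at_first_report, had_2004_report):
    """A child enters the cohort if their first CPS report occurred at
    age a (in whole years) during federal year 2005 + a, for a = 0..6,
    and they had no report in the 2004 file (so the index report is a
    true first report)."""
    if had_2004_report:
        return False
    return (0 <= age_at_first_report <= 6
            and first_report_year == 2005 + age_at_first_report)

def in_followup_window(index_report_date, re_report_date, months=36):
    """Count a re-report only if it falls within a fixed 36-month window
    after the index report, so all children are followed equally long."""
    days = (re_report_date - index_report_date).days
    return 0 < days <= months * 30.44  # approximate mean month length

# Example: a first report at age 3 in 2008 is consistent with a 2005 birth.
in_2005_birth_cohort(2008, 3, False)   # True
in_2005_birth_cohort(2005, 0, True)    # False: 2004 report disqualifies
```

The fixed window is the key design point: without it, children first reported at age 0 would be observed for far longer than children first reported at age 6.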
NCANDS state inclusion/exclusion: The NCANDS Child Files were not routinely provided by all states prior to the past fifteen years, and many states have "gaps" even during that timeframe. Since 2004, only 44 states (counting the District of Columbia as a "state") have provided continuous non-missing data.
The problem was compounded when service data were considered. Unfortunately, NCANDS service receipt data have been even more unevenly collected across states. Only 35 of our 44 states (not including the District of Columbia) have service ("POSTSERV" variable) data sufficiently stable over time (no missing years, no suspiciously large adjacent-year variation) to allow for inclusion in the service-specific analyses.
We therefore use two distinct samples: a "44-state sample" (including DC as a "state") for analyses simply looking at professional reporters without controlling for services, and a reduced "35-state sample" for analyses comparing served and unserved cases. Starting with all fifty states, Puerto Rico, and DC, we excluded the following jurisdictions from the 44-state sample: AK, AL, MD, MI, ND, OR, PR, and WI. The 35-state sample was further reduced by omitting DC, GA, ID, IN, MD, NC, NY, PA, and SD.

Regional Data Description
These data are drawn from a larger longitudinal study using data from the St. Louis metropolitan area beginning in the early 1990s and funded by the National Institute of Mental Health (R0MH 06 1733-04 A1). This study used child protective services reports and income maintenance (AFDC and TANF) records to select all children aged 0-11 (inclusive) with a first child maltreatment report during 1993 or 1994, as well as a matched comparison group receiving public assistance but with no maltreatment reports during that period. Children were followed through midyear 2006. The data held by the State of Missouri were already linked, because the CPS and income maintenance files share a common child identifier. Children who were reported for fatality or who died within the first week were excluded from the study. To allow for independence of observations, one child per family was randomly selected (N = 12,409) when multiple children were present. Exclusion criteria (see main article) further reduced the sample to 7,185. For intergenerational analyses, only 781 families met inclusion criteria (see main article). For more information on this dataset, see Jonson-Reid, Melissa, Brett Drake, and Patricia L. Kohl. "Is the overrepresentation of the poor in child welfare caseloads due to bias or need?" Children and Youth Services Review 31.3 (2009): 422-427.

Monthly Data: Supplementary Tables
Four tables have been appended showing cumulative reports at the national level: the 44-state sample (Table S1), the full 35-state sample (Table S2), and the served and unserved cases within the 35-state sample (Tables S3 and S4). Each includes cumulative monthly counts for any report, reports from MHSS sources, reports from non-MHSS sources, and reports from Professional and Non-Professional sources.

Post-Hoc Analysis: Estimating the Magnitude of Surveillance Bias (SB)
For the 36-month samples, the following procedure was used, with the 35-state sample as the data source. This analysis is built on the following premises (using 36-month figures for the next four numeric points):
1) Served UMHSS cases (cases whose re-reports came uniquely from MHSS sources) had 1.545 times as many re-reports as unserved cases.
2) Remaining (non-UMHSS) cases had "only" 1.237 times as many re-reports as unserved cases.
3) This difference (1.545 vs. 1.237) could plausibly be due to SB among UMHSS cases.
4) We therefore adjust down the total number of re-reports among UMHSS cases by using the lower rate (1.237) to model what might have happened in the absence of SB. The difference between the totals is the number of "excess" reports plausibly due to SB.
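For clarity, the adjustment in premises 3 and 4 can be written compactly as follows (the notation here is ours, not from the original tables):

```latex
\[
R^{\mathrm{adj}}_{\mathrm{UMHSS}} \;=\; R_{\mathrm{UMHSS}} \times \frac{r_{\mathrm{other}}}{r_{\mathrm{UMHSS}}},
\qquad
E_{\mathrm{SB}} \;=\; R_{\mathrm{UMHSS}} - R^{\mathrm{adj}}_{\mathrm{UMHSS}}
\]
```

where \(r_{\mathrm{UMHSS}}\) and \(r_{\mathrm{other}}\) are the served-to-unserved re-report ratios (1.545 and 1.237 at 36 months), \(R_{\mathrm{UMHSS}}\) is the observed count of served re-reports unique to MHSS sources, and \(E_{\mathrm{SB}}\) is the number of "excess" reports plausibly attributable to SB.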
We repeat this calculation process below for the 3-month data.

Procedure (36-month re-reporting)
The rate of unique reports from MHSS sources was calculated (UMHSS). The number of all other reports was calculated (Other). This was done for both the served and unserved samples. We can therefore make the following calculations:
1. While served cases always had higher re-report rates, this increase was more pronounced among UMHSS cases (+54.5033917%) than among other cases (+23.7035942%).
2. If served UMHSS cases had increased at only the lesser rate (the presumably "non-SB influenced" rate of 1.237035942) instead of the observed rate (1.545033917), then their rate of re-reporting would have been reduced to 1.237035942/1.545033917, or 80.0652936% of the actual observed re-reports among UMHSS cases.
3. Given that there were 5,515 unique MHSS served re-reports (see Table S3), this adjustment would reduce those re-reports to 5515 × 0.800652936, or 4,416 adjusted "non-SB influenced" total re-reports, a reduction of 1,099 reports.
4. There were 211,582 total re-reports made in the 35-state sample (see Table S2). The estimated revised "non-SB influenced" re-report total is 211,582 − 1,099 = 210,483.
5. For purposes of understanding SB effects upon re-reports across the population of all index reports received by CPS (both served and unserved), the "SB-influenced" rate divided by the "non-SB influenced" rate is 211,582/210,483, or 100.52%, suggesting that SB influence may plausibly account for an increase of 0.52% among all index reports (served and unserved cases).
6. For purposes of understanding re-reports among only those cases previously served by CPS (unserved cases are omitted), the "SB-influenced" rate is 60,997/59,898, or 101.84%, suggesting that SB influence may plausibly account for an increase of 1.84% among that subset of index reports served by CPS.
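The six-step procedure above reduces to a short calculation. The function below is our own illustrative sketch (the function and variable names are ours); the input figures are the 36-month values from Tables S2 and S3.

```python
def sb_excess(r_umhss, r_other, umhss_served, total_rereports, served_rereports):
    """Estimate 'excess' re-reports plausibly due to surveillance bias (SB).

    r_umhss, r_other  -- served/unserved re-report ratios for UMHSS cases
                         and for all other cases
    umhss_served      -- observed unique-MHSS re-reports among served cases
    total_rereports   -- all re-reports (served and unserved combined)
    served_rereports  -- re-reports among served cases only
    """
    factor = r_other / r_umhss             # shrink to the non-SB rate (~0.8007)
    adjusted = round(umhss_served * factor)
    excess = umhss_served - adjusted       # reports attributed to SB
    overall = total_rereports / (total_rereports - excess)
    served_only = served_rereports / (served_rereports - excess)
    return excess, overall, served_only

# 36-month figures (35-state sample)
excess, overall, served_only = sb_excess(
    1.545033917, 1.237035942, 5515, 211582, 60997)
# excess == 1099; overall ≈ +0.52%; served_only ≈ +1.8%
```

Note that the served-only effect necessarily exceeds the overall effect, because the same number of excess reports is spread over a smaller denominator.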

Procedure (3-month re-reporting)
Again, the rate of unique reports from MHSS sources was calculated (UMHSS), and the number of all other reports was calculated (Other). This was done for both the served and unserved samples. We can therefore make the following calculations:
1. While served cases always had higher re-report rates, this increase was more pronounced among UMHSS cases (+76.3644408%) than among other cases (+29.1663174%).
2. If served UMHSS cases had increased at only the lesser rate (the presumably "non-SB influenced" rate of 1.291663174) instead of the observed rate (1.763644408), then their rate of re-reporting would have been reduced to 1.291663174/1.763644408, or 73.2382995% of the actual observed re-reports among UMHSS cases.
3. Given that there were 2,599 unique MHSS served re-reports (see Table S3), this adjustment would reduce those re-reports to 2599 × 0.732382995, or 1,903 adjusted "non-SB influenced" total re-reports, a reduction of 696 reports.
Table note: "#" represents the cumulative number of children who have had a report of the indicated type up through the month in question. The "%" figure shows the cumulative percentage of children in the total sample (N = 613,210) who have had such a report. Note: Multiple reports may occur each month (e.g., professional and non-professional).
4. There were 52,934 total re-reports made (see Table S2). The estimated revised "non-SB influenced" re-report total is 52,934 − 696 = 52,238.
5. For purposes of understanding SB effects upon re-reports across the population of all index reports received by CPS (both served and unserved), the "SB-influenced" rate divided by the "non-SB influenced" rate is 52,934/52,238, or 101.33%, suggesting that SB influence may plausibly account for an increase of 1.33% among all index reports (served and unserved cases).

6. For purposes of understanding re-reports among only those cases previously served by CPS (unserved cases are omitted), the "SB-influenced" rate is 16,025/15,329, or 104.54%, suggesting that SB influence may plausibly account for an increase of 4.54% among that subset of index reports served by CPS.
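The 3-month figures can be checked the same way. This standalone sketch (variable names are ours) uses only the inputs quoted above:

```python
# 3-month check: adjust UMHSS served re-reports down to the non-UMHSS
# growth rate, then compare totals with and without the "excess" reports.
r_umhss, r_other = 1.763644408, 1.291663174
umhss_served = 2599                                 # unique MHSS served re-reports
adjusted = round(umhss_served * r_other / r_umhss)  # 1903
excess = umhss_served - adjusted                    # 696
overall = 52934 / (52934 - excess)        # ≈ 1.0133 → +1.33% (all index reports)
served_only = 16025 / (16025 - excess)    # ≈ 1.0454 → +4.54% (served cases only)
```

The larger served-only bump at 3 months (+4.54%) relative to 36 months (+1.84%) is the decaying "bump" summarized in the abstract.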