Assessing Mobile Technologies in Child Protective Services

Abstract

Executive Summary

Introduction and Methods

Overall Assessment Results

District Pilot Programs

Overall Deployment and Security Considerations

Overall Recommendations

Appendix A: About the Center for Technology in Government

Appendix B: Current Practice Research

Appendix C: List of Survey Categories

Appendix D: Sample Surveys

Introduction and Methods


The Pilot Program

This assessment report was prepared by the Center for Technology in Government (CTG) (see Appendix A) under a contract with the NYS Office of Children and Family Services (OCFS). The purpose of the work was to assess the performance of mobile technologies deployed in a pilot test program to a sample of child protective services (CPS) workers for use in their field work and reporting responsibilities. The pilot was conducted in three Local Departments of Social Services (Local Districts): the New York City Administration for Children’s Services (NYC/ACS), Monroe County Department of Human Services, Child and Family Services Division, and Westchester County Department of Social Services, Family and Children's Services. OCFS engaged CTG to conduct this assessment and provide a report to the Commissioner of OCFS to assist in decision making and planning for possible further deployment of these technologies.

This entire pilot initiative was a collaborative effort among OCFS, the Local Districts, and CTG, an applied research center at the University at Albany. OCFS coordinated the procurement and management of the Local Districts’ initiatives, but each district designed how technology was tested in its own pilot. CTG led the independent assessment of the mobile technologies within the Local Districts.

All three Local Districts tested different technologies and managed their own timelines. CTG’s overall evaluation focused on two core questions, and CTG used three main kinds of data to construct answers to them: surveys of the users of the technologies, interviews and workshops to gather qualitative descriptions of experiences and challenges, and data on entries to the central CONNECTIONS database. The analysis and conclusions set forth in this report are based on those data sources.

Methods

The timing of CTG’s arrival in this initiative led to some challenges in data collection: one Local District initiative had already begun, one was nearing its end, and one did not get started until quite near the end of the assessment period. To accommodate these differences, CTG analyzed data previously collected by the districts and extended deadlines to accept as much information as possible. Overall, the assessment extended over a four-month period, starting in July 2006 and ending in late October 2006.

We collected data directly from the participants through a baseline survey followed by periodic and post-pilot surveys (Appendices C and D), information gathering sessions with CPS caseworkers and district implementation teams, and a full-day Final Assessment Workshop with district and OCFS staff. In addition, we researched current practices in seven other states (Appendix B) and analyzed data from the central database.

Overall, 18 separate surveys were administered, covering 70 participants. In addition, CTG interviewed 61 people: nine OCFS staff from both the program and IT divisions, and 52 district employees. Of the 52 district employees, 29 CPS caseworkers, 10 supervisors, and 13 program/IT Implementation Team members participated in five information gathering sessions and one Final Assessment Workshop. Finally, data on 9,200 progress note entries and caseload records entered by field testers in the Local District initiatives were extracted from the CONNECTIONS database and analyzed.