
Assessing Mobile Technologies in Child Protective Services

Abstract

Acknowledgments

Executive Summary

Chapter 1: Introduction and Project Overview

Chapter 2: Factors that Shape the Laptop Experience

Chapter 3: Mobility and Use

Chapter 4: Productivity and Efficiency

Chapter 5: User Satisfaction

Chapter 6: Recommendations and Future Considerations

APPENDIX A: The Center for Technology in Government (CTG)

APPENDIX B: Methods

APPENDIX C: Data Collection Tools

Chapter 4: Productivity and Efficiency


Productivity and efficiency

We used CONNECTIONS system data and survey responses to examine three core questions about possible technology impacts: (1) Are workers more productive with respect to progress notes and reporting? (2) Does timeliness of reporting change? and (3) Does technology use affect the kinds of reporting activities undertaken or the type of work done? The goal is to trace possible technology impacts on the productivity, timeliness, and workflow of documentation.

The evidence presented in this section suggests modest improvements in productivity attributable to laptop use, and less noticeable improvements in timeliness of documentation and work product changes. The small improvements in timeliness may simply be a consequence of limited room for improvement, since the overall timeliness of documentation by these testers prior to laptop deployment was good. Also, some factors affecting timeliness may not be amenable to technology impacts (e.g., overtime policy, management practices).

What we measured

The data extracted from the central CONNECTIONS database included information on cases (investigation begin date and end date), progress notes (related event date, the date a note is entered into the system, and the type of note), and safety assessments (safety assessment submission date). Safety assessment submission date is used because that is the time frame within which the CPS worker has control over entering safety assessment information. Safety assessment approvals require supervisor action and other factors outside the caseworker’s control.

The surveys gathered participants’ perceptions of timeliness, productivity, and changes in work activities. Our findings on timeliness and work flow impacts include analysis of these data.

Note that the data and findings are based on data collected from the testers in each field office, not from the entire field office.

Limitations of the Data

The central database records the timing and types of progress notes entered, but not their length or quality. The number of cases per tester and the notes per case varied widely, as did the types of notes entered. The participants were working on a mix of cases: some open for long periods prior to the pilot test, some started and closed during the pilot, and others remaining open at the end of the test period. Therefore, the notes entered during the pilot test period applied to both new and older cases, ranging in age from a single day to several weeks. We used only those cases that had an actual investigation close date. Approximately 20 % of all cases (471 cases) started within 60 days of our pilot data collection date (10/24/07) and were not included in the analysis. Moreover, the data does not include the ultimate disposition of the case or any rating of the quality of outcomes obtained.

In addition, by law there are specific timeframes that must be followed. For example, the “clock starts” for two important processes when a call is made to the central registry. The date the call is made is recorded in CONNECTIONS and a caseworker has seven days from that point to do a safety assessment and 60 days to complete a full investigation. Progress notes are required to be entered contemporaneously, but the definition of contemporaneous is interpreted differently in each field office.

Efficiency and productivity

We examined changes in efficiency and productivity in terms of the pace of case closings and safety assessments submitted. The rate of timely case closings within 60 days increased for both field offices during the test period, which can be seen as evidence of productivity gains.

Graph 6 - Proportion of Cases Closed Pre-Pilot and During Pilot


Manhattan testers increased their rate of case closings within 60 days (40 % pre-pilot; 45 % during pilot), despite a 32 % increase in the number of cases between the pre- and during-pilot test periods (440 and 579 cases, respectively). Similarly, Staten Island testers increased their rate of case closings within 60 days (71 % pre-pilot; 79 % during pilot), while experiencing a smaller case increase of 8 %. Within this population of testers, the Staten Island participants had a larger number of cases overall than the Manhattan participants (1383 and 1019 cases, respectively).

The proportion of safety assessments submitted in the pre-pilot and during-pilot periods was also examined. By law, a safety assessment needs to be submitted within seven days.

Graph 7 - Safety Assessments Submitted Within Seven Days


The submission rate of safety assessments within seven days stayed about the same for each field office pre- and during-pilot test periods; however, Manhattan maintained the same rate of submission with a 32 % increase in the number of safety assessments submitted (which corresponds to the case increase mentioned above – each case should have a safety assessment submitted although some cases close without submitting a safety assessment). This can be seen as evidence of productivity increases in Manhattan. Staten Island did not have a substantial increase in case numbers or safety assessments during this period, but maintained a similar submission rate.

Timeliness

One indicator of timeliness is elapsed time: the number of days between an event and the posting of documentation about that event in the CONNECTIONS system, or the length of time needed to close a case. If we look at the age of cases, for only those cases that opened and closed within the pre-pilot period (April 28, 2007 – July 20, 2007) and the during-pilot period (July 30, 2007 – October 19, 2007), we see evidence of improved timeliness. The trend line during the pilot shows a marked increase in cases closed within approximately the first seven days, and the percentage of cases closed is higher overall by approximately 10 % over the pre-pilot figures. The pre-pilot trend line also shows a steep increase as it nears 60 days, which may represent CPS workers playing “catch-up” in order to meet the required time frames. With the technology, the trend approaching 60 days is much smoother and reflects less of an accelerated “catching-up” process.
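To make the elapsed-time measure concrete, the following minimal Python sketch shows one way it could be computed. The records shown are hypothetical examples, not drawn from CONNECTIONS: each pairs an event date with the date its documentation was entered, and the function returns the cumulative percentage of records documented by each day, the same kind of figure reported in the tables below.

```python
from datetime import date

# Hypothetical records: (event_date, documentation_date) pairs.
records = [
    (date(2007, 8, 1), date(2007, 8, 1)),   # same-day entry
    (date(2007, 8, 1), date(2007, 8, 2)),   # entered 1 day later
    (date(2007, 8, 3), date(2007, 8, 6)),   # entered 3 days later
    (date(2007, 8, 5), date(2007, 8, 12)),  # entered 7 days later
]

# Elapsed days between each event and its documentation.
elapsed = [(doc - event).days for event, doc in records]

def cumulative_percent_by_day(elapsed_days, max_day):
    """Percent of all records documented within `day` days, for day = 0..max_day."""
    total = len(elapsed_days)
    return [
        100.0 * sum(1 for d in elapsed_days if d <= day) / total
        for day in range(max_day + 1)
    ]

print(cumulative_percent_by_day(elapsed, 7))
# -> [25.0, 50.0, 50.0, 75.0, 75.0, 75.0, 75.0, 100.0]
```

This is only an illustration of the metric's definition; the actual extraction from the CONNECTIONS database involved additional case-selection rules described above.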

Graph 8 - Age of Cases When Closed


Table 2 below shows the percentage of safety assessments submitted over a seven-day period. Very few safety assessments are submitted on the same day the case investigation opens. This is as expected, given that completing a safety assessment requires multiple tasks, such as visiting the home and contacting individuals. Three-quarters of the safety assessments (74 % pre- and 74 % during) were submitted by day seven in both the pre- and during-pilot periods.

Table 2- Percentage of Safety Assessments Submitted Within 7 Days

 
                                     Same day   1 day   2 days   3 days   4 days   5 days   6 days   7 days
Percent of all safety
assessments submitted      Pre         0.09       3       8        15       24       35       52       74
                           During      0.14       2       9        17       27       38       53       74
Number of safety
assessments                Pre         1         31      55        76       95      116      182      230
                           During      1         17      45        63       67       86      109      152

With 75 % of safety assessments submitted by day seven, that leaves 25 % for improvement and a chance for some technological impact. However, 95 % of all safety assessments are submitted by day 14 after the start of a case (95.2 % pre-pilot period; 95.6 % during-pilot period). Too many factors may be at play here to expect the technology alone to improve timeliness, including tracking down clients or waiting for information from other parties.

We also looked at the elapsed time between progress note entry and the related event. During both periods, approximately half of all progress notes were entered on the same day as the event, and approximately two-thirds were entered within one day after the event. In addition, three-quarters of all progress notes were entered within three days after the event. We would have expected the proportion of notes entered on the same day, the next day, and the second day to increase over the pre-pilot period. However, the proportions did not increase overall, indicating no productivity gains in the reporting process. In addition, if three days is considered contemporaneous, that leaves only 25 % of all notes where technology impacts could improve timeliness.

Table 3- Percentage of Notes Submitted Within 5 Days

 
                           Same day   1 day   2 days   3 days   4 days   5 days
% of all notes
entered            Pre        50        66      71       76       80       83
                   During     50        65      70       75       79       83
Number of notes
entered            Pre     10348      3469    1047     1085      792      591
                   During   8608      2713     890      884      657      586

There are some slight differences between the field offices, with Manhattan testers more able to get progress notes in by the first day and Staten Island testers catching up by day two. In Manhattan, about half of all progress notes were entered on the same day both before and during the pilot (52 % pre-pilot; 51 % during pilot), and about two-thirds were entered within one day after the event (68 % pre-pilot; 65 % during pilot). In Staten Island, about half of all progress notes were entered on the same day (47 % pre-pilot; 49 % during pilot) and almost two-thirds were entered within one day after the event (69 % pre-pilot; 65 % during pilot).

Graph 9 focuses on elapsed time and plots the percentage of all notes entered by the number of days from the related event to note entry for each field office. Overall, the same pattern is present as above: no substantial technology impact on the timeliness of progress notes is apparent for either field office.

The information we gathered from surveys and workshops may shed some light on these patterns. First, timeliness is affected by individual work styles and caseloads. Some individuals were very timely before the introduction of technology, and some were not. Some supervisors reported seeing substantial improvements in productivity in some testers. The introduction of technology appears to affect individuals differently, and in the aggregate, modest gains by one person are offset by modest losses by another. One caseworker stated, “Even though the laptop makes case notes entry a lot easier, because I am able to access CONNECTIONS from anywhere, as long as a signal is available, there is still the issue of the caseload size as well as some of the individual cases and the type of family and their issues that we have to deal with.”

Graph 9 - Percent of Notes Entered by Day After Event by Field Office


CPS workers in the workshops described working with their supervisors to close a case while the worker was in the field, and being able to enter the information the supervisor needed to close the case. While changing overall work habits may not happen in a 12-week pilot test, these experiences represent positive work changes.

Work activity changes

The volume of progress notes is another indicator we used to detect possible work flow effects. Overall, the number of progress notes entered decreased for both field offices during the pilot test period.

Graph 10 - Volume of Notes Entered


There was a decrease in the notes entered per month during the pilot test period for each field office. Both districts declined in July, went up in August when the laptops were introduced, and then declined again in September and October. This is interesting considering the total number of cases increased for Manhattan by 32 %, and it suggests that the number of progress notes per case varies considerably, presumably by individual working style or other factors.

Changes in work practice were examined in terms of possible variation in the types of notes entered. The introduction of new technologies could result in changes in the kinds of work done, as well as in its speed and quality. For this test, however, there were no discernible changes in the types of notes entered during the pilot period. The descriptions of the work impacts are discussed throughout this report, but we used the data from CONNECTIONS to show the proportion of four kinds of notes both prior to and during the pilot test periods: (1) attempted contacts, (2) contacts, (3) collateral contacts, and (4) summary notes. These are only a few of the many types of notes, but the numbers of other note types were too small for meaningful comparisons.

Graph 11 - Percent of Notes Entered by Type and Field Office


Perceptions of timeliness and work impacts

Participants were given a post-pilot survey at the end of the testing period. We asked participants to what extent using a laptop made a difference in CPS work compared to not having the laptop. Five areas were examined: (1) timeliness of documentation, (2) ability to do work in court, (3) ability to access case information, (4) communication with supervisors, and (5) service to clients. Respondents rated the difference on a five-point scale, with 1 being much worse, 3 about the same, and 5 much better.

Overall, most caseworkers reported that the use of laptops improved their work in terms of timeliness and accessing information, with very few reporting a negative impact; a smaller proportion reported no difference. Tables 4 and 5 below show the percentages.

Table 4 - Perceived Change in Timeliness and Work Impacts

 
Overall (both field offices)          Much worse  Somewhat worse  About the same  Somewhat better  Much better
Timeliness of documentation            1 % (1)     0 % (0)        32 % (30)       48 % (45)        19 % (18)
Ability to access case information     2 % (2)     1 % (1)        19 % (18)       45 % (42)        33 % (31)
Communication with supervisors         0 % (0)     1 % (1)        66 % (61)       20 % (19)        13 % (12)
Service to clients                     2 % (2)     0 % (0)        69 % (65)       17 % (16)        12 % (11)
Ability to do work in court            3 % (3)     3 % (3)        44 % (40)       28 % (25)        21 % (19)
 

About two-thirds of participants in both field offices reported their timeliness of documentation to be somewhat to much better using the laptop. Over three-fourths of participants in both field offices reported their ability to access case information as somewhat to much better using the laptop. Conversely, participants did not perceive that having the laptop made much of a difference in communicating with supervisors or in service to clients (66 % and 69 % reporting “about the same,” respectively). Some participants said during workshops that they did receive new case assignments while in the field by checking their email and CONNECTIONS accounts, and supervisors would put into CONNECTIONS all the information needed to continue the investigation. However, many caseworkers said that they would still get cell phone calls from their supervisors about new cases in the field, and that they used the cell phone more frequently for this.

The ability to do work in court received mixed ratings: an almost equal percentage thought it was about the same (44 %) as perceived that having a laptop was better (49 %). We therefore broke this rating down by field office (see Table 5 below).

Table 5 - Perceived Ability to Do Work in Court by Field Office

 
Ability to do work in court   Much worse  Somewhat worse  About the same  Somewhat better  Much better
Manhattan                      2 % (1)     0 % (0)        40 % (17)       28 % (12)        30 % (13)
Staten Island                  4 % (2)     6 % (3)        49 % (23)       28 % (13)        13 % (6)
 

Staten Island testers reported less of a positive impact on doing work at court than did Manhattan testers. In our workshops, we heard that the courthouses in Staten Island did not receive a good signal, and that caseworkers “just need a place to go,” emphasizing that caseworkers wait in the same room as the families and need a private space to work on the laptop.