Appendix C: Data Collection Methodology, Tools, Counts and Response Rates
There were four streams of data collection throughout the project: 1) two online surveys (baseline and post-pilot); 2) data from CONNECTIONS (the central child welfare information system); 3) district teleconferences; and 4) district questionnaires.
Online Surveys
Two separate surveys, a baseline and a post-pilot survey, were administered. The surveys collected data about respondents’ perceptions of and attitudes toward using a laptop or tablet PC across several areas of CPS work – work practice, work time, demographic information, mobility/location, skill and stress levels, technology acceptance, training, and use of technology. The surveys were developed over a period of several months and pilot-tested; they were then modified based on the pilot test results and the project team’s knowledge and understanding of CPS work. The online surveys were developed and administered through commercial software (SurveyMonkey).
Districts were asked to provide the names, email addresses, and titles of participating CPS caseworkers and supervisors. Data reported in the survey represent responses from the caseworkers only. Personalized survey invitations were emailed to participants. The baseline survey was administered prior to the deployment of laptops or tablet PCs to participating caseworkers and was open from 9/21/07 through 10/5/07.
The post-pilot survey was administered three months following the deployment of laptops. The survey was open for one week, from 1/3/08 through 1/10/08. Data were collected in three new thematic categories: the impact of laptops on caseworkers’ daily activities, mobility-related issues, and technical difficulties experienced during the pilot period. Data quality checks were performed and the data were recoded as needed.
Overall, 448 CPS caseworkers participated in this study. Supervisors also participated, but their survey responses were not included in the results: the number of supervisors participating in the pilot was not representative across districts, and the total number of supervisors responding was too low to report.
The response rate for the baseline survey was 74% (n = 331), while the response rate for the post-pilot survey was 61% (n = 275). The total number of caseworkers who took both surveys was 234, resulting in a response rate of 52%. The table below shows the number of caseworkers and the response rates for each of the participating districts.
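The reported rates follow directly from the participant counts above. As a quick check, the arithmetic can be reproduced as follows (the function name and rounding convention are illustrative, not part of the study's analysis code):

```python
# Recomputes the response rates reported above from the counts in the text.
def response_rate(responses: int, total: int) -> int:
    """Return the response rate as a percentage, rounded to the nearest point."""
    return round(100 * responses / total)

TOTAL_CASEWORKERS = 448  # total CPS caseworkers in the study

baseline = response_rate(331, TOTAL_CASEWORKERS)    # baseline survey
post_pilot = response_rate(275, TOTAL_CASEWORKERS)  # post-pilot survey
both = response_rate(234, TOTAL_CASEWORKERS)        # caseworkers who took both

print(baseline, post_pilot, both)  # 74 61 52
```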

Table 2 – Response Rates by Districts
Teleconferences
During the week of December 10-14, 2007, CTG held separate teleconferences with project participants in ten Local Social Service Districts participating in the Demonstration Project to learn more about how they were using the laptops and tablets deployed for CPS work. Participating County DSS were chosen by CTG and the NYS OCFS liaisons. Criteria for choosing the districts included (1) how long the technologies had been in use, and (2) geographical representation across the state, in terms of rural and urban settings and overall size.
Each district participated in one teleconference with CTG interviewers. All participants were given sample questions before the teleconferences that dealt with deployment, connectivity, use and location, changes in work, issues/concerns, policy implications, and overall benefits of laptop use. The following table shows the districts interviewed and the number of participants in each call.

Table 3 – Teleconference Time and Participant Information
CONNECTIONS Data
The overall objective for using CONNECTIONS data was to measure the effect of the use of mobile technologies on CPS work practices by using data from the central database. The CONNECTIONS dataset contained information on case records and caseworkers’ progress notes logged in each County DSS. The information contained within each of these records included:
- identifiers (Stage ID, Person ID);
- time-related information about the investigation stage (Intake Start Date, Investigation Stage Start Date, Investigation Stage End Date);
- progress notes information (Progress Notes ID, Progress Notes Event Date, Progress Notes Time, Progress Notes Entry Date, Progress Notes Types, Progress Notes Purposes);
- safety assessments (Safety Submit Date, Safety Approval Date).
The CONNECTIONS data were pulled by the date a progress note was entered by participants during two timeframes—the pre-pilot and during-pilot periods. These timeframes were equal in duration. A total of 132,045 progress note entries and 14,308 unique investigation stages made up the dataset from 448 CPS caseworkers. The table below shows the start and end dates for both timeframes, the duration of each timeframe, the total number of progress note entries, and the total number of unique cases per participating district.
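The partitioning step described above can be sketched as follows. The field names mirror the CONNECTIONS variables listed earlier (e.g., Progress Notes Entry Date), but the record layout and the window dates are placeholders, not the actual extract format or the study's real timeframe boundaries:

```python
from datetime import date

# Placeholder windows; in the actual study the two timeframes were equal in duration.
PRE_PILOT = (date(2007, 6, 2), date(2007, 9, 20))
DURING_PILOT = (date(2007, 9, 21), date(2008, 1, 10))

def in_window(d: date, window: tuple) -> bool:
    start, end = window
    return start <= d <= end

def split_by_timeframe(progress_notes: list) -> tuple:
    """Partition note records by the date each note was entered."""
    pre, during = [], []
    for note in progress_notes:
        d = note["entry_date"]
        if in_window(d, PRE_PILOT):
            pre.append(note)
        elif in_window(d, DURING_PILOT):
            during.append(note)
    return pre, during

# Hypothetical sample records for illustration.
notes = [
    {"progress_note_id": 1, "stage_id": "A", "entry_date": date(2007, 7, 15)},
    {"progress_note_id": 2, "stage_id": "A", "entry_date": date(2007, 11, 2)},
]
pre, during = split_by_timeframe(notes)
print(len(pre), len(during))  # 1 1
```

Counting entries and unique Stage IDs within each partition would then yield the per-district totals summarized in the table below.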

Table 4 – Overall CONNECTIONS Data Information per District
District Questionnaire
Each participating district was asked to complete a questionnaire, and all districts completed and submitted it. The focus of the questionnaire was to learn about each district’s goals, connectivity solutions, participant selection, technology deployment, changes in policies or work practices, and general information. The following are sample questions from the questionnaire:
- What were your district’s objectives for participating in this pilot? What did you hope to achieve by deploying mobile technology?
- What connectivity solutions did you choose and with what provider?
- Were all devices deployed? If not, how many were not deployed and why?
- Did all participants receive their own device, or were devices shared among several participants? If shared, please describe how the devices were shared among the participants.
- How were CPS workers selected to participate in the pilot?
- Please describe the deployment training process and how each participant received the devices.
- Please describe the security procedures that were addressed during the training.
- What policies, if any, were modified during the pilot period, such as overtime and field visit scheduling? Describe the new policies and how they differ from the previous policies.
- What work practices, if any, were created, changed or abolished during the pilot period?
- What is the geographical area, population, and urban/rural makeup of your district?
- What is the total number of CPS workers in your district (not just those participating in the mobile technology project)?
© 2003 Center for Technology in Government
