How to read this report
This report contains three primary chapters. This introductory chapter provides context for the report, along with some tips for reading the summary of results. Chapter 2 presents a set of observations and discussion drawn from the analysis. Chapter 3 presents the summary results for each question on the survey. These results represent an overall baseline for digital preservation in the states and territories with respect to the issues examined in the survey. The appendices include a copy of the survey, a summary of the process used to administer it, and a summary of the units represented in the responses. Additional appendices include tables referred to throughout the report, such as those showing how specific types of units responded to specific questions. Please note that, throughout the remainder of this report, LARM will be used to refer to state and territorial library (L), archives (A), and records management (RM) units.
The State Government Digital Information Preservation Survey
The Center for Technology in Government, in partnership with an advisory board of representatives from state and federal LARM units, designed a survey to create a state government digital information preservation baseline. The scope of the baseline was extensively informed by the results of the Library of Congress States Workshops. The survey addressed questions in the following areas:
- Institutional Roles and Responsibilities
- State Government Digital Information Preservation Activities
- Training Needs for Digital Preservation
- State Government Digital Information Currently At-Risk
- Engagement with Enterprise Architecture
Digital Preservation
Digital preservation was defined broadly in this survey as the management of government digital information for long-term access and use.
Activities related to the transformation of information from an analog or physical format into a digital format (e.g., scanning paper records or converting text on paper into text in computer files) are not included in the scope of this survey. See Appendix B for details on how the survey was developed and administered.
Understanding the data
The following four points should be kept in mind when reading this report:
- Responses represent self-assessments. Responses summarized in this report and detailed in the online State Government Digital Preservation Profiles represent “self-assessments” by the responding states.
- A single response may represent a variety of possible units. State librarians, archivists, and records managers were given the option of submitting one or multiple responses for their state or territory; therefore, the responses themselves vary in the number of units represented in a single response. While CTG received a total of 67 responses representing all fifty states and three territories, the responses include different combinations of state LARM units. For some states, the LARM units are represented in a single response. In other cases, the state library and archives for a particular state submitted separate responses. Finally, in several cases, the survey response from a state represented a single unit, such as the state library or state archives only. While this response method has contributed to some very comprehensive and informative state profiles, it does make the reporting of results somewhat unusual. Readers are encouraged to use this report as a springboard into the state profiles to learn about the challenges and successes of specific states. See Appendices C and D for details on which states and LARM units responded and the nature of their response efforts.
- Level of analysis. It is important to note that the majority of the analysis was done at the respondent level rather than the state level. As described in Appendix C, different types of responses were received from different states. For example, two states might both have their LARM units represented in the results, but the first state may have submitted one integrated response while the second submitted two separate responses. Therefore, when reading the summary of results, keep in mind that the results do not represent a unified state-level picture; in some cases, separate responses from the same state may result in overrepresentation of, or contradictory information from, that state compared to states that submitted a single response from multiple units.
The one exception to this is the analysis of survey Section 2, Institutional Roles and Responsibilities. For that analysis, responses were combined to represent a “state-level” response. The purpose of this section is to identify where each state places authority for setting standards and responsibility for providing services to executive, legislative, and judicial agencies. To create a state-level response, separate responses from a single state (e.g., one from the state archives and one from the state library) were combined. This combination of responses, together with the exclusion of responses not representing at least the state library and archives, resulted in 38 state-level responses (i.e., 37 states and one territory). See Appendix D for the specific state responses included in this summary.
- Response completeness. When interpreting result summaries, please note that not every respondent answered every question on the survey. Nonetheless, for each question summarized below, readers can assume a minimum response rate of 64%. Section 2, as explained above, is an exception to this rule. For a breakdown of responses to the questions, see Appendix E and the Web-based State Government Digital Preservation Profiles.
