Preserving State Government Digital Information: A Baseline Report
Chapter 3. Building a Baseline

The summary results represent the first-ever compilation of data about the status of institutional relationships, activities, concerns, and priorities related to the preservation of government information in digital form at the state and territorial level in the United States. These baseline data may inform the development of performance expectations in states, and possibly even the development of performance standards. At this point, however, they simply provide information about the current state of affairs, a baseline against which future state government digital information preservation efforts may be compared. They can be used to inform strategy development, business cases, and priority setting for states and other interested organizations alike. The results from the open-ended and closed-ended questions asked in the survey are presented in the order of their inclusion in the survey (see Table 13).

Table 13.
State government digital information survey topics

Section | Title | Description | Question format
Section 1. | Responding Unit(s) | State and organization, including which units were represented in the survey | Open-ended and multiple choice
Section 2. | Institutional Roles and Responsibilities | Identification of which units, if any, have authority over digital preservation related standards and provide services to executive, legislative, and judicial agencies | Closed-ended with some open-ended for explanation
Section 3. | State Government Digital Information Preservation Activities | Description of recent or ongoing efforts to preserve state government digital information in the respondents’ state | Open-ended
Section 4. | Training Needs for Digital Preservation Related Activities | Identification of existing training available or basic or advanced training needed for specific capabilities | Closed-ended with some open-ended for explanation
Section 5. | State Government Digital Information Currently At-Risk | Examples of state government information that is at-risk or is no longer accessible in the respondents’ state | Open-ended
Section 6. | Enterprise Architecture | Awareness of and involvement in the respondents’ state Enterprise Architecture efforts | Open-ended and multiple choice
Section 7. | Additional Thoughts or Comments | Opportunity for respondents to comment on the survey itself or provide any additional information related to their digital preservation efforts | Open-ended

The discussion in each section includes tables and graphs that show the number of responses given for the closed-ended questions and related tables that characterize many of the key points or themes from the open-ended questions. For a detailed discussion of the survey process, including the survey itself and respondent demographics, see Appendices B, C, and F.

Section 2. Institutional roles and responsibilities

Section 2 of the survey asked participants to indicate which units (i.e., library, archives, records management, or other), if any, have authority for setting standards for digital information created or maintained by executive, legislative, and judicial agencies. Respondents were also asked to indicate which of these units, if any, provide services related to digital information to these agencies. The findings in this section are based on state-level responses. Thirty-seven states and one territory are included in the analysis, for a total of 38 responses. Each response includes representation from both the library and archives units; in many, but not all, of these responses, records management units and several other units were represented as well. Table 14 lists the specific authorities and services included in the survey.

Table 14.
Authority for setting standards and services provided for digital information for executive, legislative, and judicial agencies

Authority for setting standards:
- Setting data management standards and/or guidelines for information creation (e.g., metadata, file formats)
- Setting information technology standards and/or guidelines for information creation (e.g., state-approved software applications)
- Setting standards for information retention and disposal (e.g., retention periods and methods of disposal) for various series/types of digital records and publications

Services provided:
- Storage for digital information
- Consultation and training services on digital information creation
- Consultation and training services on digital information management
- Consultation and training services on digital information preservation
- Consultation and training services on digital information access
- Preservation (e.g., migration, reformatting)
- Access (e.g., search engine)
- Certification (e.g., trustworthiness of system, backups sufficient)

For each of these standards and services, respondents were asked to identify which unit or units, if any, hold the authority or provide the service. Respondents were allowed to select all that apply where authority or service provision was shared or delegated, and the responses indicate that authority for setting standards and responsibility for providing services are in fact often shared across multiple units. The original response categories were therefore modified before being used to summarize the results. In general, the modified categories map to the original categories provided to respondents, but they also allow findings about shared authority and responsibility to be highlighted.

With respect to authority, the modified categories are: authority not assigned; the library (L) has authority alone; L shares authority with archives (A), records management (RM), or both; A, RM, or both have authority alone; L, A, or RM share authority with another unit (Other); and only Other has authority. With respect to services, the parallel categories are: service not provided; L provides the service alone; L provides the service with A, RM, or both; A, RM, or both provide the service alone; L, A, or RM provide the service with Other; and only Other provides the service. Please note that, as a consequence of these modifications, percentages across the rows of Tables 15-20 do not add up to 100.
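To make the recoding concrete, the sketch below shows one way a respondent's multi-select answer (the set of units checked for a given standard or service) could be collapsed into the modified summary categories used in Tables 15-20. This is an illustrative reconstruction, not code from the project; the function name and category labels are assumptions.

```python
# Illustrative sketch (not from the report): collapse a multi-select response
# into the modified summary categories used in Tables 15-20.
# "L" = library, "A" = archives, "RM" = records management, "Other" = any other unit.

def summarize_units(selected: set[str]) -> str:
    """Map the set of units a respondent checked to a single summary category."""
    larm = {"L", "A", "RM"}
    has_larm = bool(selected & larm)
    has_other = "Other" in selected

    if not selected:
        return "Not assigned / not provided"
    if has_larm and has_other:
        return "L, A, or RM share with Other"
    if has_other:
        return "Only Other"
    # From here on, only LARM units were selected.
    if selected == {"L"}:
        return "L alone"
    if "L" in selected:
        return "L shares with A, RM, or both"
    return "A, RM, or both alone"


# Example: archives and records management sharing retention authority fall
# into the "A, RM, or both alone" column of Tables 15-17.
print(summarize_units({"A", "RM"}))     # -> A, RM, or both alone
print(summarize_units({"L", "Other"}))  # -> L, A, or RM share with Other
```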

Authority for setting standards
Regardless of the branch of government, authority for setting standards is most often assigned to units other than the state library, archives, and records management (LARM) units. There appears to be one exception, concerning standards for information retention and disposal for the various series/types of digital records and publications of executive agencies. In that case, authority for setting retention and disposal standards is assigned to the archives and records management units and is, in some states, shared among those units and the library.

Table 15.
Authority for setting standards for digital information created or maintained by executive agencies

Standard | Authority not assigned | L has authority alone | L shares authority with A, RM, or both | A, RM, or both have authority alone | L, A, or RM share authority with Other | Only Other has authority
Setting data management standards and/or guidelines for information creation (e.g., metadata, file formats) | 11% (4) | 3% (1) | 5% (2) | 11% (4) | 34% (13) | 37% (14)
Setting information technology standards and/or guidelines for information creation (e.g., state-approved software applications) | 13% (5) | 0% (0) | 0% (0) | 5% (2) | 16% (6) | 66% (25)
Setting standards for information retention and disposal (e.g., retention periods and methods of disposal) for various series/types of digital records and publications | 5% (2) | 0% (0) | 21% (8) | 39% (15) | 26% (10) | 8% (3)

Table 16.
Authority for setting standards for digital information created or maintained by legislative agencies

Standard | Authority not assigned | L has authority alone | L shares authority with A, RM, or both | A, RM, or both have authority alone | L, A, or RM share authority with Other | Only Other has authority
Setting data management standards and/or guidelines for information creation (e.g., metadata, file formats) | 29% (11) | 3% (1) | 3% (1) | 8% (3) | 13% (5) | 39% (15)
Setting information technology standards and/or guidelines for information creation (e.g., state-approved software applications) | 26% (10) | 0% (0) | 0% (0) | 3% (1) | 3% (1) | 63% (24)
Setting standards for information retention and disposal (e.g., retention periods and methods of disposal) for various series/types of digital records and publications | 21% (8) | 0% (0) | 8% (3) | 26% (10) | 18% (7) | 24% (9)

Table 17.
Authority for setting standards for digital information created or maintained by judicial agencies

Standard | Authority not assigned | L has authority alone | L shares authority with A, RM, or both | A, RM, or both have authority alone | L, A, or RM share authority with Other | Only Other has authority
Setting data management standards and/or guidelines for information creation (e.g., metadata, file formats) | 16% (6) | 3% (1) | 0% (0) | 8% (3) | 11% (4) | 61% (23)
Setting information technology standards and/or guidelines for information creation (e.g., state-approved software applications) | 13% (5) | 0% (0) | 0% (0) | 0% (0) | 5% (2) | 82% (31)
Setting standards for information retention and disposal (e.g., retention periods and methods of disposal) for various series/types of digital records and publications | 13% (5) | 0% (0) | 11% (4) | 18% (7) | 18% (7) | 39% (15)

Responsibility for providing services
Responsibility for providing services to executive agencies is most often held by state LARM units. For judicial agencies, however, this responsibility most often falls outside the state's LARM units. The one exception is in the area of digital information access: consultation and training in this area are provided to judicial agencies by various combinations of state LARM units.

Support for legislative agencies is more mixed. Four of the services, namely consultation and training on digital information creation, management, and preservation, as well as preservation itself (e.g., migration, reformatting), are most often the responsibility of a combination of the A and RM units, sometimes in conjunction with the L unit. Three of the services, namely storage for digital information, access (e.g., search engine), and certification (e.g., trustworthiness of system, backups sufficient), most often reside with units other than LARM.

Table 18.
Services provided to executive agencies

Service | Service not provided | L provides service alone | L provides service with A, RM, or both | A, RM, or both provide service alone | L, A, or RM provide service with Other | Only Other provides service
Storage for digital information | 11% (4) | 8% (3) | 11% (4) | 8% (3) | 34% (13) | 29% (11)
Consultation and training services on digital information creation | 21% (8) | 5% (2) | 8% (3) | 18% (7) | 32% (12) | 16% (6)
Consultation and training services on digital information management | 13% (5) | 5% (2) | 3% (1) | 42% (16) | 32% (12) | 5% (2)
Consultation and training services on digital information preservation | 18% (7) | 3% (1) | 21% (8) | 45% (17) | 11% (4) | 3% (1)
Consultation and training services on digital information access | 26% (10) | 13% (5) | 11% (4) | 18% (7) | 18% (7) | 16% (6)
Preservation (e.g., migration, reformatting) | 18% (7) | 11% (4) | 11% (4) | 24% (9) | 26% (10) | 11% (4)
Access (e.g., search engine) | 13% (5) | 21% (8) | 8% (3) | 11% (4) | 26% (10) | 21% (8)
Certification (e.g., trustworthiness of system, backups sufficient) | 34% (13) | 5% (2) | 0% (0) | 5% (2) | 16% (6) | 37% (14)

Table 19.
Services provided to legislative agencies

Service | Service not provided | L provides service alone | L provides service with A, RM, or both | A, RM, or both provide service alone | L, A, or RM provide service with Other | Only Other provides service
Storage for digital information | 13% (5) | 3% (1) | 13% (5) | 5% (2) | 32% (12) | 34% (13)
Consultation and training services on digital information creation | 26% (10) | 3% (1) | 5% (2) | 18% (7) | 18% (7) | 29% (11)
Consultation and training services on digital information management | 24% (9) | 3% (1) | 3% (1) | 24% (9) | 21% (8) | 26% (10)
Consultation and training services on digital information preservation | 26% (10) | 3% (1) | 5% (2) | 37% (14) | 11% (4) | 16% (6)
Consultation and training services on digital information access | 32% (12) | 3% (1) | 3% (1) | 21% (8) | 13% (5) | 29% (11)
Preservation (e.g., migration, reformatting) | 18% (7) | 3% (1) | 11% (4) | 21% (8) | 18% (7) | 29% (11)
Access (e.g., search engine) | 18% (7) | 13% (5) | 11% (4) | 3% (1) | 18% (7) | 37% (14)
Certification (e.g., trustworthiness of system, backups sufficient) | 37% (14) | 0% (0) | 0% (0) | 5% (2) | 5% (2) | 50% (19)

Table 20.
Services provided to judicial agencies

Service | Service not provided | L provides service alone | L provides service with A, RM, or both | A, RM, or both provide service alone | L, A, or RM provide service with Other | Only Other provides service
Storage for digital information | 13% (5) | 3% (1) | 5% (2) | 11% (4) | 24% (9) | 45% (17)
Consultation and training services on digital information creation | 18% (7) | 3% (1) | 0% (0) | 21% (8) | 16% (6) | 42% (16)
Consultation and training services on digital information management | 16% (6) | 3% (1) | 3% (1) | 24% (9) | 16% (6) | 39% (15)
Consultation and training services on digital information preservation | 26% (10) | 3% (1) | 0% (0) | 26% (10) | 16% (6) | 29% (11)
Consultation and training services on digital information access | 21% (8) | 3% (1) | 0% (0) | 16% (6) | 21% (8) | 39% (15)
Preservation (e.g., migration, reformatting) | 39% (15) | 0% (0) | 5% (2) | 13% (5) | 16% (6) | 26% (10)
Access (e.g., search engine) | 29% (11) | 5% (2) | 8% (3) | 5% (2) | 18% (7) | 34% (13)
Certification (e.g., trustworthiness of system, backups sufficient) | 37% (14) | 0% (0) | 0% (0) | 3% (1) | 5% (2) | 53% (20)

The following additional conclusions can be drawn from the data:

Section 3. State government digital information preservation activities

One of the best ways to boost digital preservation capabilities, according to the Library of Congress States Workshop participants, is to learn from practical examples of successful digital preservation projects. Section 3 of the survey was designed to identify these practical examples by capturing descriptions of up to five state government digital information preservation activities and the parties involved from each state. Respondents were asked to characterize each activity from among a pre-determined list of activity types and to provide a narrative description of each.

Of the 67 responses, 54 (81%) included at least one example of a recent or ongoing digital preservation activity. The most common types of digital information preservation activities mentioned were the preservation of digital publications of state governments and the harvesting of agency Web sites. Other reported activities include the digitization of paper records, development of digital repositories, development of search engines and Web portals, preservation of historical records, and development of guidelines and metadata standards. Table 21 provides a full list of the types of activities reported.

Table 21.
State government digital information preservation activities

Activity | Times mentioned
Preservation of digital publications | 18
Harvesting of state agency Web sites | 16
Digitization of paper records | 11
Development of digital repositories | 9
Search engine / Web portal | 7
Preservation of historical records | 7
Development of guidelines | 5
Development of metadata standards | 4
Preservation of e-mail records | 4
Research | 4
Preservation of geospatial records | 3
Migration | 3
Development of models | 2
Training | 2

In many cases, the activity descriptions highlight how partnerships among LARM units, IT, and other agencies were established within states as a mechanism for carrying out digital preservation activities. Overall, however, few examples of collaboration across states were reported.

The activity descriptions also provide some insight into funding for digital information preservation activities. In some cases, in-state support came by way of specific state legislation; in other cases, states reported receiving funding from the National Historical Publications and Records Commission (NHPRC) and the Institute of Museum and Library Services (IMLS) to launch their efforts.

The activity descriptions also include information about specific standards, strategies, and software in use in the states. For example, several respondents reported adopting the TIFF format for their preservation activities. About a dozen respondents reported subscribing to the Online Computer Library Center (OCLC) Digital Archives and Web Archives Workbench services to capture electronic publications and agency Web sites. Other third-party software and services reported as in use in state digital information preservation activities include ContentDM, developed by DiMeMa, Inc.; LOCKSS, by Stanford University; Archive-It, by the Internet Archive; CEP, by the University of Illinois at Urbana-Champaign; and DigiTool, by Ex Libris.
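As a concrete illustration of one of these practices, the sketch below shows one minimal way a scanned page might be converted to a TIFF preservation master, assuming the Pillow imaging library. The file names and the checksum step are illustrative assumptions, not practices described by the survey respondents.

```python
# Illustrative sketch only: one common interpretation of "adopting TIFF" for
# preservation masters is converting scanned images to TIFF and recording a
# fixity checksum. File names and the checksum step are assumptions.
import hashlib
from pathlib import Path

from PIL import Image  # Pillow

def make_preservation_master(source: Path, dest_dir: Path) -> tuple[Path, str]:
    """Convert an image to a TIFF master and return its path and SHA-256 digest."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    master = dest_dir / (source.stem + ".tif")

    with Image.open(source) as img:
        img.save(master, format="TIFF")  # uncompressed TIFF by default

    digest = hashlib.sha256(master.read_bytes()).hexdigest()
    return master, digest

if __name__ == "__main__":
    # Hypothetical input: a scanned page of a state publication.
    path, sha = make_preservation_master(Path("scanned_page.jpg"), Path("masters"))
    print(path, sha)
```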

Section 4. Training needs for digital preservation related activities

Preserving information in digital form requires a new set of individual as well as organizational capabilities. LARM representatives at the Library of Congress States Workshops recognized this need as they expanded their discussions beyond organizational capabilities to individual capabilities. In general, participants identified a lack of knowledge about and support for the necessary training. A focus on training needs in the survey was agreed upon as the best way to gauge these capabilities. Section 4 of the survey was designed to support these efforts. Respondents were asked to indicate the level of training (i.e., basic or advanced) needed to build the capabilities necessary for a successful digital preservation program in their state (see Table 22). To see the training needs of individual respondents, see the “Training Needs for Digital Preservation Capabilities” tables in Appendix E.

Table 22 shows the level of training reported as necessary for each of the 12 capabilities. (Respondents were asked to select only one level of training for each capability.) Of note, a Basic level of training was the most frequently selected response for all 12 capabilities. Across the 12 capabilities, Basic training was indicated most often for “negotiation with key stakeholders” (33 responses, 55%) and “development of mechanisms to monitor the long-term usability of information” (33, 54%). Advanced training was indicated most often for “management of long-term storage of digital information in a repository” (27, 44%) and “management of digital information (metadata, reformatting, etc.)” (24, 40%). The capabilities for which training has most often already been provided are “identification of key stakeholders related to specific digital information” (21, 34%) and “selection and appraisal of digital information” (15, 24%).

Overall, however, respondents indicated a general lack of the capabilities and skills critical to digital information preservation.

Respondents also identified additional digital preservation related capabilities, not included in the table, for which they either already had training or needed training. Attending conferences and workshops held by professional associations and academic institutions, as well as training provided by other external sources such as OCLC, were also mentioned as ways individuals were acquiring the necessary training.

Of note, several respondents indicated that a lack of training was not their main barrier to undertaking digital preservation related activities; rather, it was the more fundamental lack of personnel and funding.

Finally, and of particular interest, while a Basic level of training was needed most for all of the capabilities, at least five respondents indicated, for each capability, that training had already been provided. Moreover, four respondents indicated that training had already been provided for all 12 capabilities, and nine respondents indicated that training had already been provided for at least seven of the 12 capabilities.

Table 22.
Training needs for digital preservation related capabilities

Capability | Training already provided | Basic training needed | Advanced training needed
Identify the type and amount of digital information throughout the state | 20% (12) | 49% (30) | 31% (19)
Select and appraise state government information in digital form | 24% (15) | 47% (29) | 29% (18)
Identify key stakeholders related to specific digital information (other local/state agencies, other states, private sector, etc.) | 34% (21) | 43% (26) | 23% (14)
Negotiate and make agreements with key stakeholders to preserve digital information | 22% (13) | 55% (33) | 23% (14)
Acquire state government information in digital form for holdings | 22% (13) | 43% (26) | 35% (21)
Manage state government information in digital form (metadata, reformatting, etc.) | 17% (10) | 43% (26) | 40% (24)
Manage the ingest of digital information into a repository | 21% (13) | 48% (30) | 31% (19)
Manage the long-term storage of digital information in a repository | 8% (5) | 48% (30) | 44% (27)
Develop mechanisms to monitor the long-term usability of state government information in digital form | 11% (7) | 54% (33) | 34% (21)
Make state government information in digital form accessible to users | 23% (14) | 46% (28) | 31% (19)
Produce disaster and recovery planning for state government information in digital form | 18% (11) | 48% (30) | 34% (21)
Manage copyright, security, and other legal issues of relevance to state government digital information | 16% (10) | 48% (30) | 35% (22)


Section 5. State government digital information currently at-risk

Participants at the Library of Congress States Workshop noted a lack of information about how much digital information is actually at risk of being lost. This gap was identified as part of the barrier to making the case for the investments necessary to build digital preservation capability. Section 5 of the survey was designed to fill this gap in knowledge about information considered by respondents to be at risk of being lost. Respondents were asked to identify up to five examples of state government digital information at risk of deteriorating or being altered or lost. They were also asked to describe the conditions causing the information to be at risk and any strategies being considered to preserve such information. See Tables 23 and 24 for the types of at-risk information mentioned and the conditions causing them to be at risk. In addition, respondents were asked to identify digital information previously lost or no longer accessible. See Table 25 for a list of the digital information provided in response.

At-risk information
The two types of at-risk information most frequently mentioned are e-mails and the Web sites of state agencies. Several respondents stated that the electronic correspondence of elected officials, digital publications, information stored in databases, and court records were at risk as well. Other at-risk information included legislative proceedings, electronic filings, GIS (Geographic Information System) records, digital video files, born-digital records without print copies, and data in obsolete format (e.g., 5 ¼ inch floppy disks, magnetic tapes).

Respondents also noted a challenge in responding to this section of the survey due to a lack of knowledge about the kinds of information currently at-risk in their states.

Table 23.
At-risk state government digital information

At-risk information | Number of times mentioned
e-mails of state agencies | 15
Web sites of state agencies | 11
Electronic correspondence of elected officials | 10
Digital publications | 9
Information in databases | 7
Legislative records (e.g., legislative proceedings, electronic legislative bill files) | 5
Court records | 5
Data in obsolete format (e.g., 5 ¼ inch floppy disks, magnetic tapes) | 4
Born-digital records without print copies | 4
GIS files | 4
Digital video files | 4
Personnel records | 3
Electronic filings | 2
Electronic newsletters | 2

Lack of funding, lack of awareness of the importance of digital preservation, and lack of standards were identified as the most important conditions causing information to be at risk. Other frequently mentioned contributing conditions include the lack of individual agency efforts to preserve electronic records (especially the information agencies post on their Web sites), the lack of centralized authority, the obsolescence of technology, the lack of a statewide plan or policy, and the lack of adequate statutory provisions or requirements (see Table 24 for details).

Table 24.
Conditions causing information to be at risk

Condition | Number of times mentioned
Lack of funding | 13
Lack of awareness | 12
Lack of standards | 12
Lack of individual agencies’ efforts to preserve their electronic records | 11
Lack of centralized authority | 8
Obsolescence of technology | 7
Lack of statewide plan/policy | 7
Lack of statutory provisions | 7
Lack of staff | 6
Lack of long-term planning | 4
High cost of preservation | 4
Use of proprietary format and software | 4
Lack of skill | 3
No steps taken to preserve information | 3
Large volume of records | 3
Frequent content changes | 3
Lack of training/education | 2
Political changes/turnover | 2
Lack of knowledge | 2
Lack of coordination | 1
Lack of leadership | 1

Several respondents mentioned problems with agency Web sites in particular, related to the practice of publishing official reports and records on those sites without any plan for capturing and preserving content of long-term or enduring value. Respondents further indicated that, in many cases, agency Web sites are maintained in a decentralized manner without uniform standards or guidelines. As a result, agencies have varying retention policies and often alter or remove items. The large volume of Web documents and frequent content changes also make capturing Web sites difficult, even in those states where policies and plans do exist.
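To illustrate the capture challenge in concrete terms, the following is a minimal, hypothetical sketch (not a practice reported by respondents) of fetching an agency page and using a content hash to detect whether it has changed since the last capture, the simplest signal that a new snapshot is needed. The URL and file layout are assumptions.

```python
# Hypothetical sketch: fetch an agency Web page, compare its hash with the
# previous capture, and save a dated copy only when the content has changed.
import hashlib
import urllib.request
from datetime import date
from pathlib import Path

def capture_if_changed(url: str, archive_dir: Path) -> bool:
    """Return True if a new dated snapshot was written."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    content = urllib.request.urlopen(url, timeout=30).read()
    digest = hashlib.sha256(content).hexdigest()

    hash_file = archive_dir / "last_capture.sha256"
    if hash_file.exists() and hash_file.read_text().strip() == digest:
        return False  # page unchanged since the last capture

    snapshot = archive_dir / f"{date.today().isoformat()}.html"
    snapshot.write_bytes(content)
    hash_file.write_text(digest)
    return True

if __name__ == "__main__":
    # Hypothetical agency URL and local archive directory.
    changed = capture_if_changed("https://example.gov/reports", Path("captures/example-agency"))
    print("new snapshot written" if changed else "no change detected")
```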

Concerns about information in digital video format were also noted. The use of digital video as the new official transcript or minutes of proceedings is increasing; however, standards for digital video formats are not fully established, making the preservation of video files difficult.

The findings indicate that a majority of states have no current strategy for the preservation of at-risk digital information. In some instances, strategies are under development. Respondents did, however, identify several general strategies already in place:

Information no longer accessible
As shown in Table 25, agency Web sites are the type of digital information most frequently reported as lost. Other digital information not preserved and no longer accessible includes state government electronic publications, information stored on obsolete media (e.g., 5 ¼ inch floppy disks, magnetic tapes), and e-mails. One state reported that the back issues of about 50% of a sample of 165 online serial titles are no longer available.

Some respondents described cases where information was lost due to changes in administration. In one case, after the inauguration of a new governor, all server drives were erased and files from the former administration were replaced with content from the new administration. One state reported the loss of two years' worth of its governor's correspondence during the conversion to a new storage system.

Table 25.
Types of state government digital information already lost

Lost information | Number of times mentioned
State agency Web sites | 10
Digital publications | 7
Data in obsolete format (e.g., 5 ¼ inch floppy disks, magnetic tapes) | 6
e-mails | 4
Do not know | 5

Section 6. Enterprise Architecture

According to a National Association of State Chief Information Officers (NASCIO) report published in October 2005, over 95% of the responding states have embraced Enterprise Architecture (EA) as a framework for systematically determining needs and demands and reshaping “government processes, organization, and supporting management systems.”7 During the Library of Congress States Workshops, which included 20 representatives of state IT organizations, there was agreement that EA efforts offer a largely untapped opportunity for LARM units to partner with information technology organizations and others in support of digital preservation. Section 6 of the survey was therefore designed to gauge LARM professionals’ awareness of and involvement in their state’s EA efforts.

Overall, respondents appear to be aware of their state’s EA efforts (66.7%), while only 37.1% reported any involvement in those efforts (see Figure 1).

Figure 1. Awareness of and Involvement in State Enterprise Architecture (EA)

The nature of involvement in EA efforts varies. Many respondents indicated active participation in EA committees and working groups. The majority of specific EA activities in which the respondents were involved focused on standards and policy development. The roles respondents are filling in these committees include influencing elements of the architecture that pertain directly to recordkeeping issues; determining how best to preserve the long-term records needed by the state; and helping to develop the data and electronic records domains of the EA. Several of the respondents indicated a more involved role in their state’s EA efforts including activities such as server and e-mail consolidation; redesign of state portals; content management system testing; and developing a comprehensive statewide strategy for the management of all records created by state agencies.

Several respondents indicated indirect involvement in EA efforts through occasional attendance at EA related meetings, informal or “back-channel” communication with other agencies regarding EA, meeting reporting requirements (e.g., submitting return on investment technology plans to the state CIO every two years), participating in surveys, or simply observing their state’s EA initiatives.

7 Thirty-seven states and the District of Columbia responded to the NASCIO survey. Download a copy of the NASCIO report, The States and Enterprise Architecture: How far have we come? Findings from the NASCIO 2005 EA Assessment (NASCIO, October 2005), at http://www.nascio.org/publications/index.cfm.