NYS agencies go online
The evolution of agency goals
The project proceeded in a series of workshops where the participating agencies worked together and individually on service definition, development, refinement, and evaluation. The project methodology was designed to guide them through a process of aligning Internet technologies with the programmatic objectives of their organizations. The result was a multi-faceted analysis of each proposed project and, in some cases, major rethinking of the original proposal and its feasibility given organizational and programmatic realities. A brief description of each of the tools used is presented in Table 2. A more detailed description can be found in Making Smart IT Choices on the CTG Web site.
Over the course of the project, most of the agencies revised their service goals and changed the membership of their development teams. Virtually all of these adjustments were made in response to new insights gained during the workshops and the subsequent development work taking place in each agency. Six of the seven agency teams completed prototype Web sites (their home pages are presented in Appendix C), and all participated in the evaluation phases of the project, which focused on barriers encountered and lessons learned.
Each of the seven agencies was engaged in a project that involved an innovative approach to delivering service. The tool set, selected from among the tools used at the Center to support sound IT decisions, was chosen to provide full coverage of management, policy, and technology considerations. Table 3 shows how the tools contributed to an integrated framework for innovation.
The agency participants represented a wide range of technology, management, and policy skills and backgrounds. We therefore expected that the tools would stimulate different actions and insights in each agency. To explore this expectation, a survey and group interviews were conducted at the end of the project. The survey was designed to identify the overall contribution of each tool to the progress of the project and the insights gained by the project teams as a result of using each tool. The interviews, conducted as a group with each project team, were designed to assess the value of each tool in helping the participants identify and overcome barriers.
All of the tools were highly rated by the agency teams. As Figure 2 shows, every tool was rated 4.64 or higher on a seven-point scale. In general, however, these results suggest that different types of tools were useful for different reasons. Prototyping and presenting the site appeared to make the greatest overall contribution to project progress and insight. The next group of tools (the stakeholder analysis, the strategic framework, best practices, and the information structure) received similar scores and together helped agencies establish a vision and develop a plan of action. The organizational issues questionnaire, cost and performance factors, and technology awareness events make up the third group, which participants identified as having relatively less impact on their projects. Overall, prototyping and the set of planning tools provided consistent value across all the teams. The remaining tools varied in their value, which we attribute to differences in their applicability to specific agency environments.
Prototyping
“Prototyping is the only way to develop a Web site.” All of the project teams supported this statement made by a member of the OASAS team. Prototyping and presenting sites to gain shared understanding, invite critical review, and generate support proved highly valuable to the six project agencies that created a Web site. DHCR, for example, used prototyping extensively and found it to be a very effective way to introduce agency management to the Internet and to win initial approval for the project. “We found prototyping to be the best way to generate enthusiasm within the program areas for the project and to give program liaisons a concrete idea of what it was we were trying to accomplish,” was how one DHCR team member described their use of prototyping.
Unfortunately, the Hamilton County team did not have the opportunity to use prototyping to paint such a picture. Unable to produce a prototype to present to the Hamilton County Gateway Committee, the team lacked a tangible focus for the committee’s attention and saw this as limiting its ability to get critical support for, and feedback on, the project.
The commitment to a public date for a presentation was identified by several of the agencies as a primary motivating factor in the development of the prototype sites. “Deadlines make things get done” was how one agency put it; another stated that the deadline “forced them to stay focused.”
Producing the prototype for public presentation led the teams to different approaches for maintaining focus on development. Three agencies adopted the use of a “quiet room” for their teams. The “quiet room” allowed staff to remove themselves from their regular work location and concentrate on the prototype. A second approach, considered less desirable but also effective in maintaining focus, was to invest personal time in developing the necessary skills. Several agency staff stated that it was not unusual for personal resources to be invested in project start-up activities, but that this was more necessary in this project than in previous ones. In some cases, agency participants invested their own dollars in reference materials and software to support the development process.
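To give a concrete sense of what these early prototypes involved, the sketch below shows the kind of minimal, static HTML page a team might have circulated for internal review. It is an illustration only; the agency name, menu items, and e-mail address are hypothetical placeholders rather than content from any of the participating sites.

    <!-- Hypothetical prototype home page: static HTML only, built quickly to
         give managers and program liaisons something concrete to react to. -->
    <html>
    <head>
      <title>Example Agency - Prototype Home Page (DRAFT)</title>
    </head>
    <body>
      <h1>Example Agency</h1>
      <p><em>Draft prototype for internal review; content subject to change.</em></p>
      <ul>
        <li><a href="programs.html">Program descriptions</a></li>
        <li><a href="forms.html">Commonly requested forms</a></li>
        <li><a href="contacts.html">Regional office contacts</a></li>
        <li><a href="faq.html">Frequently asked questions</a></li>
      </ul>
      <p>Comments on this draft may be sent to
         <a href="mailto:webteam@example.org">webteam@example.org</a>
         (placeholder address).</p>
    </body>
    </html>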
Stakeholder analysis, strategic framework, best practices, & information structure
Several teams indicated that they worked iteratively among these four tools. For example, the stakeholder analysis provided a list of customers to be served by the information structure, and the strategic framework identified an approach or innovation that had to be investigated using best practices. Most teams reported that they recognized the value of revisiting these four tools on a regular basis, both as a Web site team and with their various managers. However, they also identified limitations in their ability to do so. Those agencies operating under a top management edict to “get us on the Internet” found it particularly difficult to get managers to pay attention to the results of these planning and design tools, let alone participate in their use. One team, however, used the stakeholder analysis to respond with focus to this kind of directive, narrowing the project to a reasonable size by forcing the “for whom” and “for what purpose” questions. They consistently maintained that specific answers to specific questions were required before they could produce meaningful results.
These tools spoke primarily to the management and policy issues faced by the project teams: Who is being served? Who is responsible for the affected service area? What content must be deployed to provide the service? Through the use of these tools, a number of the teams identified early on that they did not have the appropriate participants on their project teams. ORPS, for example, returned to the agency after one workshop and brought program staff onto the team. DMNA added a systems person to their team. A number of the agencies added the public information office to the team.
Several of the agencies used the information structure to come to grips with the content issues associated with their Web sites. DHCR used it to identify overlapping information areas; at OASAS it helped ensure the Web site was conceptually complete. Using this tool allowed everyone involved to “see” the whole and helped them work as a team.
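As a purely illustrative sketch, an information structure can be as simple as an outline of content areas and the offices that own them, which later maps directly onto a site’s navigation. The nested HTML list below shows one hypothetical way to capture such an outline; the content areas and office names are invented, not drawn from DHCR or OASAS materials.

    <!-- Hypothetical information structure captured as a nested HTML list.
         Each top-level item names a content area and, in parentheses, the
         office assumed to own it, which makes overlaps easy to spot. -->
    <ul>
      <li>About the agency (Public Information Office)
        <ul>
          <li>Mission and organization</li>
          <li>Press releases</li>
        </ul>
      </li>
      <li>Programs and services (Program Offices)
        <ul>
          <li>Eligibility and how to apply</li>
          <li>Forms and publications</li>
        </ul>
      </li>
      <li>Data and reports (Research Unit)</li>
      <li>Contacts and regional offices (Administration)</li>
    </ul>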
Organizational issues questionnaire, cost and performance factors, and technology awareness activities
Feedback on the value of these tools reflected the varying environments in which the agency teams worked. GTSC found the organizational issues questionnaire to be primarily a reporting exercise because they had already addressed the roles and responsibilities related to their site. However, they indicated that the small size of their organization might explain why they had addressed these issues sooner than others did. They further stated that the questionnaire “enlightened” them to the fact that the success of the project rested on a small number of people.
ORPS, in the midst of creating a new group to address Internet issues, found the questionnaire “extremely helpful” in that it defined issues that needed to be considered. One agency did not use it at all, adopting instead a “build it and they will come” approach. They felt it was most important for their management to support the service approach and, having accomplished that, to have management designate organizational roles and responsibilities.
The cost and performance factors workshop was designed to develop a cost/benefit analysis of Web-based service delivery. Several of the agencies explained their lower rating of the value of this tool as the consequence of having been given marching orders to “get us on the Internet.” The nature of these projects—developing prototype applications with emerging technologies—generated little demand from management for comprehensive cost/benefit analysis. However, most of the agencies expected that the maturation of the prototypes into fully functional service delivery mechanisms would depend on their ability to use this or a similar tool to present costs and benefits in support of further resource allocations.
The technology awareness events were also of varying value to participants. The DHCR team, for example, was made up of MIS staff who had been experimenting with HTML. The online tutorial in HTML was of limited value to them; however, they found the more advanced technology awareness activities very valuable in helping them establish a technology vision for their Web site. One agency felt the events “educated everyone as to the usefulness of the software.” “Who do we mean when we say ‘everyone’?” was one question posed when planning technology awareness events. Involving management in high-level technology awareness events was suggested by one agency as a way to have managers share in establishing a technology vision in addition to a program vision. However, this can be a two-edged sword. One agency reported that their management was disappointed in the simplicity of the prototype produced; the managers had expected the agency team to develop a site on the order of CNN.COM or MSNBC.COM.
We found that the value of the technology awareness events to an agency had little to do with the agency’s size. Both large and small agencies found some value in even the most basic technology presentations.