Chapter 1. The risks of IT innovation in government

In 1995, the Standish Group began publishing reports on the IT failure rates of both public and private organizations in the United States. These reports suggested that more than 80 percent of systems development projects fail in whole or in part. Most projects cost more and take longer than planned, and fail to achieve all of their goals. One-third are canceled before they are completed.

One such cancellation took place at the State of California Department of Motor Vehicles (DMV) in the mid-1990s. The project, which aimed to move nearly 70 million vehicle, license, and identification records from an antiquated system to a new relational database, was both behind schedule and over budget.

When California's lawmakers finally decided to end the agency's IT project, over $44 million had already been spent and no end was in sight. One of the reasons the DMV project failed, says California Assemblyman Phillip Isenberg, is "because the agency staff were over their heads with a technology they did not understand."

The project also lacked a clear link between agency operational goals and the capabilities of the selected technology. Due to procurement restrictions, the agency was committed to a specific hardware platform before all the available options could be explored. As a result of the failure, California's technology procurement process now faces even greater control and oversight. Despite these problems, California has an annual IT budget of well over $1 billion, and more big projects are on the horizon there, as they are everywhere in the world.

In fact, government constitutes one of the world's largest consumers of information technology. Because of its size, complexity, and pervasive programs and services, government cannot operate effectively without using advanced information technologies. However, as the California DMV failure amply demonstrates, the risks of IT innovation in government are daunting.

Years of research on information system success and failure have been unable to conclusively identify the factors that cause good or bad results. Information technology success and failure seem to be in the eye of the beholder.

We've spoken to public managers who consider a project a success if it comes in on time and on budget. Others, who evaluate functionality and usability, might call the same project a failure. Many see failure when, regardless of time and budget, a new system makes it more difficult to do routine and familiar tasks: they have the latest technology but can't get their work done as well as they did before. We've heard about systems that perform beautifully but can't be supported by in-house staff, and therefore continue to generate high consulting costs for maintenance.

Failure may take the form of a desirable statewide system that local governments can't use because they lack the expertise and technical infrastructure to connect to it. Failure has also been described as an on-time, on-budget system with great user interfaces and functionality that users refuse to work with because they don't trust the underlying data sources.

How do you protect against something you can't define? We advocate an approach that builds knowledge and understanding through careful analysis of the goals, the larger environment, the specific situation, the likely risks, and the reasonable alternatives. That kind of thinking will help you raise useful questions, engage partners, challenge old models, garner support, assess policies, identify risks, and consider contingencies, leading to more successful innovation.