Best and current practice research involves learning both what works and what doesn't, based on the relevant experience of others. Often, you may find that your business problem has already been dealt with, in whole or in part, by other government agencies, private or non-profit organizations, or academic researchers. Identifying and evaluating these solutions are important early steps in your business case development. An abundance of information and expertise in the IT community, as well as elsewhere in the public sector, can contribute to solving similar problems in other organizations. In particular, there is a great deal to learn from those cases where things did not go as well as expected.
A way to find potential solutions. Best and current practice research involves identification and consideration of various approaches to a problem, or the components of a problem, that a project is intended to address.
A method for learning from others' success and failure. Such research may take different forms, but the ultimate goal is to learn from the experience of others so you can avoid "reinventing the wheel" or replicating the mistakes of others.
An early project task. This research should be conducted during the business case development process and continued over the life of the project.
Understanding the problem. By finding out how other organizations tackled a similar problem, you can develop a better understanding of your own problem from multiple and varied perspectives. Learn how the problems were addressed, what the success factors were, and the advantages and disadvantages of each solution.
Finding potential solutions that have already been tried. You can identify individuals and organizations that have solved, or tried to solve, problems similar to yours. You can learn from their experiences and gain feedback on your proposed and ongoing project activities.
Identifying methods and resources. Use this tool to identify methods and mechanisms for evaluating a range of solutions, from the IT solutions used by different organizations to how departments are structured to deal with problems, to the strategies used for organizing partnerships across agencies. In addition, current practice research is an effective way of identifying sources of relevant policy, process, and technical expertise and technology. Explore the range and variety of technical solutions.
Investigating other resources. Universities and professional organizations have an array of conferences, journals, books, and published studies that may help you identify relevant current and best practices.
Classifying all parts of the problem. By identifying all relevant components of a problem, you can avoid the trap of "treating the symptoms" of the problem instead of the problem itself.
Assumptions about others' work. When gathering data about other organizations' solutions, you must make assumptions as to the appropriateness or relevance of their experiences to the problem you're facing.
Reliance on published data and people's memories. In order to get information about current and best practices, you must rely on published reports and the recollections of people involved in those projects. This may limit the kind and amount of information you are able to acquire.
Reluctance to discuss failures. Organizations and individuals are more likely to share stories about their successes than their failures. But both kinds of stories can provide valuable information.
Defining "best" is subjective and anecdotal. What is best for one organization may not be best for another. Defining best practices in any field can help direct discussions about alternatives, but be careful about assuming anything is "best" for everyone, everywhere. Be selective and rigorous in judging whether you should adopt a practice from another organization.
Center for Technology in Government. (July 2000) Using the Internet to Find Current and Best Practices. Albany: Center for Technology in Government. Available online at http://www.ctg.albany.edu/publications/didyouknow/dyk_2000_jul/dyk_2000_jul.pdf
Cortada, J. (1997) Best practices in information technology: How corporations get the most value from exploiting their digital investments. Upper Saddle River, NJ: Prentice Hall.
Eglene, O. (2001) Conducting Best & Current Practices Research: A Starter Kit. Albany: Center for Technology in Government. Available online at http://www.ctg.albany.edu/publications/guides/conducting_best/conducting_best.pdf
Rocheleau, B. (2000) "Prescriptions for Public-Sector Information Management: A Review, Analysis, and Critique." American Review of Public Administration 30 (4) 414-35.
Technology awareness activities help to identify what technologies make sense to use given a specific problem. These activities are used to educate people about the capabilities of the technology, so they can begin to think creatively about transforming the way the agency operates. Becoming aware of the capabilities of specific technologies helps to inform analyses of alternative solutions and helps narrow investment choices to those that will work best for your organization.
Educational activities focused on technology. These activities include reviewing the relevant literature in both trade and technical journals, visiting trade shows, hearing presentations by organizations with exemplary systems, visiting organizations that have installed similar systems, arranging vendor demonstrations, or developing and demonstrating one or more prototype versions of a proposed system.
Conversations about technology options. Technology awareness sessions presented by experts or experienced users help you find out about different classes of technology, and then more specifically, about certain products within classes of technology.
Learning about new classes of technology. In order to use a technology effectively it is necessary to understand thoroughly the capabilities and limitations of that technology. Technology awareness activities are designed to accomplish that purpose. The amount or kind of educational activities that are needed in a project depends on the size of the gap between the staff's current knowledge and optimal familiarity with the proposed technology.
This is only an introduction. Understanding and adapting to a new technology is often a slow and difficult process for a number of reasons -- reengineering processes commonly requires cultural, organizational, and interorganizational changes on the part of an organization. In addition, there is a difference between "knowing about" something and actually experiencing it. Some of the benefits and limitations of a technology can only be appreciated after years of experience by an organization.
Keeping up is a challenge. Technology changes constantly, making it difficult to stay ahead of the curve. Try to meet this continuous learning challenge by regularly reviewing publications or attending conferences. Your entire organization benefits if someone stays abreast of new developments and emerging tools.
DB2 magazine. http://www.db2mag.com
Federal Computer Week's Technology Reviews. www.fcw.com/techtest.asp
Government Computer News' Product pages. www.gcn.com/prodcentral
Government Technology. Solutions for state and local government in the information age. Includes case studies, best practices, and a government technology library. www.govtech.net
Association for Computing Machinery. info.acm.org
ZDNet. Comprehensive information on technology news, technology reviews, and case studies. www.zdnet.com
In benchmarking, you compare yourself to the best-known example of a process, product or service. This example provides a reference point against which to evaluate your own performance. A narrow benchmarking framework may be used to compare organizations that are similar in mission and basic technology. A wider framework, which can be achieved by looking outside your own domain, can also provide important lessons or improved methods that would be missed by looking only at organizations just like yours.
Identifying and selecting the appropriate benchmarks is a critical part of the process. The news media, professional publications, and competitions are good ways to identify possible benchmark candidates.
A way to compare yourself to the best. Organizations that develop effective innovations and approaches to a particular problem typically publicize them. Most professional organizations and many publications sponsor annual competitions for best practices and noteworthy innovations. There are also databases of benchmark and best practice information for the public sector.
A way to build consensus, support, and partnerships. Selecting a benchmark also requires consensus and support within your organization. In addition, you may have to establish a partnership with the benchmark organization.
A thorough analysis and understanding of business process. For benchmarking specific technology applications, you need a thorough analysis and clear understanding of the business process and/or product to be evaluated. Without it, the lessons or innovations revealed by using the benchmark may be missed or misapplied.
Learning how to improve efficiency and performance. The central benefit of good benchmarking is learning how to improve efficiency and performance. Benchmark organizations achieve their superior performance in innovative, often highly creative ways and offer rich opportunities for learning and gaining new perspectives. These new ideas, perspectives, and techniques can be learned through benchmarking much more efficiently and quickly than through self-study, formal research, or evaluation projects alone.
Taking advantage of another organization's investment. By using another organization as a benchmark, you benefit from its considerable investments in research, testing, training, and experimentation. Use the knowledge you acquire to help avoid mistakes and achieve higher performance.
Information sharing and collaboration. Benchmarking also involves information sharing and potential for collaboration. The process may start an ongoing exchange of performance ideas and innovation among organizations, a relationship that may result in a partnership that provides greater opportunities for performance improvements.
Getting positive publicity and recognition for participants. Successful benchmarking efforts can also lead to public recognition for the participants. The potential for performance gains can be substantial, resulting in opportunities for increased public support and rewards.
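At its simplest, a benchmark comparison reduces to a gap analysis between your own measured performance and the reference point's published figures. The sketch below illustrates the idea; the metric names and all numbers are hypothetical assumptions, not real data.

```python
# Hypothetical sketch: comparing an agency's performance metrics against a
# benchmark organization's published figures. All metric names and values
# here are illustrative assumptions.

benchmark = {  # the best-known example: the reference point
    "avg_days_to_process_permit": 3.0,
    "cost_per_transaction": 1.25,
    "first_call_resolution_rate": 0.90,
}

ours = {  # your organization's current measurements
    "avg_days_to_process_permit": 11.0,
    "cost_per_transaction": 4.10,
    "first_call_resolution_rate": 0.62,
}

def gap_report(ours, benchmark):
    """Return the gap (our value minus the benchmark's) for each shared metric."""
    return {m: round(ours[m] - benchmark[m], 2) for m in benchmark if m in ours}

print(gap_report(ours, benchmark))
```

A positive gap on a cost or time metric, or a negative gap on a quality metric, flags where the benchmark organization's methods are worth studying most closely.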
Once-in-a-lifetime experience. The outstanding performance of the benchmark may be due to special circumstances or factors that you can't replicate.
Can you live up to this standard? The high standards and great success of the benchmark organization can raise unrealistic expectations among your project participants.
Need solid support and good resources. Trying to replicate the success of the benchmark requires political support and consensus within your organization. In addition, you need adequate resources to respond appropriately to the challenges produced by using a benchmark for assessment.
Andersen, B. and P. Pettersen (1996) The Benchmarking Handbook. London: Chapman & Hall.
Bogan, C. and M. English (1994) Benchmarking for Best Practices: Winning Through Innovative Adaptation. New York: McGraw Hill.
Harrington, H. J. and J. S. Harrington (1996) High Performance Benchmarking: 20 Steps to Success. New York: McGraw Hill.
Keehley, P., S. Medlin, S. MacBride, and L. Longmire (1997) Benchmarking for Best Practices in the Public Sector. San Francisco, CA: Jossey-Bass.
Rocheleau, B. (2000) "Prescriptions for Public-Sector Information Management: A Review, Analysis, and Critique." American Review of Public Administration 30 (4) 414-35.
Organizations use environmental scanning to monitor important events in their surrounding environment. It is a way to answer the question, "What's happening in my environment that will affect my future?" Scanning involves identifying the issues and trends that have important implications for the future, then analyzing information about those issues and trends to assess their importance and determine their implications for planning and strategic decision making.
A way to discover emerging trends of strategic importance. Scanning is different from ordinary information gathering in that it is concerned primarily with the future, emerging trends, and issues that have strategic importance for your organization.
A method for gathering information from a variety of sources. It involves gathering information from publications, conferences, personal and organizational networks, experts and scholars, market research, and any other source that appears useful. Organizations may have formal, continuous processes for environmental scanning, with a permanent unit of the organization responsible. Or the effort may be episodic and organized in an ad hoc manner.
Data analysis for planning purposes. Simply gathering the environmental data is insufficient. It is also necessary for you to interpret the data correctly and make it useful for planning and decision making.
Taking advantage of opportunities. Environmental scanning can help you capitalize on emerging opportunities. It can be an important part of strategic planning by helping you shape strategy to better fit emerging conditions.
Anticipating developments to avoid costly mistakes. Scanning can also help you avoid costly mistakes by enabling planners and decision-makers to anticipate change in the environment. This is particularly important in any planning that involves information technology, since the capabilities and costs of IT are evolving at a rapid pace.
Level of resources required. It's hard to judge the appropriate level of resources to devote to environmental scanning. Where environmental conditions are turbulent and full of potentially significant changes, large amounts of resources may be justified.
Interpretation an inexact science. More importantly, the interpretation of trend information and forecasting is an inexact science at best. The farther into the future a scan probes, the more careful you must be with the interpretation.
Abels, E. (2002) "Hot Topics: Environmental Scanning." Bulletin of the American Society for Information Science and Technology 28 (3) 16-17.
Cornell University's Cooperative Extension has a list of resources pertaining to environmental scanning. http://www.cce.cornell.edu/admin/program/documents/scan.htm [Retrieved May 27, 2003]
Choo, C. W. (2001) "Environmental scanning as information seeking and organizational learning." Information Research, 7(1). Choo's article provides a detailed academic view of environmental scanning. http://informationr.net/ir/7-1/paper112.html [Retrieved May 27, 2003]
Mafrica, L. (2003) "From Scan to Plan: How to Apply Environmental Scanning to Your Association's Strategic Planning Process." Association Management 55, 42-47.
When the stakes are high and the uncertainties are great, it pays to build a model of your idea and test it in every way you can. By modeling a process, system, or program before it is designed and implemented, you can think through more clearly how it will affect overall organizational processes and performance. When the idea "flies on paper" in the modeling stage, you can be more confident that it will succeed in real operation. That is why building models and testing them thoroughly before the final design and implementation phase is an effective way to hold down development costs and minimize risks.
Extensions of a formal model of a problem. Review the earlier section of this handbook on "Models of Problems," because solution models share many features with problem models.
Representations of operation. Like the problem model, solution models often represent processes, information flows, decision points, and relationships -- but this time they are improved to solve the problem. The solution model should show how the new process or system will function within the whole organizational context.
Representations of organizational and customer-oriented effects. These representations are just as critical as the ones that show how the proposed IT system itself will function.
Simulating full system operation. Models of solutions help you describe and simulate how a full system will operate within the context of organizational and human factors.
Thinking big. These types of models help you see the implications of a limited prototype when it is expanded to full-scale operations. Managers are forced to think through technical, organizational, and policy issues in designing these models.
Exploring costs and benefits. You can delve into the costs and benefits of proposed solutions by linking the model to financial data.
Asking "what if" questions. A new or revised system or process will have various organizational and human factor effects. By asking "what if" questions, you can anticipate issues and problems before they are encountered in a real world system implementation.
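A "what if" question can often be explored with a very small quantitative model before anything is built. The sketch below is a back-of-the-envelope throughput model for a hypothetical case-processing system; every parameter and number is an assumption for demonstration only.

```python
# Illustrative "what if" solution model: a simple annual-throughput
# calculation for a hypothetical case-processing operation.
# All parameters are assumptions, not real data.

def annual_capacity(staff, cases_per_day, error_rate, workdays=250):
    """Cases completed per year, discounting rework caused by errors."""
    completed = staff * cases_per_day * workdays
    return int(completed * (1 - error_rate))

# Baseline process vs. a proposed system that automates data entry
# (faster per-case handling, fewer errors).
baseline = annual_capacity(staff=10, cases_per_day=6, error_rate=0.10)
proposed = annual_capacity(staff=10, cases_per_day=9, error_rate=0.03)

# "What if" question: how many staff would the old process need
# to match the proposed system's output?
for staff in range(10, 20):
    if annual_capacity(staff, 6, 0.10) >= proposed:
        break

print(baseline, proposed, staff)
```

Even a toy model like this forces explicit assumptions about staffing, speed, and error rates, which can then be challenged and refined before any real-world implementation.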
All depends on the data. Models of solutions are no better than the data and relationships upon which they are built. Your model must be built upon a foundation of solid data and analysis if you want it to accurately forecast the impacts of the new system.
Expense. These models can be very complex, expensive and time-consuming to build. Models of solutions may require specialized expertise, which may be unavailable in your agency.
Wolstenholme, E. (1993). Evaluation of Management Information Systems. Chichester; New York: Wiley.
When building a business case for an IT project or innovation initiative, you will want to consider alternative solutions. Investigating modest, moderate, and elaborate alternatives during the analysis process allows you to identify a range of possible choices. You can then compare the costs and benefits of the alternatives and make more informed decisions.
Levels of choices. Developing modest, moderate, or elaborate alternative solutions can help outline the range of choices available to you. The features, functions, and technology that go into each level depend on your goals and resources.
Modest. This level involves a minimum investment in effort, time, and resources.
Moderate. At this higher level, you may include additional features or options and a wider range of internal and external information sources.
Elaborate. This ultimate level involves advanced features, technologies, or options for the most ambitious project you could undertake to solve your organization's problem.
Identifying features and functionality. Place possible features and functions in the appropriate level. Think about the modest, moderate, and elaborate answers to questions about customers, services, features, information sources, and resources. You will be better able to make decisions when the options are categorized.
Characterizing benefits. Describe the major benefits you expect to result from the implementation of your project. Benefits typically fall into one of three categories: better, cheaper, or faster. Classify the potential benefits as modest, moderate, or elaborate.
Assessing and measuring performance. Once you have outlined project features and benefits appropriate for each level, you need to figure out a way to measure performance. Devise specific, objective methods for measuring the outcomes and results of your project. After a project is operational, these performance goals can be used to evaluate how well the project performed with respect to its specific goals.
Determining the basis for cost estimation. Determining the cost of your project is easier because of your modest, moderate, and elaborate descriptions. Each of these alternatives suggests cost categories and a degree of needed investment that you can build on in the more detailed cost estimation phases.
Know what resources are likely to be available. Have a good handle on what resources will be available to implement the modest, moderate, and elaborate project options. Be as realistic as possible when devising these alternatives. "Elaborate" does not mean the sky is the limit, unless that is actually a realistic expectation for your situation.
Keep your focus on outcomes and results. When talking about the potential benefits of your project, describe them in terms of outcomes and results instead of inputs and outputs. This helps you focus your attention on the service you're providing rather than the delivery mechanisms.
Devise concrete measures. You must describe your performance measures in explicit, objective terms. Be as specific and clear as possible to avoid future confusion about how project features and services will be measured and assessed.
Make forecasts. While you may be uncomfortable doing it, you should try to forecast the impact of each approach. Make educated guesses about how the modest, moderate, and elaborate options will affect stakeholders, service delivery, and the business process. Forecasts will help you make better decisions about which option to implement.
How to present modest, moderate, and elaborate alternatives
- Gather a group of people with strong knowledge of the issues and goals of the project and work through this exercise together.
- Make a table with four columns and as many rows as you have features, functions, or other bases for comparison. Label the columns "features & functionalities," "modest," "moderate," and "elaborate."
- In each row, briefly describe the modest, moderate, and elaborate alternatives for one feature or function. For example, user support features might range from on-line help to a business hours help desk, or even 24/7 support staff.
- Make a second table with four columns, labeled "benefits," "modest," "moderate," and "elaborate." This table contains three rows labeled "cheaper," "faster," and "better." Briefly describe the cheaper, faster, and better benefits that are likely to accrue from the modest, moderate, and elaborate alternatives you have just specified. Some examples:
Cheaper:
- Reduce duplication in areas such as data collection and program development
- Generate revenue
- Savings in non-personal services: telephone, printing, mailing
- Savings in personal services
Faster:
- Reduce the number of steps in a process
- Staff members get access to information in a more timely manner
- Citizens get access to services in a more timely manner
Better:
- Improved responsiveness to citizen need through 24-hour access
- More satisfied clients because information is more accurate and consistent
- Ability to reach more customers with existing services
Be as specific as possible in defining expectations for system performance (e.g. "90 percent of telephone inquiries will be completed on the first call"). While this may prove difficult to do at first, quantifying system performance expectations will help to clarify project goals and objectives, and provide a basis for evaluation when the project is completed.
- Compare the alternatives in terms of the features they would offer and the benefits they would generate, and keep this information handy to compare with cost estimates.
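The alternatives table described in the steps above can be roughed out in a few lines of code. This sketch prints a modest/moderate/elaborate comparison using hypothetical features; a real project would substitute its own rows.

```python
# A minimal sketch of the alternatives comparison table, using
# hypothetical features and wording for illustration only.

rows = [
    ("user support",    "on-line help only",     "business-hours help desk", "24/7 support staff"),
    ("data sources",    "one internal database", "several internal systems", "internal + external feeds"),
    ("access channels", "office counter",        "counter + web site",       "counter, web, phone"),
]

header = ("features & functionalities", "modest", "moderate", "elaborate")

# Compute each column's width from the widest cell in that column.
widths = [max(len(r[i]) for r in [header] + rows) for i in range(4)]

def fmt(row):
    """Left-align each cell to its column width and join with separators."""
    return " | ".join(cell.ljust(w) for cell, w in zip(row, widths))

print(fmt(header))
print("-+-".join("-" * w for w in widths))
for r in rows:
    print(fmt(r))
```

Keeping the table in a simple data structure like this also makes it easy to add a cost column later, when the detailed cost estimation phase begins.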
Pardo, T. A., S. S. Dawes and A. M. Cresswell (2000). Opening Gateways: A Practical Guide for Designing Electronic Records Access Programs. Center for Technology in Government. See the "Program Design Tool" http://www.ctg.albany.edu/publications/guides/gateways?chapter=5
Often an IT project involves a relatively new technology or combination of technologies with which you and your staff have little familiarity and even less expertise. This is especially true in state and local government agencies, which traditionally opt to stick with tried-and-true technologies that may be several generations older than the proposed technology. To apply the new technology successfully in a cost-effective manner, all project participants -- from end users who are specifying system functionality to developers who deliver the ultimate implementation -- need to be thoroughly familiar with the potential benefits and risks inherent in the technology. Prototypes help you understand the proposed technology fully in order to reengineer processes successfully and take maximum advantage of the new system.
Proof of concept. We define a "prototype" as a system built for a proof of concept. It is not a full-scale system or even a pilot. A prototype essentially identifies, demonstrates, and evaluates the key management, policy, technology, and cost implications of a desired system. A prototype is also built to identify the value proposition for all participants.
An education and communication tool. Prototypes help educate end users, managers, and system developers about potential applications of technology, and how it can help solve their problems. Prototypes are powerful tools used to bridge the gap between what project team members currently know about a new technology and what they will need to know to apply it successfully. They also help bridge the gap between developers and potential users because both groups can look at and discuss the prototype in concrete terms.
Not a pilot system. The purpose of a prototype is to show prospective users how a system might work so that they may think creatively about its potential usefulness. By contrast, a pilot is used in a limited real-life setting. Pilots are much more costly than prototypes to build because they have to work well enough not to hinder the activity of people who have to get real work done. This requires an attention to quality control and performance that typically drives the cost of development up substantially.
Educating project participants. The primary value of a prototype is to educate the project participants. Often, end users and managers have the least awareness of a technology's potential because they may not have been exposed to it through their day-to-day activities. Therefore, prototypes typically emphasize the user interface portion of the system. Prototype development may also address data preparation costs, maintenance requirements, technical support requirements, end-user training requirements, and infrastructure needs.
Stimulating both imagination and realism. A prototype can push people to dream of potential innovative applications of the technology. At the same time, seeing the technology in the concrete leads to a more realistic assessment of costs and benefits. Seeing a mock-up of the application helps guide the analysis to factors that are relevant to the real work of an organization. Interviews, model-building, surveys, and experiments all become more accurate if the participants have personally experienced how a system might work. The prototype itself can be used to gather data on the likely impact on processes and operations.
Setting the stage for implementation. Prototyping activities can be aimed at all levels of staff. If the system being prototyped is ultimately procured, training and other costs may be lower because of your organization's experience with the prototype.
Assessing risk. The reactions that people have to a prototype can help you assess the risks involved in the project. Risks may be inherent in any of the internal or external factors that could affect the success of the project. These may include such potential barriers as staff and client resistance to change, immaturity of a new technology, personnel limitations, technology failures, and expected changes in the technical, political, or management environment.
Won't work like a real system. Because a prototype is only part of a system, it won't work like a real system. It will have very limited functionality, few attractive presentation features, and limited or fake data. The limitations of the prototype may not be apparent to naive users, and their experience needs to be moderated by expert advice.
Can be costly. Depending on the educational needs of the project and the technologies involved, developing a prototype may be a costly proposition. If custom development is necessary, you may need someone experienced in a particular rapid application development environment. Specialized hardware and software may be necessary to support even a small prototype.
Lewis, R. (2002) "The Value of a Test Drive." Information World 24 (13) 48.
Sommerville, I. (2000). Software Engineering, Sixth Edition. New York: Addison-Wesley.
© 2003 Center for Technology in Government
