Sample analyses
To help you use these models, we present two sample analyses using the tools described above. Although these cases are based on our experiences in the Internet Services Testbed Project and other agencies, they are entirely fictional. In our experience, there is no such thing as a "typical" situation: each agency has its own unique characteristics, such as opportunities for cost-sharing or shortages of personnel, that make general conclusions difficult. The cost figures used in these examples rest on assumptions that may not be stated in the analysis; for example, all staff at the first agency have network-ready PCs on their desks, so little additional equipment is necessary. Depending on your situation, you may be able to share the costs of the Web service with other projects and agencies, so your costs may vary tremendously from those described here. The analysis has also been simplified; for example, only first-year costs are considered. The examples are provided solely to show how the analysis might proceed using these tools. Your situation will likely be different.
Example 1. Using Benefit-Cost Analysis at the Office of Cost Reduction (OCR)
The Scenario
The IT director of the Office of Cost Reduction thought it was going to be a bad day when she found herself running late for work. Her pessimism was muted, however, when she arrived for her weekly meeting with the Commissioner of OCR. The Commissioner wanted to know whether OCR could accomplish its mission in a cheaper way. A friend of his at the Office of Mental Hygiene had told him about that agency's Web site, and now the Commissioner wanted to find out whether a Web site could be useful for OCR.
The IT director and her team had been eager for months to conduct an Internet development project. She left the Commissioner's office full of energy, but his words, "Only if it saves us money," were still ringing in her ears.
Thinking about the agency's collection of information from the general public, local government, and certain private sources such as insurance companies and appraisers, she came up with the first areas that should be addressed. The agency had a number of paper-based forms and information packets that it distributed to these constituents. "If we could distribute and collect these forms electronically, we might be able to save money," she thought to herself. She decided to explore the option of building a Web site and giving key business partners access so they could submit forms electronically.
She walked back to her office and opened a draft of this guide, which she had received a month earlier. She found the chapters on benefit-cost analysis, resource allocation methods, and multi-attribute utility models. After reading through them, she decided that the appropriate analysis tool would be benefit-cost analysis, not only because it was the best one for dealing with "cheaper" situations, but also because the Commissioner had earlier shown a certain fondness for this method.
There were four things she had to do to complete a benefit-cost analysis (a minimal sketch of the process follows the list):

- Determine the features and functionality of potential Web-based services.
- Determine the performance benefits of the Web-based services, placing a dollar value on each benefit.
- Determine every cost associated with the Web-based services, adding them together as a total dollar amount.
- Compare the benefits (measured in total dollars) to the costs (measured in total dollars).
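For readers who prefer to see the arithmetic, the sketch below walks through the four steps in Python. All of the names and dollar figures in it are hypothetical placeholders, not values from this guide.

```python
# A minimal sketch of the four-step benefit-cost comparison.
# All alternative names and dollar figures are hypothetical.

# Step 1: the candidate service levels from the features worksheet.
alternatives = ["modest", "moderate", "elaborate"]

# Step 2: each performance benefit expressed as an annual dollar value.
benefits = {
    "modest":    {"printing_savings": 50_000},
    "moderate":  {"printing_savings": 80_000},
    "elaborate": {"printing_savings": 120_000, "data_entry_savings": 200_000},
}

# Step 3: all costs summed to a single total per alternative.
costs = {"modest": 60_000, "moderate": 90_000, "elaborate": 250_000}

# Step 4: compare total benefits to total costs as a ratio
# (a ratio above 1.0 means the investment pays off).
for alt in alternatives:
    total_benefit = sum(benefits[alt].values())
    print(f"{alt}: ratio = {total_benefit / costs[alt]:.2f}")
```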
Filling out the System Features and Functionality Worksheet
Understanding the potential value of the tools available, she assembled a small team to plan the service, with staff from the program areas, human resources, the IT division, and OCR's public information office. Using the "system features and functionality" worksheet, the group specified three potential systems, ranging from giving the public electronic access to the agency's most popular forms to on-line submission of forms by key business partners (see Figure 8).
Figure 8. System Features and Functionality, OCR
| System Features and Functionality | Modest | Moderate | Elaborate |
|---|---|---|---|
| Who are your customers? | citizens, taxpayers, government officials at all levels | same as modest, plus local tax officials, libraries | same as moderate, plus key business partners: realtors, appraisers, and insurance companies |
| What information-based services will you provide? | e-mail, limited number of existing documents published on the WWW | same as modest, plus many more documents and forms | same as moderate, plus on-line forms for the key business partners and access to agency databases |
| How will customers get access to these services? | e-mail and WWW as available in the customer base | same as modest | same as moderate |
| What will customers be able to do? | send e-mail to the agency, receive e-mail replies, access our Web page | same as modest | same as moderate, plus fill in forms; Web front end to agency databases |
| What system features will be included? | e-mail and Web page | same as modest | same as moderate, plus on-line forms and support for partners to help them fill out on-line forms |
| What information sources (internal and external) must be coordinated? | all departments creating forms | same as modest | same as moderate, plus departments that manage agency databases |
| What security and confidentiality measures must be implemented? | none | same as modest | heightened policies, plus restricted access, encryption, and authentication |
| What activities will be outsourced? | none | same as modest | training and support for key business partners that want to fill out on-line forms |
The final "system features and functionality" worksheet showed that the modest and the moderate alternatives would be fairly similar. The elaborate alternative included a Web site with access for agency key partners to register forms themselves.
Filling out the Cost Worksheet
Enthusiastic after filling out the last worksheet, the team started the job of quantifying the costs of offering the specified services on the Web. The team soon realized that, because the agency was already well equipped technically, not all cells had costs in them.
Figure 9. Cost Worksheet, OCR
| Cost Worksheet | Modest: First Year | Modest: Subseq. Annual | Moderate: First Year | Moderate: Subseq. Annual | Elaborate: First Year | Elaborate: Subseq. Annual |
|---|---|---|---|---|---|---|
| **Organizational Readiness** | | | | | | |
| Training for Technology Awareness | | | | | | |
| Planning for Internet Presence | 10,000 | | 10,000 | | 10,000 | |
| **Access for Agency Staff and Other Users** | | | | | | |
| Hardware for End Users | | | | | 12,000 | 4,000 |
| Software for End Users | | | | | 5,000 | 2,000 |
| Network and Internet Access for End Users | | | | | 10,000 | 3,000 |
| Other Vendor Services | | | | | | |
| *Human Resources* | | | | | | |
| Start-up Process for Equipment Procurement | | | | | 2,000 | |
| Establish and Manage Vendor and ISP Contracts | 1,000 | | 1,000 | | 5,000 | 1,000 |
| **End User Support** | | | | | | |
| Vendor Services | | | | | 10,000 | 15,000 |
| *Human Resources* | | | | | | |
| Establish and Manage Vendor Contracts | | | | | 2,000 | 1,000 |
| Development and Delivery of User Training | 2,000 | 2,000 | 2,000 | 2,000 | 35,000 | 15,000 |
| User Time in Training | 2,000 | 2,000 | 2,000 | 2,000 | 2,000 | 2,000 |
| Help Desk for Users | | | | | 20,000 | 20,000 |
| **Content Development and Maintenance** | | | | | | |
| Hardware for Content Developers | 3,000 | 3,000 | 3,000 | 3,000 | 3,000 | 3,000 |
| Software for Content Developers | 1,000 | 1,000 | 1,000 | 1,000 | 1,000 | 1,000 |
| Network and Internet Access for Content Developers | 3,000 | 1,000 | 3,000 | 1,000 | 30,000 | 15,000 |
| Other Vendor Services | | | | | | |
| *Human Resources* | | | | | | |
| Start-up Process for Equipment Procurement | | | | | 3,000 | 3,000 |
| Establish and Manage Vendor Contracts | 1,000 | 1,000 | 1,000 | 1,000 | 1,000 | 1,000 |
| Development and Delivery of Staff Training | 15,000 | 15,000 | 15,000 | 15,000 | 30,000 | 30,000 |
| Staff Time in Training | 20,000 | 10,000 | 20,000 | 10,000 | 40,000 | 20,000 |
| Webmaster | 45,000 | 45,000 | 45,000 | 45,000 | 45,000 | 45,000 |
| Editorial Review | | | | | | |
| Content Creation and Coordination | 3,000 | 3,000 | 3,000 | 3,000 | 10,000 | 5,000 |
| Web Site Design and Development | 10,000 | 10,000 | 10,000 | 10,000 | 100,000 | 50,000 |
| Staff Support for Service | 15,000 | 15,000 | 20,000 | 20,000 | 30,000 | 30,000 |
| Programming Support | | | | | 40,000 | 20,000 |
| Database Administration | | | | | 20,000 | 20,000 |
| Other Management Support | | | | | | |
| Other Clerical Support | 3,000 | 3,000 | 3,000 | 3,000 | 3,000 | 3,000 |
| **Host of Site: Infrastructure** | | | | | | |
| Hardware | 8,000 | 4,000 | 8,000 | 4,000 | 8,000 | 4,000 |
| Software | 3,000 | 1,000 | 3,000 | 1,000 | 6,000 | 2,000 |
| Network and Internet Access | 1,500 | 1,500 | 1,500 | 1,500 | 1,500 | 1,500 |
| Other Vendor Services | | | | | | |
| *Human Resources* | | | | | | |
| Front-end Research and Technical Evaluation | 500 | 500 | 500 | 500 | 1,000 | 1,000 |
| Start-up Process for Equipment Procurement | 500 | | 500 | | 500 | |
| Establish and Manage Vendor and ISP Contracts | | | | | | |
| Development and Delivery of Staff Training | 1,000 | 1,000 | 1,000 | 1,000 | 2,000 | 2,000 |
| Staff Time in Training | 500 | 500 | 500 | 500 | 2,000 | 2,000 |
| Network and Systems Administration | 3,000 | 3,000 | 3,000 | 3,000 | 6,000 | 6,000 |
| Web Server Management | 1,000 | 1,000 | 1,000 | 1,000 | 5,000 | 3,000 |
| Operations Support | 1,000 | 1,000 | 1,000 | 1,000 | 3,000 | 3,000 |
| Clerical Support | 0 | | | | | |
| **INFRASTRUCTURE AND OTHER SUBTOTAL** | 29,500 | 11,500 | 29,500 | 11,500 | 96,500 | 50,500 |
| **HUMAN RESOURCES SUBTOTAL** | 124,500 | 113,000 | 129,500 | 118,000 | 407,500 | 283,000 |
| **TOTAL COSTS** | 154,000 | 124,500 | 159,000 | 129,500 | 504,000 | 333,500 |
With the cost worksheet filled out, the group had its total costs for the three alternatives (modest, moderate, and elaborate). It was now time to go to the next step in the benefit-cost process: assessing the benefits of the new system.
Performance Variables, Measures, and Targets
Because their primary objective was reduced costs, the group decided to develop only the "cheaper" dimensions of the "performance variables, measures, and targets" worksheet. This resulted in two different "cheaper" variables, as depicted in Figure 10.
Figure 10. Performance Variables, Measures, and Targets, OCR
The Benefit-Cost Analysis
The group felt ready to finally get to the benefit-cost analysis. With the costs in hand, it was now time to specify the benefits in quantitative terms. Each benefit had to be turned into a dollar amount. Additional information about the agency's annual printing and data entry costs was collected. With the performance variables, measures, and targets to draw on, the group managed to quantify the benefits reasonably quickly.
Figure 11. Benefit-Cost Table, OCR
| | Modest | Moderate | Elaborate |
|---|---|---|---|
| **Benefits:** | | | |
| Printing Costs | $100,000 | $150,000 | $400,000 |
| Data Entry | | | $400,000 |
| **Sum Benefits:** | $100,000 | $150,000 | $650,000 |
| **Sum Costs (from cost worksheet):** | $124,500 | $129,500 | $407,500 |
| **Benefit-Cost Ratio** | 0.80 | 1.16 | 1.59 |
Aftermath
The benefit-cost table above indicates that investment at the modest level will not pay off. For each dollar invested, only 80 cents are saved. However, at the moderate level the agency would see a payoff. For each dollar invested, $1.16 is saved. Finally, at the elaborate level, the investment will pay off with $1.59 saved per dollar invested. It would seem that the elaborate level would be the right investment for OCR.
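The ratios in Figure 11 can be reproduced directly from its sum rows. A minimal sketch in Python (650,000 / 407,500 rounds to 1.60 here, while the figure shows the truncated 1.59):

```python
# Benefit and cost totals from the sum rows of Figure 11.
benefits = {"modest": 100_000, "moderate": 150_000, "elaborate": 650_000}
costs = {"modest": 124_500, "moderate": 129_500, "elaborate": 407_500}

for level in benefits:
    print(f"{level:<9} benefit-cost ratio: {benefits[level] / costs[level]:.2f}")
# modest    0.80, moderate 1.16, elaborate 1.60
```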
After doing the calculations in the benefit-cost table, the group decided to do a sensitivity analysis on the numbers. They questioned their own beliefs about the numbers and tested how different assumptions about both benefits and costs changed the benefit-cost ratios. They found that the ratios were fairly sensitive to changes in the underlying assumptions, but as they felt reasonably confident in those assumptions, the group decided to recommend that the agency invest in an elaborate Web site. However, knowing that a lot of uncertainties needed to be resolved, the group also recommended a development strategy of first building a moderate Web site with a prototype of the elaborate functions. The elaborate functions would then be completed after the moderate Web site had been implemented and the agency had more experience developing services for the Internet.
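One simple form of sensitivity analysis is to scale the benefit estimate up and down and watch how the ratio responds. The sketch below does this for the elaborate alternative; the ±25% range is an illustrative assumption, not part of the OCR analysis.

```python
# Elaborate alternative: benefit and cost totals from Figure 11.
base_benefit, cost = 650_000, 407_500

for scale in (0.75, 0.90, 1.00, 1.10, 1.25):
    print(f"benefits x {scale:.2f} -> ratio {base_benefit * scale / cost:.2f}")
# The ratio stays above break-even (1.0) until benefits fall roughly
# 37% below the estimate, since 650,000 * 0.63 is about 407,500.
```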
The benefit-cost model developed in this example is very simple. An agency deciding to do a benefit-cost analysis should have, or get, expertise in benefit-cost modeling. Issues that were not addressed in this example, such as opportunity cost, interest rates, and multi-year investments, can in many cases change the results completely. For an introduction to benefit-cost analysis, see James Edwin Kee, "Benefit-Cost Analysis in Program Evaluation," in Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (eds.), Handbook of Practical Program Evaluation. San Francisco: Jossey-Bass, 1994.
Example 2. Using the Resource Allocation Method and the Multi-attribute Utility Model at the Office of Business and Tourism (OBT)
The Scenario
The Office of Business and Tourism wishes to attract businesses and tourists to the state. It works with a variety of state agencies, local governments, business officials, the tourist industry, and citizens groups to help promote the idea of people working and vacationing in the state.
Recognizing that many business leaders and affluent tourists have access to the Internet, and that other states are using this technology for economic development, the agency wishes to augment its current efforts with a new Web service.
A group of OBT leaders got together and used the "system features and functionality" worksheet to specify what they wanted from the new service. The results indicated that the Web service could provide different levels of information dissemination, ranging from a fairly modest system to an elaborate one (Figure 12).
Figure 12. System Features and Functionality, OBT
| System Features and Functionality | Modest | Moderate | Elaborate |
|---|---|---|---|
| Who are your customers? | general public, businesses, and entrepreneurs | same as modest, plus potential tourists | same as moderate |
| What information-based services will you provide? | access to directory of agency and regional services | same as modest, plus business and tourism promotion | same as moderate, plus one-stop shopping for business start-up |
| How will customers get access to these services? | WWW with possible follow-up telephone call | WWW, e-mail, follow-up mailing of tourism documents; agency provides some Internet access for businesses through libraries | same as moderate |
| What will customers be able to do? | browse agency general information and follow pointers to related sites | browse official promotional information for businesses and tourists; request additional information be sent to them | same as moderate, plus get all government information related to business start-up or relocation to a particular area of the state |
| What system features will be included? | hypertext and e-mail | same as modest, plus WWW, e-mail, some multimedia | same as moderate, but an expanded set of information |
| What information sources (internal and external) must be coordinated? | agency directory of services, and key external information sources | same as modest, plus tourism brochures and business start-up information | same as moderate, plus all forms and brochures related to businesses from federal, state, and local offices |
| What security and confidentiality measures must be implemented? | make sure no one can change information | same as modest | same as moderate |
| What activities will be outsourced? | hosting of WWW site | same as modest | same as moderate |
Knowing that they would have to quantify the costs of providing these features and functionality, the group decided to use the cost worksheet to find out how the different investment levels would result in different cost scenarios.
Figure 13. Cost Worksheet, OBT
| Cost Worksheet | Modest: First Year | Modest: Subseq. Annual | Moderate: First Year | Moderate: Subseq. Annual | Elaborate: First Year | Elaborate: Subseq. Annual |
|---|---|---|---|---|---|---|
| **Organizational Readiness** | | | | | | |
| Training for Technology Awareness | 1,000 | | 10,000 | | 10,000 | |
| Planning for Internet Presence | 1,000 | | 10,000 | | 100,000 | |
| **Access for Agency Staff and Other Users** | | | | | | |
| Hardware for End Users | | | 100,000 | 30,000 | 100,000 | 30,000 |
| Software for End Users | | | 10,000 | 10,000 | 10,000 | 10,000 |
| Network and Internet Access for End Users | | | | 10,000 | | 10,000 |
| Other Vendor Services | | | | | | |
| *Human Resources* | | | | | | |
| Start-up Process for Equipment Procurement | | | 10,000 | | 10,000 | |
| Establish and Manage Vendor and ISP Contracts | | | 5,000 | 1,000 | 5,000 | 1,000 |
| **End User Support** | | | | | | |
| Vendor Services | | | | | | |
| *Human Resources* | | | | | | |
| Establish and Manage Vendor Contracts | | | | | | |
| Development and Delivery of User Training | | | 2,000 | 1,000 | 2,000 | 1,000 |
| User Time in Training | | | | | | |
| Help Desk for Users | | | 10,000 | 10,000 | 10,000 | 10,000 |
| **Content Development and Maintenance** | | | | | | |
| Hardware for Content Developers | 10,000 | 20,000 | 40,000 | 20,000 | | |
| Software for Content Developers | 2,000 | 1,000 | 4,000 | 2,000 | 10,000 | 10,000 |
| Network and Internet Access for Content Developers | 1,000 | 2,000 | 10,000 | | | |
| Other Vendor Services | | | | | 20,000 | |
| *Human Resources* | | | | | | |
| Start-up Process for Equipment Procurement | 1,000 | | 1,000 | 1,000 | 1,000 | 1,000 |
| Establish and Manage Vendor Contracts | 5,000 | 5,000 | | | | |
| Development and Delivery of Staff Training | 1,000 | 1,000 | 5,000 | 5,000 | | |
| Staff Time in Training | 10,000 | 1,000 | 20,000 | 10,000 | | |
| Webmaster | 40,000 | 40,000 | | | | |
| Editorial Review | 1,000 | 1,000 | 1,000 | 1,000 | 10,000 | 10,000 |
| Content Creation and Coordination | 10,000 | 10,000 | 10,000 | 10,000 | 100,000 | 100,000 |
| Web Site Design and Development | 10,000 | 10,000 | 10,000 | 10,000 | 20,000 | 20,000 |
| Staff Support for Service | 50,000 | 50,000 | | | | |
| Programming Support | | | | | 50,000 | 20,000 |
| Database Administration | | | | | 20,000 | 10,000 |
| Other Management Support | 1,000 | 1,000 | 10,000 | 10,000 | 10,000 | 10,000 |
| Other Clerical Support | 1,000 | 1,000 | 10,000 | 10,000 | | |
| **Host of Site: Infrastructure** | | | | | | |
| Hardware | 50,000 | 10,000 | | | | |
| Software | 20,000 | 10,000 | | | | |
| Network and Internet Access | 10,000 | 10,000 | | | | |
| Other Vendor Services | | 1,000 | | 10,000 | 20,000 | 20,000 |
| *Human Resources* | | | | | | |
| Front-end Research and Technical Evaluation | 20,000 | | | | | |
| Start-up Process for Equipment Procurement | 10,000 | | | | | |
| Establish and Manage Vendor and ISP Contracts | | 1,000 | | 1,000 | 10,000 | 10,000 |
| Development and Delivery of Staff Training | 10,000 | 5,000 | | | | |
| Staff Time in Training | 30,000 | 20,000 | | | | |
| Network and Systems Administration | 30,000 | 20,000 | | | | |
| Web Server Management | 20,000 | 20,000 | | | | |
| Operations Support | 5,000 | 5,000 | | | | |
| Clerical Support | | 5,000 | 5,000 | | | |
| **INFRASTRUCTURE AND OTHER SUBTOTAL** | 12,000 | 3,000 | 134,000 | 64,000 | 280,000 | 140,000 |
| **HUMAN RESOURCES SUBTOTAL** | 25,000 | 23,000 | 91,000 | 88,000 | 568,000 | 388,000 |
| **TOTAL COSTS** | 37,000 | 26,000 | 225,000 | 152,000 | 848,000 | 528,000 |
Knowing the specific costs associated with the possible levels of investment, the group decided it was important to specify measurable targets for the new system. They knew this could easily be accomplished with the "performance variables, measures, and targets" worksheet, which they filled out as shown below.
Figure 14. Performance Variables, Measures, and Targets, OBT
Variable: Reduced mailing costs (Cheaper)
Measure: # of requests for information goes down¹

| Modest Target | Moderate Target | Elaborate Target |
|---|---|---|
| Reduce by 5% | Reduce by 10% | Reduce by 15% |

Variable: Customer response time (Faster)
Measure: time customer must wait for information

| Modest Target | Moderate Target | Elaborate Target |
|---|---|---|
| WWW service: < 1 minute; regular service: 50% reduction | WWW service: < 1 minute; regular service: 50% reduction | WWW service: < 1 minute; regular service: 50% reduction |

Variable: One-stop shopping (Better)
Measure: time customers stay on the WWW service

| Modest Target | Moderate Target | Elaborate Target |
|---|---|---|
| Mean of 5 minutes | Mean of 10 minutes | Mean of 20 minutes |

Variable: Enhanced service quality (Better)
Measure: customer surveys

| Modest Target | Moderate Target | Elaborate Target |
|---|---|---|
| Increase customer satisfaction by 10% | Increase customer satisfaction by 20% | Increase customer satisfaction by 30% |

Variable: Expanded communication (Better)²
Measure: increase in # of national/international customers

| Modest Target | Moderate Target | Elaborate Target |
|---|---|---|
| National: 10% increase; international: 5% increase | National: 15% increase; international: 10% increase | National: 20% increase; international: 15% increase |

Variable: Increased revenue generation for the state (Better)
Measure: revenue traceable to new customers³

| Modest Target | Moderate Target | Elaborate Target |
|---|---|---|
| 3 x agency spending | 4 x agency spending | 5 x agency spending |

Variable: Increased competitiveness (Better)
Measure: % of businesses relocating to our state compared to other states

| Modest Target | Moderate Target | Elaborate Target |
|---|---|---|
| Increase 5% | Increase 10% | Increase 15% |

¹ The targets are relatively low because the WWW service increases visibility and might produce additional requests for information.
² Able to communicate both nationally and internationally.
³ Only measure revenue from customers that are traceable to OBT's effort.
With all of these data available, the group decided it was time to try out the decision tools at hand. First they used the resource allocation method.
Using the Resource Allocation Method
All of this information can now be used to select the appropriate level of investment, using a simplified resource allocation method. All the information in the model is transferred from the cost worksheet and the "performance variables, measures, and targets" worksheet.
The final model (all costs in thousands of dollars, transferred from the cost worksheet):
Figure 15. Resource Allocation Method, OBT
The different levels of benefit were ranked from 0 to 100 as described on page 20, and the costs were retrieved from the cost worksheet on page 31 (in $1,000s for simplicity). The group of leaders was somewhat surprised to find that the ratio for the first investment level was the highest; the ratio then declined as the agency invested in more elaborate Web sites. The result indicated quite clearly that the modest Web site investment would be the best management decision for the agency.
Note to the reader: There is no single "cutoff" ratio to be used in the decision. The decision is made by comparing the different ratios to one another.
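The computation behind a model like Figure 15 is a simple ratio of benefit to cost. In the sketch below, the 0-100 benefit scores are hypothetical stand-ins (Figure 15's actual scores are not reproduced here); the first-year costs, in $1,000s, come from the OBT cost worksheet.

```python
# Simplified resource allocation ratio: benefit score per $1,000 of cost.
benefit_score = {"modest": 40, "moderate": 70, "elaborate": 100}  # assumed scores
cost_k = {"modest": 37, "moderate": 225, "elaborate": 848}        # Figure 13, first year

for level in benefit_score:
    print(f"{level:<9} ratio = {benefit_score[level] / cost_k[level]:.2f}")
# With these assumed scores the modest ratio is highest and the ratios
# decline with investment, matching the pattern the OBT group observed.
```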
Using the Multi-attribute Utility (MAU) Model
For projects where the agency decides to use the MAU model, the number of decision criteria will generally be smaller than for the resource allocation method. Even though the number of criteria developed for OBT is higher than would normally be used in a MAU model, the OBT example is reused here to make it easier for the reader to follow the problem. For simplicity and realism, however, a few criteria have been removed.
Since OBT had some previous experience with MAU models, they might have decided to use a MAU model rather than the resource allocation method to analyze their plans. Following the same rules as for the resource allocation method, for most performance measures the agency gave the "no investment" alternative a utility of 0 and the "elaborate" alternative a utility of 100. The cost criterion was scored in reverse, with "elaborate" receiving a utility of 0. The resulting table is shown in Figure 16.
Figure 16. Partial MAU model, OBT
| Rank | Weight | Criteria | No Investment | Modest | Moderate | Elaborate |
|---|---|---|---|---|---|---|
| | | Cost of developing WWW service | 100 | 90 | 70 | 0 |
| | | Expanded communication capabilities | 0 | 33 | 67 | 100 |
| | | Increased revenue generation | 0 | 30 | 95 | 100 |
| | | Better competition with other states | 0 | 33 | 67 | 100 |
| | | **Total Utility:** | | | | |
Next, for each criterion, they rated the remaining alternatives between 0 and 100. Remember that a utility is an assessment of how much a certain alternative on one criterion is "worth" to the agency. It is very important to understand that this rating is often not linear, even when the underlying measurements are linear. For example, the criterion "increased revenue generation" is measured as "3 x agency spending" at the modest investment, "4 x agency spending" at the moderate level, and "5 x agency spending" at the elaborate level. This might be interpreted as a linear variable; however, the funding sources for the agency might have set a minimum goal of four times agency spending. This means that achieving the modest goal of "3 x agency spending" is of little or no value, whereas the difference between four times and five times is negligible. This might lead the utility scores for the "increased revenue generation" criterion to be 0, 30, 95, and 100. The procedure for finding the weights is explained on page 21.
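A short sketch of that nonlinear assessment follows; the utilities are the elicited values from the discussion above, not the output of any formula.

```python
# Revenue targets (multiples of agency spending) and the elicited
# utilities discussed above: 3x is worth little because the funders'
# minimum goal is 4x, and 5x adds little beyond 4x.
revenue_multiple = {"no investment": 0, "modest": 3, "moderate": 4, "elaborate": 5}
utility = {0: 0, 3: 30, 4: 95, 5: 100}

for alt, mult in revenue_multiple.items():
    print(f"{alt:<13} {mult}x spending -> utility {utility[mult]}")
```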
Figure 17. Complete MAU model, OBT (numbers rounded)
Each cell shows the utility score, with the weighted product (weight x utility, rounded) in parentheses.

| Rank | Weight | Criteria | No Investment | Modest | Moderate | Elaborate |
|---|---|---|---|---|---|---|
| 2 | .27 | Cost of developing WWW service | 100 (27) | 90 (24) | 70 (19) | 0 (0) |
| 4 | .06 | Expanded communication capabilities | 0 (0) | 33 (2) | 67 (4) | 100 (6) |
| 1 | .52 | Increased revenue generation | 0 (0) | 30 (16) | 95 (49) | 100 (52) |
| 3 | .15 | Better competition with other states | 0 (0) | 33 (5) | 67 (10) | 100 (15) |
| | | **Total Utility:** | 27 | 47 | 82 | 73 |
Notice that the second criterion in the "Performance variables, measures, and targets" worksheet was dropped from this analysis. This is because it did not change over the three investment alternatives, and thus would not have changed the model.
With the MAU model almost complete, the group from OBT had to compare the importance of the different criteria. This was done by comparing each criterion to the others to find the most important ones. Increased revenue generation was given a 1 in the "rank" column and then compared to every other remaining criterion. This led to "cost of developing WWW service" getting the second-highest ranking. That criterion was then compared with the two remaining criteria, resulting in "better competition with other states" ending up as rank three and the last criterion receiving the last rank.
With the ranking done, weights needed to be assigned to the criteria. Knowing the rank made this a little easier for the OBT group, because they knew that the criterion ranked number one had to have a higher weight than the criterion ranked number two, the criterion ranked number two had to have a higher weight than the third-ranked criterion, and so on. When there are relatively few performance criteria (i.e., 3-5), weights may be determined relatively easily by an expert. When the number of performance criteria is larger, it may be very difficult indeed to determine how to weight the cost attribute relative to all the other performance criteria so that the trade-off between cost and benefits is appropriate. An expert in MAU modeling can assist in this process.
Rather than assigning weights directly, it may be simpler and more efficient to rank the criteria in order of importance. This approach was developed in F. Hutton Barron and Bruce E. Barrett, "Decision Quality Using Ranked Attribute Weights," Management Science, 1996, 42(11), 1515-1523. In their process, each performance criterion is ranked from most important to least important. Once that ranking has been established, weights are assigned according to the following table:
Figure 18. MAU-Model Weighting
| Rank | 2 criteria | 3 criteria | 4 criteria | 5 criteria | 6 criteria | 7 criteria | 8 criteria |
|---|---|---|---|---|---|---|---|
| 1 | .75 | .61 | .52 | .46 | .41 | .37 | .34 |
| 2 | .25 | .28 | .27 | .26 | .24 | .23 | .21 |
| 3 | | .11 | .15 | .15 | .16 | .16 | .16 |
| 4 | | | .06 | .09 | .10 | .11 | .11 |
| 5 | | | | .04 | .06 | .07 | .08 |
| 6 | | | | | .03 | .04 | .05 |
| 7 | | | | | | .02 | .03 |
| 8 | | | | | | | .02 |
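The weights in Figure 18 appear to follow the rank-order centroid formula from the Barron and Barrett article (with small rounding adjustments so that each column sums to 1.00): with n ranked criteria, the weight at rank i is w_i = (1/n) * sum of 1/k for k = i..n. A sketch that reproduces the four-criteria column used in the OBT model:

```python
# Rank-order centroid weights: w_i = (1/n) * sum of 1/k for k = i..n.
def roc_weights(n: int) -> list[float]:
    return [sum(1 / k for k in range(i, n + 1)) / n for i in range(1, n + 1)]

print([round(w, 2) for w in roc_weights(4)])
# [0.52, 0.27, 0.15, 0.06] -- matches the 4-criteria column of Figure 18
```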
For more information, please see Ward Edwards and J. Robert Newman, "Multi-attribute Evaluation," in Hal R. Arkes and Kenneth R. Hammond (eds.), Judgment and Decision Making: An Interdisciplinary Reader. New York: Cambridge University Press, 1986.
After applying the weights, the group multiplied each weight by the utility score in each cell and put the product in the middle of the cell. These products were then summed into the total utility row.
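The totals in Figure 17 can be checked with a few lines of Python: multiply each utility by its criterion's weight and sum down each alternative's column.

```python
# Criterion weights and utility scores from Figure 17.
weights = {"cost": .27, "communication": .06, "revenue": .52, "competition": .15}
utilities = {
    "cost":          {"none": 100, "modest": 90, "moderate": 70, "elaborate": 0},
    "communication": {"none": 0, "modest": 33, "moderate": 67, "elaborate": 100},
    "revenue":       {"none": 0, "modest": 30, "moderate": 95, "elaborate": 100},
    "competition":   {"none": 0, "modest": 33, "moderate": 67, "elaborate": 100},
}

for alt in ("none", "modest", "moderate", "elaborate"):
    total = sum(weights[c] * utilities[c][alt] for c in weights)
    print(f"{alt:<9} total utility = {total:.0f}")
# none 27, modest 47, moderate 82, elaborate 73 -- the moderate
# alternative has the highest total utility.
```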
Looking at the final result, the group decided to do a sensitivity analysis. They "tampered" with some of the numbers to answer a series of "what if" questions, for example: "What if something in the funding situation changes and the first- and second-ranked criteria change order?" This and a number of other "what if" questions were discussed, and the agency group finally decided that the resulting recommendation was not very sensitive to changes in the environment.
The group decided that the agency Web site should be designed and built and that they should go for the moderate level of investment.
Note to the reader: It is important to understand that using the resource allocation method and the MAU model on the same problem might lead to different recommendations. This is because the assumptions underlying the two models are quite different. While the resource allocation method uses real costs, the MAU model translates costs into weighted utilities, which may be more appropriate in some situations and less appropriate in others. The MAU model also differs by weighting the different criteria. You must understand these assumptions and choose a model according to what best fits your situation.
© 2003 Center for Technology in Government