A Cost Performance Model for Assessing WWW Service Investments



Sample analyses

Using different models at the Office of Business and Tourism (OBT)

The Office of Business and Tourism wishes to attract businesses and tourists to the state. It works with a variety of state agencies, local governments, business officials, the tourist industry, and citizens groups to help promote the idea of people working and vacationing in the state.

Recognizing that many business leaders and affluent tourists have access to the Internet, and that other states are using this technology for economic development, the agency wishes to augment its current efforts with a new Web service.

A group of OBT leaders met and used the "system features and functionality" worksheet to specify what was wanted from the new service. The results indicated that the Web service could provide different levels of information dissemination, ranging from a fairly modest system to an elaborate one (below).

Figure 12. System Features and Functionality, OBT

System Features and Functionality

Who are your customers?
  Modest:    general public, business, and entrepreneurs
  Moderate:  same as modest, plus potential tourists
  Elaborate: same as moderate

What information-based services will you provide?
  Modest:    access to directory of agency and regional services
  Moderate:  same as modest, plus business and tourism promotion
  Elaborate: same as moderate, plus one-stop shopping for business start-up

How will customers get access to these services?
  Modest:    WWW with possible follow-up telephone call
  Moderate:  WWW, email, follow-up mailing of tourism documents; agency provides some Internet access for businesses through libraries
  Elaborate: same as moderate

What will customers be able to do?
  Modest:    browse agency general information and follow pointers to related sites
  Moderate:  browse official promotional information for businesses and tourists; request additional information be sent to them
  Elaborate: same as moderate, plus get all government information related to business start-up or relocation to a particular area of the state

What system features will be included?
  Modest:    hypertext and email
  Moderate:  same as modest, plus WWW, email, some multimedia
  Elaborate: same as moderate, but an expanded set of information

What information sources (internal and external) must be coordinated?
  Modest:    agency directory of services and key external information sources
  Moderate:  same as modest, plus tourism brochures and business start-up information
  Elaborate: same as moderate, plus all forms and brochures related to businesses from federal, state, and local offices

What security and confidentiality measures must be implemented?
  Modest:    make sure no one can change information
  Moderate:  same as modest
  Elaborate: same as moderate

What activities will be outsourced?
  Modest:    host of WWW site
  Moderate:  same as modest
  Elaborate: same as moderate

Knowing that they would have to quantify the costs of providing these features and functionality, the group decided to use the cost worksheet to find out how the different investment levels would result in different cost scenarios.

Figure 13. Cost Worksheet, OBT
(entries are first-year cost / subsequent annual cost, in dollars; "-" = no cost)

                                                        MODEST             MODERATE           ELABORATE

Organizational Readiness
  Training for Technology Awareness                   1000 / -            10000 / -           10000 / -
  Planning for Internet Presence                      1000 / -            10000 / -          100000 / -

Access for Agency Staff and Other Users
  Hardware for End Users                                 - / -           100000 / 30000      100000 / 30000
  Software for End Users                                 - / -            10000 / 10000       10000 / 10000
  Network and Internet Access for End Users              - / -                - / 10000           - / 10000
  Other Vendor Services                                  - / -                - / -               - / -
  Human Resources
    Start-up Process for Equipment Procurement           - / -            10000 / -           10000 / -
    Establish and Manage Vendor and ISP Contracts        - / -             5000 / 1000         5000 / 1000

End User Support
  Vendor Services                                        - / -                - / -               - / -
  Human Resources
    Establish and Manage Vendor Contracts                - / -                - / -               - / -
    Development and Delivery of User Training            - / -             2000 / 1000         2000 / 1000
    User Time in Training                                - / -                - / -               - / -
    Help Desk for Users                                  - / -            10000 / 10000       10000 / 10000

Content Development and Maintenance
  Infrastructure
    Hardware for Content Developers                  10000 / -            20000 / -           40000 / 20000
    Software for Content Developers                   2000 / 1000          4000 / 2000        10000 / 10000
    Network and Internet Access for Content Devs.        - / 1000             - / 2000            - / 10000
    Other Vendor Services                                - / -                - / -           20000 / -
  Human Resources
    Start-up Process for Equipment Procurement        1000 / -             1000 / 1000         1000 / 1000
    Establish and Manage Vendor Contracts                - / -                - / -            5000 / 5000
    Development and Delivery of Staff Training           - / -             1000 / 1000         5000 / 5000
    Staff Time in Training                               - / -            10000 / 1000        20000 / 10000
    Webmaster                                            - / -                - / 40000           - / 40000
    Editorial Review                                  1000 / 1000          1000 / 1000        10000 / 10000
    Content Creation and Coordination                10000 / 10000        10000 / 10000      100000 / 100000
    Web Site Design and Development                  10000 / 10000        10000 / 10000       20000 / 20000
    Staff Support for Service                            - / -                - / -           50000 / 50000
    Programming Support                                  - / -                - / -           50000 / 20000
    Database Administration                              - / -                - / -           20000 / 10000
    Other Management Support                          1000 / 1000         10000 / 10000       10000 / 10000
    Other Clerical Support                               - / -             1000 / 1000        10000 / 10000

Host of Site
  Infrastructure
    Hardware                                             - / -                - / -           50000 / 10000
    Software                                             - / -                - / -           20000 / 10000
    Network and Internet Access                          - / -                - / -           10000 / 10000
    Other Vendor Services                                - / 1000             - / 10000       20000 / 20000
  Human Resources
    Front-end Research and Technical Evaluation          - / -                - / -           20000 / -
    Start-up Process for Equipment Procurement           - / -                - / -           10000 / -
    Establish and Manage Vendor and ISP Contracts        - / 1000             - / 1000        10000 / 10000
    Development and Delivery of Staff Training           - / -                - / -           10000 / 5000
    Staff Time in Training                               - / -                - / -           30000 / 20000
    Network and Systems Administration                   - / -                - / -           30000 / 20000
    Web Server Management                                - / -                - / -           20000 / 20000
    Operations Support                                   - / -                - / -            5000 / 5000
    Clerical Support                                     - / -                - / -            5000 / 5000

INFRASTRUCTURE AND OTHER SUBTOTAL                    12000 / 3000        134000 / 64000      280000 / 140000
HUMAN RESOURCES SUBTOTAL                             25000 / 23000        91000 / 88000      568000 / 388000
TOTAL COSTS                                          37000 / 26000       225000 / 152000     848000 / 528000


Knowing the specific costs associated with the possible levels of investment, the group decided that it was important to specify measurable targets for the new system. This could easily be accomplished with the "performance variables, measures, and targets" worksheet, which the group filled out as described below:

Figure 14. Performance Variables, Measures, and Targets, OBT

Variable: Reduced mailing costs (Cheaper)
Measure:  # of requests for information goes down 1
  Modest Target:    Reduce by 5%
  Moderate Target:  Reduce by 10%
  Elaborate Target: Reduce by 15%

Variable: Customer response time (Faster)
Measure:  time customer must wait for information
  Modest Target:    WWW service: < 1 minute; regular service: 50% reduction
  Moderate Target:  WWW service: < 1 minute; regular service: 50% reduction
  Elaborate Target: WWW service: < 1 minute; regular service: 50% reduction

Variable: One-stop shopping (Better)
Measure:  time customer spends on the WWW service
  Modest Target:    Mean of 5 minutes
  Moderate Target:  Mean of 10 minutes
  Elaborate Target: Mean of 20 minutes

Variable: Enhanced service quality (Better)
Measure:  customer surveys
  Modest Target:    Increase customer satisfaction by 10%
  Moderate Target:  Increase customer satisfaction by 20%
  Elaborate Target: Increase customer satisfaction by 30%

Variable: Expanded communication (Better) 2
Measure:  increase in # of national/international customers
  Modest Target:    National: 10% increase; International: 5% increase
  Moderate Target:  National: 15% increase; International: 10% increase
  Elaborate Target: National: 20% increase; International: 15% increase

Variable: Increased revenue generation for the state (Better)
Measure:  revenue traceable to new customers 3
  Modest Target:    3 x agency spending
  Moderate Target:  4 x agency spending
  Elaborate Target: 5 x agency spending

Variable: Increased competition with other states (Better)
Measure:  % of businesses relocating to our state compared to other states
  Modest Target:    Increase 5%
  Moderate Target:  Increase 10%
  Elaborate Target: Increase 15%

With all this data available, the group decided that it was now time to try out the decision tools at their disposal. First, they used the resource allocation method.

Using the Resource Allocation Method

All this available information can now be used to select the appropriate level of investment. To do this we use a simplified resource allocation method. All the information in the model is transferred from the cost worksheet and the "performance variables, measures, and targets" worksheet.


The final model (all costs in thousands of dollars, transferred from cost worksheet):

Figure 15. Resource Allocation Method, OBT

Web Service Development & Management Plan

The different levels of benefit were ranked from 0 to 100 as described on page 20, and the costs from the cost worksheet on page 31 were retrieved (in $1000's for simplicity). The group of leaders was somewhat surprised to find that the ratio for the first investment level was the highest, and that the ratio declined as the agency invested in more elaborate Web sites. The result indicated quite clearly that the modest Web site investment would be the best management decision for the agency.

Note to the reader: There is no "cutoff" single ratio to be used in the decision. The decision is made by comparing the different ratios to one another.
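The ratio comparison described above can be sketched in a few lines of code. The benefit scores below are hypothetical placeholders (the actual 0-to-100 rankings appear in Figure 15); the costs are the first-year totals from the cost worksheet, in $1000's.

```python
# Simplified resource allocation method: benefit score (0-100) divided by cost.
# Benefit scores here are HYPOTHETICAL; costs are first-year totals in $1000's.
levels = {
    "modest":    {"benefit": 40,  "cost": 37},
    "moderate":  {"benefit": 70,  "cost": 225},
    "elaborate": {"benefit": 100, "cost": 848},
}

# Benefit-per-dollar ratio for each investment level.
ratios = {name: v["benefit"] / v["cost"] for name, v in levels.items()}

# There is no single cutoff value; the decision is made by comparing
# the ratios to one another.
best = max(ratios, key=ratios.get)
```

With these placeholder scores, the modest level yields the highest benefit-per-dollar ratio, mirroring the pattern the OBT group observed.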

Using the Multi-attribute Utility (MAU) Model

For projects where the agency decides to use the MAU model, the number of decision criteria will generally be fewer than for the resource allocation model. Even though the number of criteria developed for OBT is higher than what would normally be used for a MAU model, the OBT example is used to make it easier for the reader to follow the problem. However, for simplicity and realism in using the MAU model, a few criteria have been removed.

Since OBT had some previous experience using a MAU model, they might have decided to use it rather than the resource allocation method to analyze their plans. Following the same rules as for the resource allocation method, for most performance measures the agency gave "no investment" a utility value of 0 and the "elaborate" investment a utility value of 100. The cost criterion was given the reverse, with "elaborate" getting a utility of 0. The resulting table is shown in Figure 16.

Figure 16. Partial MAU model, OBT

Rank  Weight  Criteria                         No Investment  Modest  Moderate  Elaborate
  -      -    Cost of developing WWW-service        100          90       70        0
  -      -    Expanded comm. capabilities             0          33       67      100
  -      -    Increased revenue generation            0          30       95      100
  -      -    Better comp. w/ other states            0          33       67      100
              Total Utility:                          -           -        -        -


Next, for each criterion, they rated the remaining alternatives between 0 and 100. Remember that a utility is an assessment of how much a certain alternative on one criterion is "worth" to the agency. It is very important to understand that this rating is often not linear, even when the underlying measurements are. For example, the criterion "increased revenue generation" is measured as "3 x agency spending" at the modest investment level, "4 x agency spending" at the moderate level, and "5 x agency spending" at the elaborate level. This might be interpreted as a linear variable; however, the funding sources for the agency might have set a minimum goal of four times agency spending. In that case, achieving the modest goal of "3 x agency spending" is of little or no value, whereas the difference between four times and five times is negligible. This might lead the utility values for the "increased revenue generation" criterion to be 0, 30, 95, and 100. The procedure for finding the weights is explained on page 21.
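The non-linearity can be seen by contrasting a purely linear scaling of the revenue measure with the utilities the group actually assessed, as in this small sketch:

```python
# Revenue multiples for: no investment, modest (3x), moderate (4x), elaborate (5x).
multiples = [0, 3, 4, 5]

# If utility were linear in the measure, anchoring 5x at 100 would give:
linear = [round(100 * m / 5) for m in multiples]

# Utilities OBT assessed, reflecting the funders' minimum goal of 4x:
# 3x is worth far less than a linear scale suggests, while 4x is worth
# almost as much as 5x.
assessed = [0, 30, 95, 100]
```

Here `linear` comes out as [0, 60, 80, 100]: the assessed utility of the modest level (30) sits well below its linear value (60), while the moderate level (95) sits well above its linear value (80).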

Figure 17. Complete MAU model, OBT (numbers rounded; weighted score in parentheses)

Rank  Weight  Criteria                         No Investment   Modest     Moderate   Elaborate
  2    .27    Cost of developing WWW-service     100 (27)      90 (24)    70 (19)      0 (0)
  4    .06    Expanded comm. capabilities          0 (0)       33 (2)     67 (4)     100 (6)
  1    .52    Increased revenue generation         0 (0)       30 (16)    95 (49)    100 (52)
  3    .15    Better comp. w/ other states         0 (0)       33 (5)     67 (10)    100 (15)
              Total Utility:                         27           47         82         73

Notice that the second criterion in the "performance variables, measures, and targets" worksheet was dropped from this analysis. This is because it did not change over the three investment alternatives, and thus would not have changed the model. With the MAU model almost complete, the group from OBT had to compare the importance of the different criteria. This was done by comparing every criterion to the others to find the most important ones. Increased revenue generation was given a 1 in the "rank" column and then compared to every other remaining criterion. This led to "cost of developing WWW-service" getting the second-highest ranking. That criterion was then compared with the two remaining criteria, resulting in "better competition with other states" ending up as rank number three and the last criterion getting the last rank.

With the ranking done, weights needed to be assigned to the criteria. Knowing the rank made this a little easier for the OBT group, because the criterion ranked number one had to have a higher weight than the criterion ranked number two, the criterion ranked number two had to have a higher weight than the third-ranked criterion, and so on. When there are relatively few performance criteria (i.e., 3-5), weights may be determined relatively easily by an expert. When the number of performance criteria is larger, it may be very difficult indeed to determine how to weight the cost attribute relative to all the other performance criteria so that the trade-off between cost and benefits is appropriate. An expert in MAU modeling can assist in this process.

Rather than assigning weights, it may be simpler and more efficient to just rank the criteria in order of importance. This approach is developed in F. Hutton Barron and Bruce E. Barrett, "Decision Quality Using Ranked Attribute Weights," Management Science, 1996, 42(11), 1515-1523. In their process, each performance criterion is ranked from most important to least important. Once that ranking has been established, weights are assigned according to the following table:

Figure 18. MAU-Model Weighting

          Number of criteria
Rank      2     3     4     5     6     7     8
 1       .75   .61   .52   .46   .41   .37   .34
 2       .25   .28   .27   .26   .24   .23   .21
 3             .11   .15   .15   .16   .16   .16
 4                   .06   .09   .10   .11   .11
 5                         .04   .06   .07   .08
 6                               .03   .04   .05
 7                                     .02   .03
 8                                           .02

For more information, please see Ward Edwards and J. Robert Newman, "Multi-attribute Evaluation," in Hal R. Arkes and Kenneth R. Hammond (eds.), Judgment and Decision Making: An Interdisciplinary Reader. New York: Cambridge University Press, 1986.
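The weights in Figure 18 match the "rank-order centroid" formula behind this ranked-weights approach: for n ranked criteria, the criterion at rank i receives the weight w_i = (1/n)(1/i + 1/(i+1) + ... + 1/n). A minimal sketch:

```python
from fractions import Fraction

def roc_weights(n):
    """Rank-order centroid weights for n ranked criteria:
    w_i = (1/n) * sum(1/k for k in i..n), for ranks i = 1..n.
    Exact rational arithmetic avoids rounding drift."""
    return [float(sum(Fraction(1, k) for k in range(i, n + 1)) / n)
            for i in range(1, n + 1)]
```

For the four OBT criteria, `roc_weights(4)` reproduces the .52 / .27 / .15 / .06 column used in Figure 17, and the weights for any n sum to 1.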

After applying the weights, the group multiplied each weight by the utility score in each cell and put the product in the middle of the cell. These products were then summed into the total utility row.
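That multiply-and-sum step is straightforward to reproduce from the Figure 17 numbers:

```python
# Weights and utilities from Figure 17; utility columns are the four alternatives.
weights = {"cost": 0.27, "communication": 0.06, "revenue": 0.52, "competition": 0.15}
utilities = {
    "cost":          [100, 90, 70, 0],    # cost of developing WWW-service
    "communication": [0, 33, 67, 100],    # expanded comm. capabilities
    "revenue":       [0, 30, 95, 100],    # increased revenue generation
    "competition":   [0, 33, 67, 100],    # better competition w/ other states
}
alternatives = ["no investment", "modest", "moderate", "elaborate"]

# Total utility per alternative: sum of weight x utility over all criteria.
totals = {alt: round(sum(weights[c] * utilities[c][j] for c in weights))
          for j, alt in enumerate(alternatives)}
# Reproduces the Total Utility row: 27, 47, 82, 73.
```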

Looking at the final result, the group decided to do a sensitivity analysis. They "tampered" with some of the numbers to answer a set of "what if" questions, e.g., "What if something in the funding situation changes, and the first- and second-ranked criteria change order?" This and a number of other "what if" questions were discussed, and the agency group finally decided that the resulting recommendation was not very sensitive to changes in the environment.
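One such "what if" (swapping the weights of the first- and second-ranked criteria) can be checked directly; the utilities are taken from Figure 17:

```python
# Sensitivity check: swap the weights of "increased revenue generation"
# (rank 1, .52) and "cost of developing WWW-service" (rank 2, .27).
weights = {"cost": 0.52, "communication": 0.06, "revenue": 0.27, "competition": 0.15}
utilities = {
    "cost":          [100, 90, 70, 0],
    "communication": [0, 33, 67, 100],
    "revenue":       [0, 30, 95, 100],
    "competition":   [0, 33, 67, 100],
}
alternatives = ["no investment", "modest", "moderate", "elaborate"]

totals = {alt: round(sum(weights[c] * utilities[c][j] for c in weights))
          for j, alt in enumerate(alternatives)}

# The moderate alternative still has the highest total utility, consistent
# with the group's conclusion that the recommendation is not very sensitive.
winner = max(totals, key=totals.get)
```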

The group decided that the agency Web site should be designed and built and that they should go for the moderate level of investment.

Note to the reader: It is important to understand that using the resource allocation model and the MAU model on the same problem might lead to different recommendations. This is because the assumptions underlying the two models are quite different. While the resource allocation model uses real costs, the MAU model converts costs into utilities and weights, which may be more appropriate in some situations and less so in others. The MAU model also differs by weighting the different criteria. You must understand these assumptions and choose the model that best fits your situation.

1 The reason for the relatively low targets is that the WWW service increases visibility and might produce additional requests for information.
2 Able to communicate both nationally and internationally.
3 Only measure revenue from customers that are traceable to OBT's effort.