Advancing Return on Investment Analysis for Government IT: A Public Value Framework



Section III: A Public Value Framework for Government IT Assessment

J. How to Summarize and Present Results?

This framework for public value assessment presents both problems and opportunities for summarizing and reporting. The problems arise from the large number and types of results that the assessment can produce. For presentation to policy makers and non-technical audiences, the results should be as simple and accessible as possible. Simple charts and summary tables are best for this purpose. For multiple stakeholders and value variables, the number and complexity of charts may become a problem. This section discusses some of the specific issues and alternative methods available.

For qualitative variables, the presentation of results can show the presence of a value result, and information about magnitude and direction, if relevant and available. Using the information in Table 3, a summary display similar to Table 7 below can present the types and direction of results, with the estimated relative magnitudes as well. The down arrow in the stewardship row for vendors indicates a potential loss to vendors due to transparency increases that diminish opportunities for some forms of influencing procurement. Whether this is a positive or negative result overall may not be clear.

Table 7. Qualitative Display of Stakeholder Impacts
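
To make the construction of such a summary concrete, the following is a minimal Python sketch of a qualitative impact matrix of the kind summarized in Table 7; the stakeholder groups, value types, and symbols are purely illustrative and are not drawn from the framework's actual data.

    # Minimal sketch of a qualitative impact matrix such as the one summarized
    # in Table 7. Stakeholders, value types, and entries are illustrative only.
    # "+" marks a positive impact, "-" a potential loss, "." no expected impact;
    # a trailing "!" flags a larger estimated relative magnitude.

    stakeholders = ["Citizens", "Agency staff", "Vendors", "Other agencies"]
    value_types = ["Financial", "Political", "Social", "Stewardship"]

    impacts = {
        "Citizens":       {"Financial": "+",  "Political": ".", "Social": "+!", "Stewardship": "+"},
        "Agency staff":   {"Financial": ".",  "Political": "+", "Social": "+",  "Stewardship": "+"},
        "Vendors":        {"Financial": "+",  "Political": ".", "Social": ".",  "Stewardship": "-"},
        "Other agencies": {"Financial": "+!", "Political": "+", "Social": ".",  "Stewardship": "+"},
    }

    # Print a simple aligned text table for inclusion in a briefing document.
    header = f"{'Stakeholder':<15}" + "".join(f"{v:>13}" for v in value_types)
    print(header)
    for s in stakeholders:
        print(f"{s:<15}" + "".join(f"{impacts[s][v]:>13}" for v in value_types))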

For the more quantitative results, where the public value variables lend themselves to calculation and statistical analysis, many presentation and summary methods are available. Clear and simple visuals are generally preferable to tables of quantitative data for non-technical audiences. For cross-sectional data, a column chart such as the one in Figure 16 can present the same comparative public value data converted to a ten-point index or scale.

Figure 16. Stakeholder Impact Column Chart
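
As a rough illustration, the following Python sketch (using matplotlib, a common charting library) shows how a grouped column chart along the lines of Figure 16 might be produced; the stakeholders, value dimensions, and ten-point index scores are hypothetical.

    # Sketch of a grouped column chart in the spirit of Figure 16, using
    # matplotlib. The scores are hypothetical values already converted to a
    # ten-point public value index; all labels are illustrative only.
    import numpy as np
    import matplotlib.pyplot as plt

    stakeholders = ["Citizens", "Agency staff", "Vendors", "Other agencies"]
    dimensions = ["Financial", "Political", "Social", "Stewardship"]
    # Rows = stakeholders, columns = value dimensions (0-10 index).
    scores = np.array([
        [7, 4, 8, 6],
        [5, 6, 7, 7],
        [6, 3, 2, 4],
        [8, 5, 4, 7],
    ])

    x = np.arange(len(stakeholders))      # one group of columns per stakeholder
    width = 0.8 / len(dimensions)         # column width within each group

    fig, ax = plt.subplots(figsize=(8, 4))
    for i, dim in enumerate(dimensions):
        ax.bar(x + i * width, scores[:, i], width, label=dim)

    ax.set_xticks(x + width * (len(dimensions) - 1) / 2)
    ax.set_xticklabels(stakeholders)
    ax.set_ylim(0, 10)
    ax.set_ylabel("Public value index (0-10)")
    ax.legend(title="Value dimension")
    plt.tight_layout()
    plt.show()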

Such a multidimensional chart can be difficult to interpret, however, and some ROI methods use the so-called radar chart for the same data, shown in Figure 17. This kind of chart provides a clearer image of the pattern of results for each stakeholder type and value dimension. However, this kind of display becomes much more difficult to interpret if the number of axes or stakeholders is large.

Figure 17. Stakeholder Impact Radar Chart
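
A comparable radar chart can be drawn from the same data by plotting each stakeholder as a closed polygon on a polar axis. The sketch below assumes the same hypothetical ten-point index scores used in the column chart example above.

    # Sketch of a radar chart like Figure 17, plotting hypothetical ten-point
    # index scores on a polar axis, one closed polygon per stakeholder.
    import numpy as np
    import matplotlib.pyplot as plt

    dimensions = ["Financial", "Political", "Social", "Stewardship"]
    scores = {
        "Citizens":       [7, 4, 8, 6],
        "Vendors":        [6, 3, 2, 4],
        "Other agencies": [8, 5, 4, 7],
    }

    # One angle per value dimension, with the first point repeated so that
    # each polygon closes on itself.
    angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
    angles += angles[:1]

    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    for stakeholder, values in scores.items():
        values = values + values[:1]
        ax.plot(angles, values, label=stakeholder)
        ax.fill(angles, values, alpha=0.1)

    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(dimensions)
    ax.set_ylim(0, 10)
    ax.legend(loc="upper right", bbox_to_anchor=(1.3, 1.1))
    plt.tight_layout()
    plt.show()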

Indexes can also be used to compare the relative risk and value of alternative investments on a common scale, provided the value variables can be combined into a single index. The Demand and Value Assessment Model (DVAM), developed in Australia, provides guidance on how to produce such a public value index and use it for comparison purposes. An example of that kind of result is shown in Figure 18 below.

Figure 18. Portfolio Risk and Value Comparisons - DVAM
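
The sketch below illustrates the general idea behind such an index: several value variables are collapsed into one weighted score per investment, which is then plotted against a risk rating in the manner of Figure 18. The weights, scores, and risk ratings are hypothetical and do not reproduce the actual DVAM scoring rules.

    # Sketch of combining several value variables into one weighted index per
    # investment and plotting a portfolio risk/value comparison in the spirit
    # of Figure 18. All figures are hypothetical, not actual DVAM scoring.
    import matplotlib.pyplot as plt

    weights = {"Financial": 0.3, "Political": 0.2, "Social": 0.3, "Stewardship": 0.2}

    investments = {
        # name: ({value variable: 0-10 score}, overall risk rating 0-10)
        "Project A": ({"Financial": 7, "Political": 4, "Social": 8, "Stewardship": 6}, 3),
        "Project B": ({"Financial": 5, "Political": 6, "Social": 3, "Stewardship": 7}, 6),
        "Project C": ({"Financial": 8, "Political": 5, "Social": 6, "Stewardship": 4}, 8),
    }

    def value_index(scores):
        """Collapse the value variables into a single weighted 0-10 index."""
        return sum(weights[k] * v for k, v in scores.items())

    fig, ax = plt.subplots()
    for name, (scores, risk) in investments.items():
        v = value_index(scores)
        ax.scatter(risk, v)
        ax.annotate(name, (risk, v), textcoords="offset points", xytext=(5, 5))

    ax.set_xlabel("Risk rating (0-10)")
    ax.set_ylabel("Public value index (0-10)")
    ax.set_xlim(0, 10)
    ax.set_ylim(0, 10)
    plt.show()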

For time series assessments, more elaborate presentations are available, particularly if the value variables are indexed or based on quantitative data. The Accenture Public Service Value Model (PSV) can use historical data about government program effectiveness and costs to show changes in performance over time. (16) The model is based on the principle that public value is created when both outcome results and cost effectiveness increase. An example of this form of results presentation is shown in Figure 19 for the Arizona Department of Revenue. Overall, the performance shown improves from 1999 to 2001 and from 2002 to 2003, periods in which the organization is producing increased outcomes and doing so more cost effectively.

Figure 19. Historical Performance Change Model (PSV - Accenture)
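
The logic of that principle can be expressed in a few lines: for each interval, compute the change in the outcome measure and the change in cost effectiveness, and flag the interval as creating public value only when both increase. The yearly figures below are hypothetical and do not reproduce Accenture's method or the Arizona Department of Revenue data.

    # Sketch of the principle behind the PSV-style chart in Figure 19: public
    # value is created in an interval when the outcome score and cost
    # effectiveness both increase. All yearly figures here are hypothetical.

    years = [1999, 2000, 2001, 2002, 2003]
    outcome_score = [62, 65, 70, 69, 74]                  # composite program outcome index
    cost_effectiveness = [1.00, 1.03, 1.08, 1.06, 1.12]   # outcomes per dollar, indexed to 1999

    for i in range(1, len(years)):
        d_outcome = outcome_score[i] - outcome_score[i - 1]
        d_cost_eff = cost_effectiveness[i] - cost_effectiveness[i - 1]
        if d_outcome > 0 and d_cost_eff > 0:
            verdict = "public value created (both improved)"
        elif d_outcome < 0 and d_cost_eff < 0:
            verdict = "public value lost (both declined)"
        else:
            verdict = "mixed result"
        print(f"{years[i - 1]}-{years[i]}: outcome {d_outcome:+d}, "
              f"cost-effectiveness {d_cost_eff:+.2f} -> {verdict}")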

Documentation of results through this kind of chart or visual device should include background material on methods and measurement issues. This framework advocates the use of a wide range of data and analytical styles, many of which are considered controversial or suspect in some settings. Decision makers and analysts often have deep-seated biases about the validity of qualitative data, social statistics, and other non-traditional material for ROI analysis. Those performing a public value assessment must therefore be thorough in providing the rationale and supporting material for all results, and must be attentive to the issues of interpretation and validity that may affect how key members of their audience respond to the assessment results.

The principles for SROI analysis used by the REDF organization provide valuable guidance for the conduct of the kind of assessment described here. These principles can be applied to traditional ROI analysis as well, but seem particularly well suited to the public value issues involved with this framework and related methods. The principles make an appropriate bridge from the general ideas and methods presented here to the difficult work of carrying out public value and ROI assessment in practice.

SROI Design Principles: (17)
  1. Feasible - A basic SROI Analysis should be something any organization can afford to prepare itself.
  2. Accessible - The process should be understandable and relevant to organizations at various stages of development.
  3. Rigorous - The method should be substantive and well-executed, and based upon premises that are validated by informed practitioners.
  4. Replicable - The framework should result in similar conclusions when applied by different practitioners who use similar parameters (such as the scope and options). Thus, results should also be comparable over time and among organizations, at least among analyses that use similar options and where the options are clearly noted.
  5. Transparent - The process by which the analysis was prepared, and the context in which results should be seen, should be transparent.
  6. Credible - The results should be credible to investors, purchasers, managers, and other users.
  7. Integrative - The framework should relate to, and where possible integrate with, other approaches to understand social value.
  8. Avoids misuse - Proper application of the framework should reduce the risk of misuse of, or misleading, SROI numbers or analyses.
  9. Open source - The framework should continuously be informed and improved by the collective wisdom of practitioners in an inclusive, iterative process.
  10. Useful - Applying the framework should result in information that enables users to make decisions or take actions that further their goals.