Treasury Board of Canada Secretariat

Outcome Management: Lessons Learned and Best Practices


2  Key Findings

2.1   Lessons Learned from Outcome Management in the GoC

Ten key Outcome Management lessons learned were identified through the interview process.  They are summarized below along with illustrative examples from the projects.  These lessons highlight the strengths and weaknesses of using an Outcome Management approach for the first two stages of the Outcome Management process (none of the projects interviewed had progressed far enough to reach Stage 3 or Stage 4), and they provide direction for future development of the Outcome Management method and practice.

Lesson 1:  Use Outcome Management to focus on the initiative’s outcomes and results, not just the outputs.  Align these outcomes with departmental and government priorities.

Outcome Management was successfully used by projects and initiatives to challenge current business assumptions and to focus the team on the outcomes (and value created) in addition to the outputs or deliverables produced.  Unlike the cost-benefit analysis approach, this structured review asks why activities are being done in addition to what they produce; it not only clarifies the vision and value but also highlights areas where improvements can be made.  Outcome Management highlights an initiative’s alignment with government and departmental priorities and articulates this alignment early in the project’s lifecycle.

NRCan’s GeoBase initiative used Outcome Management as an agent of change to understand and articulate the issues and the relevance of GeoBase (a repository of geospatial data).  Using the Outcome Management process, the team was able to identify how GeoBase supports key government priorities, such as emergency preparedness, and therefore provides value beyond the data itself.

Using Outcome Management, PWGSC’s Acquisitions Branch Portfolio Management Project identified two or three IT systems (outputs) that could be retired – one of which the business users identified as having little business value even though it was still consuming resources.

Lesson 2:  Outcome Management is a strong team-building exercise.  More than that, it can be used to articulate both the business and the information technology outcomes of the initiative.

By following the Outcome Management process, the project team and stakeholders must clearly and precisely define the value of their project or initiative – in both business and information technology terms.  This process can be used to unite the team around a common understanding, allowing the project to start with everyone on the same page, and it promotes effective communication through the precise articulation of outcomes in a common language.  It can be used to gain agreement on many issues and challenges, especially for horizontal initiatives that often include varied perspectives, and to close the gap in viewpoint between business and technology staff.  Outcome Management also highlights the business results of a project or initiative – the raison d’être of making an investment, which is all too often overlooked (especially in technology-based projects).

The RCMP’s Real-Time Identification initiative was initially seen as an IT project.  The Outcome Management process engaged the business participants, broadening the perspective and demonstrating that the project provides measurable business value.

PWGSC’s Shared Travel Services Initiative used the process and the deliverables to orient new team members.  The team now has not only a common understanding of the objectives and outcomes, but also a common language that everyone can use.

Lesson 3:  Engage all stakeholders in the process – especially if the initiative crosses organizational or jurisdictional boundaries – to leverage the results of the Outcome Management process.

Having all the stakeholders participate consistently in the process (with no substitutes), along with strong executive support, is critical for ensuring good results.  A broad representation of the right stakeholders – including external stakeholders – is the key to success.  This is especially important when the initiative is a horizontal program or when stakeholders cross jurisdictional boundaries.

Furthermore, participating in the process was viewed as one of the most valuable results of Outcome Management.  The facilitated workshops are a key ingredient of Outcome Management, and a strong facilitator is essential for ensuring balanced input.  The report produced at the end of the process, however, could be made more usable.

CRA’s e-Information project used Outcome Management to bring three different departments – each with its own vision – together to create a common understanding.  The process showed the linkages and interrelationships between stakeholders.  The sessions were not seen as arduous – they took only three half-day sessions and were well facilitated.

Lesson 4:  Outcome Management provides increased flexibility in defining intangible or “soft” benefits.

Initiatives are often faced with the challenge of defining and communicating the important soft or intangible benefits associated with the project.  Outcome Management provides increased flexibility in the definition of benefits or results and can be effective when used to describe and measure intangible benefits such as public good.  Unlike the cost-benefit analysis approach, it makes the linkage between project activities and these “softer” outcomes clear through its cause-and-effect logic model.  Outcome Management’s value case concept also extends the traditional business case – which tends to centre on costs alone – to better express soft benefits and stakeholder impacts.

The RCMP’s Real-Time Identification project used the value case to articulate the intangible benefits of the initiative more clearly and effectively.

Lesson 5:  Outcome Management should be conducted early in the initiative lifecycle with as much detail as possible.  It should also be done at major gates during the execution of the initiative.

Starting Outcome Management early in the initiative lifecycle is good practice – for example, before preliminary project approval (PPA), where it applies.  The Outcome Register (an Outcome Management tool) is most useful when it specifies performance measures and accountabilities – the more complete the Register, the better.  The Outcome Register should also be reviewed and updated at key points during the project lifecycle – for example, at effective project approval (EPA) and at specified review points – to ensure that it remains relevant and the initiative remains aligned.

The Inter-Agency Committee on Geomatics, led by NRCan and representing 10 different departments with different objectives, used Outcome Management to precisely define the outcomes of the working group.  This precision forced the team to challenge assumptions and clarify common goals.
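
As an illustration only – the report does not prescribe a format for the Outcome Register, so the structure and field names below are hypothetical – a register entry might record an outcome, its accountable owner, its performance measures with baselines and targets, and the review points (such as EPA) at which it will be revisited.  A minimal sketch, using Python purely for compactness:

    # Hypothetical sketch of an Outcome Register entry; field names are illustrative,
    # not taken from the Outcome Management method itself.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PerformanceMeasure:
        name: str        # what is measured
        baseline: float  # current level
        target: float    # level the initiative commits to reach
        unit: str        # e.g. days, dollars, percentage points

    @dataclass
    class OutcomeRegisterEntry:
        outcome: str            # the intended result, not the output
        accountable_owner: str  # who answers for realizing the outcome
        measures: List[PerformanceMeasure] = field(default_factory=list)
        review_points: List[str] = field(default_factory=list)  # e.g. "EPA", "annual review"

    # Illustrative entry with invented values
    entry = OutcomeRegisterEntry(
        outcome="Reduced turnaround time for processing client requests",
        accountable_owner="Director, Client Services (hypothetical)",
        measures=[PerformanceMeasure("Average turnaround", baseline=30.0, target=10.0, unit="days")],
        review_points=["Effective project approval (EPA)", "Annual review"],
    )

The more complete each entry – measures, accountabilities, and review points – the easier it is to revisit the Register at EPA and subsequent gates, as the lesson above suggests.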

Lesson 6:  Integrate Outcome Management with existing methods, frameworks, and management tools.  Use it to strengthen or replace other deliverables.

Outcome Management needs to be aligned and integrated with existing federal government processes and best practices, such as the project approval process and the project management framework, and with other deliverables such as business cases, project plans, and scorecards.  When it is integrated, the benefits grow.  The value case extends the concepts of the business case to give a more comprehensive description of the initiative.

PWGSC’s Shared Travel Services Initiative engaged internal audit in the process, strengthening the linkages between the project results and the model that will be used to audit the project.

The RCMP’s Real-Time Identification project incorporated the measurable outcomes into its corporate performance measurement program.

Lesson 7:  Successful Outcome Management requires champions, education, and communication.  Tools and methods can be improved.

While the concepts of Outcome Management are becoming more common in business management culture in general and in the GoC specifically, a successful implementation requires executive support (including Treasury Board), awareness building, promotion of the process, and communication with participants.  Better outputs from Outcome Management – user-friendly tools and senior management communication tools – would also improve the effectiveness of the approach.

NRCan’s GeoBase initiative used the results of the Outcome Management process to develop one-page case studies – rather than charts and diagrams – to demonstrate the value of the initiative to senior management.  These one-page case studies, which describe success stories, were well received.

Lesson 8:  Outcome Management still needs to overcome systemic challenges in the government context.

Outcomes – including tangible workforce adjustments and cost savings – are difficult to realize in government.  Initiatives often realize these outcomes gradually, and the anticipated cost and workforce savings are frequently reallocated to other areas as part of the natural process of managing operational priorities.  In other words, because outcomes are generally realized slowly rather than in a “big bang”, they often are not attributed to the initiative.  In addition, in a complex environment it is difficult to isolate the effect of a single initiative on a set of outcomes.

Participants are often reluctant, and even apprehensive, about specifying and later harvesting outcomes because of the impression that any savings may be taken away from operating units.  At least one respondent remarked that some benefits were explicitly excluded from the business case precisely because they might be “scooped up” before they could actually be harvested.  This is understandable in an era when taxpayers demand efficiencies, but it needs to be balanced against what will be identified voluntarily.  Section 3 further discusses similar challenges and opportunities.

Lesson 9:  Cost-benefit analysis was useful for documenting an initiative’s costs and areas for cost avoidance, as well as for conducting options analysis.

Included in the interviews were four projects that used cost-benefit analysis rather than the Outcome Management approach.  The cost-benefit analysis resulted mainly in a thorough identification of costs and, in some cases, established a baseline of current costs in an activity-based costing framework.  In addition, the analysis was able to identify some areas of cost avoidance as justification for the proposed investment.  Finally, the technique demonstrated that it can be used to conduct an options analysis that guides management in selecting the most effective option from amongst those available and considered.
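
For illustration only – the option names, figures, and discount rate below are hypothetical and are not drawn from the interviewed projects – an options analysis of this kind can be reduced to comparing the net present value (NPV) of each alternative’s costs and cost avoidance over a common time horizon, then selecting the option with the highest NPV.  A minimal sketch in Python:

    # Hypothetical options analysis: compare discounted net cash flows per option.
    # All figures are illustrative; they do not come from the interviewed projects.

    def npv(cash_flows, discount_rate):
        """Net present value of yearly net cash flows (year 0 first)."""
        return sum(cf / (1 + discount_rate) ** year for year, cf in enumerate(cash_flows))

    # Yearly net cash flow = cost avoidance minus cost, over a four-year horizon.
    options = {
        "Status quo":     [0, 0, 0, 0],
        "Upgrade system": [-500_000, 150_000, 250_000, 300_000],
        "Replace system": [-900_000, 200_000, 400_000, 450_000],
    }

    rate = 0.05  # illustrative discount rate
    for name, flows in sorted(options.items(), key=lambda kv: npv(kv[1], rate), reverse=True):
        print(f"{name}: NPV = {npv(flows, rate):,.0f}")

A comparison of this kind supports the selection of the most effective option, but it captures only direct, quantifiable costs and benefits; the softer outcomes discussed in Lesson 4 and in Lesson 10 below remain outside its scope.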

Lesson 10:  Cost-benefit analysis identifies an initiative’s direct benefits; Outcome Management also provides a clearer definition of soft benefits (refer to Lesson 4).

Cost-benefit analysis resulted in a thorough definition of direct benefits; however, when these interview results were contrasted with the sample of projects that used Outcome Management, the approach did not produce as broad a definition of outcomes, particularly indirect ones such as program outcomes, good governance outcomes, or Results to Canadians.  One of the projects used cost-benefit analysis very effectively to drive business change.

Canada Customs and Revenue Agency’s NETFILE project used cost-benefit analysis to identify areas for business improvement and, because of the transactional nature of the domain, was able to build on this analysis and move towards activity-based costing in seven regions.

2.2   Best Practices in Outcome Management from Other Countries

This section presents best practices in Outcome Management from outside the Canadian context, drawn from Organisation for Economic Co-operation and Development (OECD) member nations that have implemented their own forms of Outcome Management.

Complement Performance Monitoring with Evaluations

To ensure optimal decision-making, performance monitoring and evaluations should be viewed as complementary elements of an overall Outcome Management approach.  Performance monitoring by itself can alert managers to problems that arise with regard to performance, but it will not typically present solutions, as the data alone may not be sufficient to solve most problems.  Evaluations determine the reasons behind the performance, such as cause-and-effect relationships, and make recommendations on how to improve.  Such activities are often overlooked in favour of simply presenting performance monitoring data; however, organizations need to integrate both monitoring and evaluation in an Outcome Management system to produce better decisions and results. See footnote 2

Adopt a Results-Oriented Set of Balanced Measures

Balanced measures are part of a strategic management system for achieving long-term performance goals.  Specifying balanced measures involves taking into consideration all stakeholders – including management, employees, partners, and the public – at all stages when conducting performance evaluations.  It also means going beyond the standard financial measures of a traditional business case to include both quantitative and qualitative measures that touch on “softer” issues such as morale, public health, efficiency, and social and environmental aspects.  Using balanced measures offers a way for an organization to track the various factors that make up successful performance and outcomes.  The balanced measures approach sets an organization’s focus across employee, client, and business perspectives.  Prevalent in the private sector, this approach has made its way into assessing government initiatives and has been implemented in departments and agencies of the United States. See footnote 3

Another approach often used for achieving a balance of quantitative and qualitative outcomes is the Triple Bottom Line (TBL), which encompasses financial, environmental, and social results.  The Government of South Australia has begun implementing TBL reporting across departmental agencies, and notes that the benefits of using TBL include enhanced reputation, benchmarked performance, improved risk management, and improved communication with stakeholders. See footnote 4

The OECD E-Government Project released a report in 2005 outlining the costs and benefits of e-government across jurisdictions, and found that most public agencies operate with multiple “bottom lines” and that many are beginning to use a balanced measures approach in realizing outcomes. See footnote 5  Appendix C summarizes the Outcome Management frameworks of the five countries that have most fully adopted this approach.

Align Measures with Accountability and Decision-Making Authority

In developing performance measures, organizations should ensure that these measures are aligned with accountability and decision-making authority.  Measures should directly relate to assigned roles and responsibilities, and individuals should only be held accountable for the areas in which they have influence.  Managers should lead by example and cascade accountability across the organization by creating an outcome-based organization and encouraging the sponsorship of measures at all levels.  Staff should be kept informed at all stages of the process, and the public should also be kept informed through the Internet and traditional reports.  After successful initiatives, staff should be rewarded on a team basis. See footnote 6

The complexity of accountability and authority rises quickly when horizontal or vertical initiatives are undertaken, because these involve multiple jurisdictions that may be complementary or opposing in their mandates or strategic objectives.  Organizational members must know and understand their responsibilities and what they contribute to the group’s goals.  Outcome owners and managers should have the ability to identify their own expected results and methods for data collection to ensure better reporting, decision-making, and outcomes. See footnote 7  Further to this, Benko and McFarlan in Connecting the Dots propose aligning project portfolios with objectives to enhance both public trust and return on investments. See footnote 8

Give Managers Autonomy

Managers given the responsibility of accounting for initiatives should at the same time be given the decision-making authority and the ability to shift resources from low-performing activities and projects to ones that are performing at a higher level.  This idea of managing a portfolio of initiatives, as opposed to a single project, is fundamental to obtaining the maximum set of results.  Unless invested with this authority, managers are unable to directly improve performance results and become disengaged from the process. See footnote 9  Maizlish and Handler in IT Portfolio Management present an approach to simplifying the process of achieving a rationalized and aligned project portfolio. See footnote 10  In addition, Lebow and Spitzer in Accountability propose a “freedom-based” approach to responsibility and workplace design that advocates individual freedom and personal accountability. See footnote 11

Use Measures that Provide Insight, not just Data

In selecting performance measures, there is often a tendency to choose ones simply because they are easily measured and produce ample amounts of data.  Instead of being based on ease of data collection, measures should be related to organizational and strategic goals, and should provide relevant and timely information for decision-makers to assess the progress of an initiative.  The measures should indicate the efficiency of the process, the results in comparison with the initiative’s intended goals, and the effectiveness of particular activities in terms of their contribution to overall program objectives.  Well-selected measures can describe the direction and accomplishments made during the course of a program, as well as serve as a guide for future improvements in serving stakeholders. See footnote 12  The Auditor General of Canada’s Implementing Results-Based Management: Lessons from the Literature presents best-practice findings from across Canada and other jurisdictions for developing performance measurement systems. See footnote 13

Adaptation Instead of Adoption

Best practices from one organization are not necessarily suited to another, even for similar initiatives.  They can, however, be adapted to fit a group’s needs and culture.  For example, a U.S. performance management study found that departments implementing the Balanced Scorecard™ approach developed by Kaplan and Norton See footnote 14 had adapted it into a set of measures uniquely suited to the structure, culture, and goals of the organization. See footnote 15

Conduct Pilot Projects

Pilot projects are useful for testing new outcome-based management systems.  The test period allows for problems in the Outcome Management approach to be discovered early on and worked out before the full program is launched.  To be completely effective, pilot projects should imitate exactly how the final program will operate and must last long enough to clearly indicate the potential of the system.  This requires that resources be allocated and that a representative sample of group members participate in the trial run. See footnote 16

Manage Cultural Change

In addition to ensuring that new systems and structures are in place when managing change in an organization, it is also important to manage changes in the organizational culture, which includes the norms, values, and behaviours of the organization’s members.  Cultural change also involves formal and informal rewards and recognition mechanisms to foster the new desired behaviour, and consequences for those who do not adopt it.  Outcome Management requires a renewed focus on learning and integrating lessons into decision-making, on results rather than processes, and on transparent performance reporting.  In formulating and adopting a change management strategy, the changes brought about by a new results-based system can be supported through coaching, staff training, help desks, technical assistance, or knowledge bases.  Managing cultural change allows new organizational values and procedures to be institutionalized, and allows performance to adapt to new standards. See footnote 17