
Guide for the Development of Results-based Management and Accountability Frameworks

Section 1. Introduction to the Results-based Management and Accountability Framework

This is a guide to assist managers and evaluation specialists in establishing a foundation to support a strong commitment to results, a prime responsibility of public service managers. As outlined in the management framework for the federal government, Results for Canadians, public service managers are expected to define strategic outcomes, continually focus attention on results achievement, measure performance regularly and objectively, learn from this information and adjust to improve efficiency and effectiveness.

The Results-based Management and Accountability Framework (RMAF) is intended to serve as a blueprint for managers to help them focus on measuring and reporting on outcomes throughout the lifecycle of a policy, program or initiative. This document describes the components of a Results-based Management and Accountability Framework and offers some guidance to managers and evaluation specialists in their preparation.

1.1 What is a RMAF?

Whether related to a policy, program or initiative, a Results-based Management and Accountability Framework is intended to help managers:

  • describe clear roles and responsibilities for the main partners involved in delivering the policy, program or initiative - a sound governance structure;
  • ensure clear and logical design that ties resources to expected outcomes - a results-based logic model that shows a logical sequence of activities, outputs and a chain of outcomes for the policy, program or initiative;
  • determine appropriate performance measures and a sound performance measurement strategy that allows managers to track progress, measure outcomes, support subsequent evaluation work, learn and make adjustments to improve on an ongoing basis;
  • set out any evaluation work that is expected to be done over the lifecycle of a policy, program or initiative; and
  • ensure adequate reporting on outcomes.

If successfully developed, the Framework should represent:

  • an understanding between the partners on what they aim to achieve, how they plan to work together to achieve it, and how they will measure and report on outcomes;
  • a tool for better management, learning and accountability throughout the lifecycle of a policy, program or initiative; and
  • an early indication that the policy, program or initiative is set up logically - with a strong commitment to results - and with a good chance to succeed.

1.2 Why Do We Need a RMAF?

The management framework for the federal government, Results for Canadians, sets up the expectation that managers will focus on measuring progress toward the attainment of the results of their policies, programs and initiatives such that ongoing improvements can be made. The Treasury Board (TB) Policy on Transfer Payments formalizes the requirement for a RMAF as part of a TB submission, and the TB Evaluation Policy indicates that there are other occasions when a RMAF may provide benefits to managers, even when not required under the TB Policy on Transfer Payments.

The Government's direction and policy is to provide members of Parliament and the public with relevant, accurate, consolidated, and timely information on how tax dollars are being spent and what Canadians receive as a result. The Government of Canada is committed not only to measuring and reporting on results, but also to establishing clear standards against which actual performance will be reported.

Three parliamentary instruments are crucial in working towards these objectives. Departmental Reports on Plans and Priorities (RPP), which are tabled in the spring along with the government's Main Estimates, report on the rationale for initiatives and establish the strategic outcomes against which actual performance will be measured. Departmental Performance Reports (DPR) are Estimates documents, which are tabled in the fall. They report on achievements against the strategic outcomes that were established in the departmental RPP. The third key document is Managing for Results which is also tabled each fall, along with the DPR, as part of the "Fall Reporting Package." This government-wide report on performance is now being refocused to summarize Canada's progress within a set of key societal indicators.

All three of these reports are tabled in Parliament by the President of the Treasury Board and may be referred to the relevant Standing Committee of the House of Commons for further review.

The form and focus of departmental planning and reporting is drawn from an organization's Planning, Reporting and Accountability Structure (PRAS). The Departmental PRAS, a Treasury Board approved document, provides the framework by which the RPP and DPR are developed and resources are allocated to most federal organizations. The PRAS requires departments and agencies to clearly outline the shared outcomes they want to achieve on behalf of Canadians.

The RMAF should be prepared at the outset of a policy, program or initiative, ideally at the time when decisions are being made about design and delivery approaches. When the RMAF is part of a Treasury Board submission, its approval is implicit in the approval of the submission. RMAFs prepared outside a Treasury Board submission process, however, need to proceed through an approval process, given that the RMAF represents a serious commitment to results measurement and reporting.

In order to better meet commitments under the Social Union Framework Agreement (SUFA) to improve accountability and transparency to Canadians, managers should consult the SUFA Accountability Template in the development of RMAFs. This comprehensive template, which reflects all aspects of SUFA accountability provisions, is the basis of a pilot project in support of the Federal Government's SUFA accountability commitments.

Although RMAFs generally address most of the measurement and reporting requirements in the SUFA template, there are specific areas that may require examination. These include areas related to: mechanisms to engage Canadians in the development and review of social policies and outcomes; establishment of mechanisms for Canadians to appeal administrative practices; use of comparable indicators where appropriate; and tracking Canadians' understanding of the Federal contribution to policies, programs and initiatives.

1.3 Continuum of Results Measurement

The measurement of results is not an isolated activity. Rather, the process of measuring results begins with the design of a policy, program or initiative and evolves over time. Different results-measurement activities occur at different points in time, but always as part of the ongoing management of a policy, program or initiative. This continuum, from the initial consideration of performance measurement, through performance monitoring to formative and summative evaluation, is presented in Exhibit 1.1.

The diagram offers a pictorial view of the key stages and the process required to develop performance measures for any given policy, program or initiative. Although shown as a linear process, performance measurement development is iterative, and review and feedback are therefore critical parts of the process.

The development of a RMAF would involve stages 0 to 3 in this continuum - in essence, establishing the commitment for outcomes measurement. This is not an end in itself, however. The ability to measure and report on results requires the 'implementation' of the RMAF - and this takes managers through stages 4 to 6, and lays the groundwork for evaluation activities (i.e. stages 7 and 8).

While program managers are accountable and need to be integrally involved in every stage, most organizations have specialists who can facilitate the development and implementation of results measurement. Notably, evaluators, key to stages 7 and 8, can also play a useful facilitation role in stages 0, 1 and 2. Likewise, information management specialists could be key advisors in stages 3 and 4. This is discussed in more detail in a later section of this Guide.

Exhibit 1.1: Continuum of Results Measurement

1.4 Who Should Be Involved in the Development of a RMAF?

There are three key parties involved in the development and implementation of a Results-based Management and Accountability Framework: managers, evaluation specialists and, in the case of frameworks that form part of Treasury Board submissions, analysts of the Treasury Board Secretariat.

Managers hold the primary responsibility for the development of the RMAF. Managers are:

  • responsible for ensuring that the content of the framework is accurate and that it reflects the design and operation of the policy, program or initiative, as well as all reporting requirements; and
  • responsible for implementing the RMAF, that is, ensuring that data are collected and reported on accordingly.

Evaluation specialists can be an effective support to managers in this process:

  • working with managers to provide important guidance and technical expertise throughout the development and implementation of the Results-based Management and Accountability Framework; and
  • assisting in the development of the logic model, facilitating the development of an appropriate set of performance measures, and advising on key methodologies and measurement issues implicit in the performance measurement and evaluation strategies.

When RMAFs are developed to meet a Treasury Board commitment, analysts of the Treasury Board Secretariat can advise departmental managers and evaluators on general requirements related to the framework before it is approved by the departmental Minister and submitted to the Board. As such, it may be helpful to consult with this group during the preparation of a RMAF.

1.5 What are the Guiding Principles for this Process?

The development and implementation of a Results-based Management and Accountability Framework should be conducted under the following guiding principles:

  • utility - to ensure that managers can use the framework to explain their policies, programs and initiatives to Canadians, to institute sound performance measurement approaches and to manage for results;
  • shared ownership - to meet the needs of all stakeholders, with the active involvement of managers, ensuring that the information needs of managers, as well as formal accountability requirements, are met;
  • transparency - to ensure that all stakeholders understand what outcomes are expected as well as how and when they will be measured;
  • decision- and action-oriented - to ensure that information needed by managers and other stakeholders is available when it is required for key decisions;
  • credibility - to ensure that professional standards (see footnote 1) are adhered to and that the framework establishes realistic commitments for measurement and reporting; and
  • flexibility - to respond to the ever-changing context within which policies, programs and initiatives operate, the framework needs to be regularly revisited and adapted as necessary.

While there is not a specific required length for a RMAF, to be most useful the final document should consist of a concise presentation of each of the necessary components of a RMAF. A framework might be as short as 10 to 15 pages and, preferably, no longer than 30 to 35 pages. Managers should use their judgement in making decisions about the level of detail required, considering issues such as whether the information exists elsewhere (and thus does not need to be replicated in great detail) and maximizing the probability that the document will be utilized (thereby restricting length to a manageable size).

The next sections of this document guide the reader through the components of a RMAF and the steps involved in their development. Key concepts from a lexicon developed for use by Treasury Board in the context of performance measurement and evaluation are presented in Annex A.
