
Five-Year Evaluation of the Management Accountability Framework


Archived information

Archived information is provided for reference, research or recordkeeping purposes. It is not subject to the Government of Canada Web Standards and has not been altered or updated since it was archived. Please contact us to request a format other than those available.

4  Recommendations

As demonstrated in the previous section, evidence gathered from multiple lines of evidence was consolidated, analyzed and synthesized into the 10 key findings reported.  Based on these results, we have developed five recommendations to address our principal findings.  These recommendations support the evolution of MAF as a performance enabler and sustain its usefulness as a tool for both TBS and deputy heads in striving toward management excellence.

The following illustration maps the five key recommendations to the 10 principal findings and the lines of evidence used to support the findings.

Figure 7: Mapping of Evaluation Recommendations to Principal Findings and Lines of Evidence


4.1  Detailed Recommendations

The following section presents our recommendations, including the benefits and risks associated with their implementation.  Management responses have been developed and are presented in a separate document.

1.  Implement a risk/priority based approach to the MAF assessment process.

To ensure that MAF addresses the most relevant indicators for a particular organization and embeds sufficient incentives for senior executives, we recommend that TBS consider implementing a risk/priority approach for MAF assessments.  Possible approaches to achieving this, which would require further analysis, are provided below.

Assessment based on the risks/priorities unique to the organization:  While the context letter provided to the deputy head with the final assessment does include key priorities for focus, TBS could, after consultation with departments and agencies, identify which indicators address the main priorities and risks of that organization for that year.  This discussion could occur in conjunction with debriefs of the previous round’s assessment.

To limit the amount of time and resources required to obtain agreement on the indicators, a variation of this recommendation could include grouping departments and agencies (e.g., by industry sector or size) and identifying the indicators that should be assessed, based on the risks and priorities of the group.  

Assessment based on clustering of indicators:  Classify the indicators into three categories:  mandatory, optional and cyclical.  Mandatory indicators (e.g., financial management and control) would be assessed every year for all organizations.  For the optional category, the assessment cycle could be based on the risks/priorities of each departmental cluster (e.g., by industry sector or size).  Finally, the cyclical category might include a rotating set of indicators that reflect current government priorities.  With sufficient advance notice, the categorization of the indicators could be subject to change.

Assessment based on the results of previous rounds:  Review the previous MAF assessment score.  Based on the performance for specific indicators, a department or agency rated as “Strong” in the current assessment round would not be formally assessed in the subsequent round.  This would provide a one-year “pass” from formal assessment.  This approach could also be combined with a self-assessment for those indicators not being assessed to confirm that no significant changes have taken place during the year.  A similar exercise was attempted for AoM#1 in Round VI.

Assessment based on department/agency size:  We agree with and recognize that TBS has already applied this approach to assessments.  Small and micro agencies are assessed using a three-year cycle and a ‘light’ version of the MAF.  TBS may wish to consider expanding on this concept by establishing separate MAF criteria and reporting requirements for small agencies, i.e., a “MAF-light”, to address the disproportionate burden felt by small agencies under the current MAF assessment approach.

We believe the main benefit of a risk-based approach is the flexibility to tailor the assessment to the department or agency's specific needs; it provides an explicit means for organizations and TBS to agree on what is important.  A risk-based approach also supports a reduction in the departmental and TBS resources required to manage the MAF assessment process.

There are two identified risks to using this approach.  First, if all aspects are not measured every time, there is a risk that a substantive issue may be missed or overlooked.  Second, if a risk-based approach is used, comparisons across organizations may be difficult.

2.  Develop the guiding principles (“golden rules”) for assessing managerial performance and incorporate them into the existing MAF assessment methodology.

While each AoM measures a separate management area, all support an overall assessment framework and methodology.  We recommend that a common set of guiding principles or “golden rules” be identified, agreed upon and applied to each AoM.  Examples of golden rules that should be considered are as follows.

  1. Maintain an appropriate balance between quantitative and qualitative indicators within each AoM.
  2. Develop outcome-based indicators of managerial performance.
  3. Leverage information that is available through other oversight activities that support the indicators identified for each AoM.
  4. Maintain the stability of indicators, which is critical to the assessment of progress over time.
  5. Ensure clarity and transparency of indicators, measurement criteria and guidance documentation.
  6. Engage functional communities in open dialogue in the ongoing development of AoM methodology and assessment measures.
  7. Include assessment and identification of both policy compliance and result-based managerial performance.
  8. Seek ways to recognize and provide incentives to encourage innovation across departments and agencies.

The following paragraphs provide further explanation for the recommended golden rules outlined above.

a)  Balance quantitative and qualitative indicators:  We recommend that TBS review the lines of evidence and consider the appropriate balance between objective, quantitative indicators and qualitative, subjective indicators.  Where subjectivity is required, sufficient definitions are necessary to minimize the risk of inconsistent application across departments and agencies. 

b)  Focus on outcome-based measures:  When assessing elements of managerial performance (as compared to policy compliance), outcome-based indicators (vs. process-based) allow an organization to assess whether the management practices put into place had an impact on the quality of decisions and actions.  We recommend that when reviewing the lines of evidence, consideration be given to structuring those indicators that are measuring managerial performance as outcome-based to provide a more accurate representation of the impact of the management practices put into place.  As the MAF logic model developed by the MAF Directorate highlights expected outcomes of MAF, this model could be leveraged to develop these indicators.

c)  Leverage existing and available information:  Once indicators and measurement criteria have been developed, the sources of information used to assess against the indicators must be established.  Leveraging existing information through other oversight mechanisms both internal to departments and agencies and from central agencies (e.g. Office of the Comptroller General, Auditor General of Canada) would maximize the efficiency of the reporting element of the MAF assessment process.

d)  Stability of indicators:  Once indicators have been developed for a particular AoM and overall agreement and support have been obtained, they should ideally remain stable.  Changes should be considered only when there are significant changes to the environment relative to that managerial area.

e)  Clarity and transparency of indicators, criteria and guidance documentation:  To the extent possible, all indicators, lines of evidence and measurement criteria should be clear, simple to understand and transparent in the approach that leads to an assessment result.  In conjunction with recommended golden rule a), to the extent that subjectivity is built into an indicator, clarity and transparency on the application of judgment will be required.   

As part of the assessment methodology, we recommend that a consistent scoring approach be developed that will facilitate a standard yet flexible overall score per AoM.  A weighted average approach would support this goal.
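To illustrate how a weighted-average approach could yield a standard yet flexible overall score per AoM, the following sketch converts per-indicator ratings into a single score. The rating scale values, indicator names and weights are illustrative assumptions only, not the actual MAF methodology or scale.

```python
# Illustrative sketch of a weighted-average AoM score.
# The rating values and weights below are hypothetical assumptions,
# not the scale defined in the MAF assessment methodology.
RATING_VALUES = {
    "strong": 3,
    "acceptable": 2,
    "opportunity for improvement": 1,
    "attention required": 0,
}

def aom_score(indicator_ratings, weights):
    """Combine per-indicator ratings into one weighted-average score.

    indicator_ratings: dict of indicator name -> rating label
    weights: dict of indicator name -> relative weight
    """
    total_weight = sum(weights.values())
    weighted_sum = sum(
        RATING_VALUES[indicator_ratings[name]] * weight
        for name, weight in weights.items()
    )
    return weighted_sum / total_weight

# Hypothetical example: one heavily weighted indicator and two others.
ratings = {"planning": "strong", "reporting": "acceptable", "controls": "acceptable"}
weights = {"planning": 2, "reporting": 1, "controls": 1}
score = aom_score(ratings, weights)  # (3*2 + 2*1 + 2*1) / 4 = 2.5
```

The weights are the flexible element: an AoM lead could emphasize the indicators most relevant to a given round while keeping the scoring formula itself standard across organizations.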

We recommend that TBS consider changing the terminology used in the MAF assessment scale.  Our consultations indicated that the term “acceptable” carries a negative connotation.  Alternative terms such as “meets standards”, “meets expectations” or “well placed”, among others, may be considered.

f)  Engage functional communities in open dialogue in the ongoing development of AoM methodology and assessment measures:  When AoM methodology and measures need to be updated and refined, we recommend that departmental stakeholders be consulted during the development process.  TBS could leverage the applicable functional communities to facilitate an ongoing and cooperative dialogue, ensuring that changes meet the needs of different stakeholders and increasing the likelihood of acceptance by departments and agencies.

g)  Assessment and identification of measures of policy compliance versus managerial performance:  We agree that MAF should continue to measure both managerial performance and policy compliance.  We recommend, however, that TBS separate the assessment of managerial performance and policy compliance by identifying which indicators measure performance and which measure compliance.  This could include different approaches to assess policy compliance versus managerial performance.

  • Policy compliance indicators could use an evidence-based, primarily quantitative approach.  In contrast, it may be more appropriate to measure managerial performance using self-assessment, interviews or surveys to gain insight into the more qualitative indicators. 
  • Information available through other oversight/reporting mechanisms is more likely to support policy compliance indicators.  For example, some data for policy compliance indicators can be found in an organization’s audit reports.

h) Recognize and provide incentives to encourage innovation:  To continue to encourage innovation and creativity across the Federal government, TBS has an opportunity to provide incentives and recognize these efforts.  There are various approaches to embed innovation within the MAF, as follows: 

  • Development of an “innovation” AoM:  TBS could consider adding a voluntary AoM related to “innovation” for which departments and agencies would submit evidence to support their efforts towards innovation.  Annually, the assessment of innovation could be targeted to a specific government priority, e.g., “greening of government”.  An assessment against established criteria should be performed by qualified Federal government and external panel members, with associated public recognition and awards for those with the highest ratings in this category.
  • Innovation embedded within existing “strong” rating criteria:  Innovation could be embedded as a component of the “strong” rating criteria within each AoM being assessed.  Based on the results of the MAF assessments, organizations demonstrating innovation within different AoMs, as identified through the rating definitions, could be selected for further assessment and recognition as described above.

Incorporating the golden rules for assessing managerial performance will benefit organizations by providing greater clarity around expectations and increasing the consistency of assessments.  Further, this benefits both TBS and departments and agencies in more accurately gauging progress and focusing on specific areas of importance to the organization.

We note that if changes are made to the performance assessment framework, there is a risk to the short-term stability of the lines of evidence.  Additionally, significant resources are required to properly and exhaustively streamline indicators; this approach may need to be rolled out over time.

3.  Introduce or leverage an existing governance body with senior representatives from client departments to assist MAF.

We recommend that a steering committee of deputy heads or departmental senior executives (e.g., Assistant Deputy Ministers) be introduced, or that an existing forum be leveraged.  This governance body could be used to advise on changes to the MAF process and to use the MAF results to guide government management priorities.

  • Currently, some input is requested from departments and agencies in developing the MAF process and indicators.  To increase acceptance of the assessment process, input should be sought from the governance body on the relative impact of the MAF indicators on the affected organizations.
  • As the senior executives within the departments and agencies are collectively responsible for the managerial practices across the government, there is an opportunity to engage the governance body to review the assessment results, including horizontal issues, to discuss implications and provide input on key priorities for the upcoming year based on those results.

A governance body with senior representatives from client departments will benefit TBS by increasing the likelihood that changes to the MAF process will be accepted and acted upon within departments and agencies.  The governance body also becomes an added communication tool for TBS.  Conversely, introducing additional governance may increase the time required to reach agreement, creating a risk that decisions become delayed.

4.  Develop a stakeholder engagement and communication strategy and plan, including early engagement when changes are made to the MAF assessment process.

Our findings indicate a lack of clarity among departmental stakeholders on critical elements of the MAF process, which directly impacts their acceptance and satisfaction with the process and methodology.  There are opportunities to increase the engagement of stakeholders within the departments and agencies through enhanced communication at key steps within the MAF assessment process. 

We recommend that TBS consider developing an engagement and communication strategy and plan related to the MAF assessment rounds to increase the visibility of the process.  The strategy could address the following key gaps in understanding and clarity:

  • Classification of indicators by policy compliance versus managerial performance;
  • TBS’ quality assurance and review process prior to release of draft and final assessment results; and
  • The role of MAF assessments in the overarching system of public administration and deputy head evaluation by the COSO (communication should be coordinated with the PCO). 

Apart from direct bilateral discussions between TBS and departments, the communication and engagement strategy may be supported through the use of existing forums or communities such as Chief Financial Officers and Chief Audit Executives, as well as through a MAF governance body (as per our previous recommendation).  Additionally, the communication strategy could potentially address how to communicate the MAF process and results to parliamentarians.

We further recommend that as changes are made to the assessment process, methodology or indicators, early engagement of department and agency stakeholders be considered to allow sufficient time to respond and increase the likelihood of complete and accurate information being submitted.

Developing and executing an engagement and communication strategy and plan benefits TBS by enabling greater acceptance of the MAF process within departments and agencies.  However, we recognize that such a plan may increase pressure on TBS to provide timely updates by placing additional constraints around communication deadlines.

5.  Assign formal responsibilities within TBS to oversee the MAF assessment methodology/framework and management of horizontal issues/action plans.

We recommend that TBS consider expanding the role of the MAF Directorate to provide horizontal oversight and integration across the lines of evidence for both methodology and results.  This expanded role would require the MAF Directorate to review the indicators and criteria for consistency across all elements. 

When the MAF results are complete, we recommend that the MAF Directorate work with the AoM/indicator leads to identify horizontal (cross-departmental) issues and then develop appropriate action plans, communicate those plans and monitor the results.  The MAF Directorate would report on progress to the Assistant Secretaries and the Associate Secretary.

The benefit of assigning formal oversight responsibilities for MAF assessment methodology and management of horizontal issues is that this ensures the framework is sustained and, most importantly, holds stakeholders accountable for action plans.

4.2  Summary of the Recommendations

The following table demonstrates the alignment of each recommendation detailed above to nine key areas of the MAF (as identified by deputy heads).

Table 2: Alignment of Recommendations to Key MAF Areas

Recommendations for Evolution of MAF to a Performance Enabler:
1: Risk-based Approach; 2: Guiding Principles/Golden Rules; 3: Governance Body; 4: Engagement & Communications; 5: Horizontal Oversight

Key Areas                                               1    2    3    4    5
MAF vision and objectives                               yes  yes  yes  yes  no
MAF governance, including roles and responsibilities    no   no   yes  no   yes
MAF methodology of assessments                          yes  yes  no   yes  yes
Reliability and accuracy of MAF assessments             yes  yes  no   yes  yes
MAF reporting requirements                              yes  yes  no   no   no
Systems supporting MAF                                  no   yes  no   yes  yes
MAF process                                             yes  yes  yes  yes  yes
MAF treatment of entities                               yes  no   no   no   no
MAF alignment to GC's planning cycle                    no   yes  no   no   no


