
Five-Year Evaluation of the Management Accountability Framework


Archived information

Archived information is provided for reference, research or recordkeeping purposes. It is not subject to the Government of Canada Web Standards and has not been altered or updated since it was archived. Please contact us to request a format other than those available.

Annex D   International Comparisons

 

Name of program/framework

  • UK: Capability Review (CR)
  • US: President’s Management Agenda (PMA)
  • EU: Common Assessment Framework (CAF) [“Study on the use of the Common Assessment Framework in European Public Administration”]

Type of program/framework

  • UK: Organizational capability. The objective is to “...improve the capability of the Civil Service to meet today’s delivery objectives and to be ready for the challenges of tomorrow.” [“Take-off or Tail-off? An evaluation of the Capability Reviews programme”]
  • US: Organizational performance.
  • EU: Organizational capability. The objective is to improve the quality of public service administration.

How long has the program been in place?

  • UK: Since 2005.
  • US: Since 2001.
  • EU: First released in 2000. Major updates were made in 2002 and 2006. Adopted by approximately 1,775 public organizations in the EU by mid-2009.

Purpose of program

  • UK: Assess capability and provide support for improvement.
  • US: Improve agency performance in 5 specific areas.
  • EU: A program initiated by the EU Member States to promote public sector quality management.

Brief description of framework

  • UK: The CR contains 10 program elements in 3 broad areas: Leadership, Strategy and Delivery.
  • US: The PMA contains 24 “standards” defined across 5 key initiatives: Strategic Management of Human Capital; Competitive Sourcing; Improved Financial Performance; Expanded Electronic Government; and Budget and Performance Integration.
  • EU: The CAF is patterned after the European Foundation for Quality Management (EFQM) business model and the model of the German University of Administrative Sciences in Speyer. It contains nine criteria (five Enablers and four Results criteria), which are divided into 28 sub-criteria.

What indicators are used?

  • UK: Five assessment categories are used, ranging from 0 (“serious concerns”) to 4 (“strong-good capability for future delivery in place”).
  • US: A “double scorecard” system is used. Standards (indicators) are established to measure both the current status of initiatives and progress on improvement. Progress is scored by assessing the agency’s “Green Plan” (in this case, meaning a plan to get to “green” on all initiatives).
  • EU: Views “enablers” as organizational capability and “results” as outputs and outcomes, and includes criteria for measuring both within the framework. Based on self-assessment.

Is it an “outcomes-based model”; that is, what specific outcomes of management performance or organizational capability are being assessed?

  • UK: The focus is on organizational systems and processes that promote delivery capability.
  • US: Current status scores are verifiable through the agency’s Performance Assessment Rating (PAR). Progress criteria are developmental and more subjective.
  • EU: See above.


Assessment Methodology

How is the assessment done?

  • UK: Yearly external review of key capabilities.
  • US: Quarterly self-assessment with input from the President’s Management Council (PMC).
  • EU: Internal review, usually by project teams appointed by the organization.

Who does the assessment, and what are the characteristics of the assessors?

  • UK: Capability Review Teams situated in the Cabinet Office. A five-person team: 3 “experienced” people from outside central government and 2 directors general from other government departments. Each team usually includes 2 members from the private sector (note: teams were reduced to 3 members in the second round of assessments). [National Audit Office Assessment of the Capability Review Programme, Feb 2009]
  • US: Internal; progress indicators and Green Plans are reviewed by the Office of Management and Budget (OMB).
  • EU: Internal project teams.

Is it purely evidence-based or is assessor judgment used? What is the balance between evidence and judgment?

  • UK: Judgment: the review teams make judgments about the 10 elements on a score of 0 to 4 using an underlying set of 40 questions.
  • US: Evidence-based: internal indicators noted in the PAR. Judgment: the PMC and OMB provide input into Green Plan initiatives.
  • EU: Judgment based on the CAF and the evidence gathered for each sub-criterion, on the basis of group consensus.

Is a risk-based approach used? (Are all components of the framework assessed or are certain components used at different times?)

  • UK: All components are assessed.
  • US: All components are assessed.
  • EU: A complete assessment is done each time, but there is flexibility in how the CAF is applied.

How are organizations selected for assessment? Are all organizations assessed or is there a selection process?

  • UK: All central government agencies are assessed.
  • US: All are assessed in the quarterly scoring process.
  • EU: Self-assessment; participation is voluntary.

Are different approaches used for organizations of different types and sizes (i.e., is it a one-size-fits-all model)?

  • UK: One size.
  • US: One size.
  • EU: The CAF has to be implemented respecting the structure of the nine criteria and 28 sub-criteria and the scoring system. Further, each organization and country has to take its own specificities into consideration.

Do the results of the program affect senior management’s remuneration? What are the outcomes of good/bad performance?

  • UK: There are some linkages to salary decisions for Permanent Secretaries. The focus is on improvement; therefore, poor performance leads to assistance as required, but Permanent Secretaries are held accountable.
  • US: There are some linkages to both budget and Chief Operating Officer evaluations. The Executive Branch Management Scorecard provides an overview of progress in all agencies.
  • EU: Depends on how the framework is applied.


Outcomes of Assessment

What are the immediate results of the assessment?

  • UK: Each department must create an action plan to rectify the problems identified. [National Audit Office Assessment of the Capability Review Programme, Feb 2009]
  • US: Agencies must create “Green Plans” demonstrating how they will improve on each of the criteria assessed.
  • EU: Depends on application, but a recent survey of 133 organizations that have used the CAF found that 87% started improvement initiatives based on the assessment. [“Study on the use of the Common Assessment Framework in European Public Administration” – 2005]

What are the improvements in service delivery/performance results for citizens?

  • UK: It is difficult to separate improvements attributable to the CR from those due to other initiatives.
  • US: Consideration of PAR results fits into the PMA. Chief Operating Officers are held accountable for results.
  • EU: The survey suggests that the aforementioned improvement initiatives might or might not lead to improvements in results for citizens. This is a difficult question to answer due to attribution and time-lag issues.

Cost/value for money

  • UK: The first round of reviews cost £5.5 million directly, reduced to £4.3 million in round 2 (approximately £226,000 per department). It is difficult to assess the cost to departments of their internal work on the reviews, and difficult to determine value for money given that full costs are not known. Departments also have a hard time separating the value delivered by actions linked to the Reviews from that of other ongoing programmes. The Cabinet Office charges departments £150,000 for the review.
  • US: N/A.
  • EU: N/A.

Main benefits

  • UK: Departments have designated board members who coordinate action on the CR. This brought a focus on management within departments and provided a means of connecting across departments to share lessons learned.
  • US: Primarily focusing senior management’s attention on important management issues. Other benefits include better fact-based decision making and better internal coordination.
  • EU: Provides a focus and methodology for organizational improvement and is strongly based on the involvement of all staff in establishing an organizational culture of total quality.

Lessons learned

  • UK: Departments are showing improvement on CRs, but no link to actual performance in terms of delivery has been noted. There is a need to consider middle-layer management as well as programs delivered through third parties, and to consider comparison to organizations outside of government (i.e., best practice).
  • US: Yellow criteria can generally be objectively assessed; however, there are difficulties with green criteria (e.g., documentation is often incomplete; some evaluations are difficult to substantiate). [“Review of OMB’s Improved Financial Performance Scorecard Process.”]
  • EU: N/A.

Are results made public?

  • UK: Yes, published on agencies’ websites.
  • US: Yes, published on departments’ websites.
  • EU: The results remain the property of each organization.

Annex E   Evaluation Questions


Is MAF Meeting Its Objectives?

  • P1. Is MAF realizing its stated objectives? Are the objectives appropriate or should they be broadened/narrowed? As part of this assessment, the relevance and effectiveness of the 21 Areas of Management will be assessed.
  • P2. Is MAF adequately assessing and contributing to improved public sector management?
  • P3. Should MAF measure other broad elements of management (e.g., policy development, leadership ability, management reporting systems, etc.)?
  • P4. To what extent does MAF support the multiple objectives of the government (e.g., deputy head support and advice, Committee of Senior Officials, TB support, horizontal assessment of management practices, etc.)?
  • P5. What are the ancillary benefits and uses of MAF to others? To whom do they accrue (e.g., Parliament)?
  • P6. What impacts has MAF had (intended or unintended)?
  • P9. In a post-FedAA world where deputy heads are Accounting Officers, does MAF need to change?
  • P11. Is there evidence that, overall, other government organizations are using management frameworks that perform better than MAF?
  • A11. To what extent can, or should, MAF be used to support other government initiatives (e.g., greening of government, gender-based analysis, policy compliance and expenditure management)?

Are MAF Assessments Robust?

  • A3. Is MAF methodology of assessment reliable, fair and reasonable?  To what extent is MAF providing a reliable and accurate assessment of management within and across federal organizations (e.g., do MAF results accurately reflect realities, including departmental performance)?
  • A4. To what extent should the MAF process be evidence-based?
  • A5. Is the present assessment approach appropriate (present approach versus independent or risk-based approaches)?  Is it the most efficient and effective way to obtain MAF results?  Are results communicated, used and reported effectively?
  • A6. Should MAF treat portfolio entities, small agencies and non-PMP entities differently?
  • A7. Is there value in having stability in the MAF tool?
  • A9. Are MAF reporting requirements excessively burdensome?  How can the process be streamlined?
  • A10. Is MAF adequately aligned to the GC’s planning cycle?
  • A12. What lessons can be learned from other jurisdictions to improve administration in the assessment of management performance?
  • P10. What can we learn from other jurisdictions?  How are other jurisdictions assessing public sector management performance practices?  What other tools exist both nationally and internationally and how do these compare to the MAF?  Are there lessons learned that can be applied to MAF (e.g., use of private sector)?

Is MAF Cost Effective?

  • P7. Is MAF cost effective and does it deliver value-for-money?  Is the level of effort and impact on the department justifiable?
  • P8. What is the optimum balance between required departmental resources/ investments and level of effort to ensure adequate (and not necessarily optimal) departmental performance?

Is MAF Governance Effective?

  • A1. Is MAF governance effective?  Are the roles and responsibilities, systems and approval processes supporting MAF efficient and effective?
  • A2. Is TBS the appropriate entity to assess and measure government management performance (e.g., does TBS have adequate capacity and level of experience)?  Should a 3rd party be considered?  If so, who should be the members of the 3rd party?
  • A8. How can you ensure that MAF allows for systematic and transparent conversations between deputy heads?

 

Annex F   MAF History

The History, Objectives and Evolution of the Management Accountability Framework [16]

The seed for the Management Accountability Framework (MAF) was planted in June 1997 when the government designated Treasury Board as the government’s management board. One of Treasury Board’s new responsibilities was to lead and provide expertise in the development of an agenda to improve management practices in federal departments and agencies.

These tasks were to be performed under the umbrella of the Modern Comptrollership Initiative (MCI).  The MCI was the result of the work performed over a two-year period by the Independent Review Panel on Modern Comptrollership in the Government of Canada from 1996 to 1998. The state of management in the federal system had been a longstanding concern which had been intensified in the context of the sharp cuts in federal program spending made in the mid-1990s as the government acted to reduce and then eliminate the federal deficit. Delivering federal services efficiently and effectively with sharply constrained human and financial resources required a heightened emphasis on how those resources were being managed and an effort to understand what modern public sector management should look like and what might be required to institute it. The MCI was the pivotal piece in the creation of MAF some five years later.

From 1998 to 2001, Phase 1 of the Modern Comptrollership Initiative was operating as a pilot project. Phase 2 was to be implemented government-wide in 2001 under the Modern Comptrollership Directorate. Phase 2’s premise was to enable front-line public servants to deliver services to citizens efficiently and effectively. The Directorate believed that the management framework necessary to achieve this purpose had to encompass key MCI concepts but be broader in scope. The framework would have to engage deputies directly and make them accountable for the management of their organizations.

The Modern Comptrollership Directorate continued with its work and by the end of 2002 it had reviewed 13 additional models or studies of modern management and categorized each as falling into one of five groups. By the end of this work, the Directorate had put together the central features of what became known as the Management Accountability Framework in late 2002. MAF was different from the MCI in that MAF was driven by a senior executive champion who was committed to the modernization of federal government management and who was convinced of the potential merits of this new tool.

In 2003, the Secretary of the Treasury Board saw MAF as a tool that would allow him to hold “facts-based conversations” with his deputy minister colleagues about their departments and their particular management problems and concerns. Also in 2003, the Coordinating Committee of Deputy Ministers endorsed MAF as the principal tool with which TBS would engage departments and agencies on management issues.

MAF’s Objectives

By early 2003, MAF had been transformed from a new set of ideas about management looking for support into a framework through which Treasury Board Secretariat would try to entrench the importance of modern comptrollership within the federal public service.

MAF reflected and brought together in one tool a range of TBS management improvement initiatives that were then underway as a result of the March 2000 publication of Results for Canadians: A Management Framework for the Government of Canada. Results for Canadians identified a number of management areas in which departments and agencies needed to improve their performance and it set out a forward agenda for management reform. MAF provided the first “explicit and coherent model for high organizational performance” with which TBS and departments and agencies could work together.

MAF was to focus on “management results rather than required capabilities.” It was about strengthening accountability to manage federal organizations effectively, serve Ministers and government, and deliver on results for Canadians. MAF was intended to help deputies and their executive teams manage their organizations more effectively by helping them pose critical questions that needed to be addressed or by helping them monitor and assess their own performance.

The Management Accountability Framework would also be a TBS oversight tool. Departments and agencies needed to be able to demonstrate progress in implementing the framework to TBS.

The three initial objectives of MAF were to:

  1. Use MAF as the basis for dialogue between TBS and deputy heads on the state of management practices.
  2. Frame TBS’s input into the COSO (Committee of Senior Officials) process for assessing deputy heads’ performance in managing their department or agency.
  3. Frame reporting on management so that the management practices and efforts to improve them could be more readily assessed.

In addition, MAF was expected to help integrate and streamline management reporting and information management within TBS. MAF was seen as a horizontal initiative within TBS that could help break down barriers between program sectors and policy centres.

The Evolution of MAF

Since its introduction in 2003, the actual collection and assessment of relevant departmental information has been performed by TBS program and policy staff. The MAF Division now plays a coordinating role, ensuring that the MAF assessment process is completed efficiently and on time.

From 2003 to 2005, MAF operated without an overall governance structure. There was no committee of senior TBS officials directly responsible for providing guidance and oversight to the project, and there were no mechanisms in place to enforce process requirements such as deadlines. At the start of Round III in 2005, the MAF Division established an informal arrangement with the committee of TBS’s four program sector assistant secretaries (PSAS), under which the Division could bring problem issues to PSAS and PSAS would try to resolve them. From the end of Round III to the end of Round IV, a formal governance structure was put in place, with TBS’s Management Policy Oversight Committee (MPOC) tasked with providing overall guidance and oversight.

Following Round III, a MAF Strategy Forum was put in place. Its basic purpose was to provide a venue in which disputes about assessment ratings could be resolved. This arrangement worked well because the Strategy Forum did not have the same constraints on its time as MPOC did. What evolved through the latter part of Round IV and Round V, and has continued through Round VI, is a far more effective overall governance structure involving POC and the Strategy Forum. POC reserves its time for major MAF issues such as overall direction setting, approval of methodology and timelines, and oversight of the assessment process. The Strategy Forum now provides the ongoing operational guidance and support that MAF requires, in particular enforcing deadlines, resolving disputes, and reviewing individual areas of management and overall departmental assessments.

Since its inception, MAF has been an ambitious, large-scale assessment system used by all major federal departments, small agencies and micro-agencies. For every organization, the assessments currently cover 21 areas of management, 68 lines of evidence and some 300 criteria.

This was MAF in 2008, in its sixth assessment round. While there have been many changes in approach and methodology, reflecting TBS’s growing experience with the assessment tool and the Secretariat’s desire to improve it in light of problems and opportunities that have been identified, the basics of MAF remain the same as they were in 2003.

The original 10 elements remain in place as do the expectations associated with each of the 10 elements that public service managers should meet. Associated with each of the elements are a number of areas of management designed to give a sense of the scope of each element and to also suggest how progress towards meeting expectations might be assessed. While the number of areas of management has changed over the years, their purpose has not. To gauge whether progress was being made towards meeting the expectations of the 10 elements as described by the areas of management, lines of evidence were developed that also remain in place today.

MAF Round I and Round II

The basic premise of Rounds I and II was that assessments should be based on knowledge and information about departments already available within the Treasury Board Portfolio. Areas of management were not rated during Round I because any ratings would have been subjective: there were no formal methodologies defining how the information collected would be used to assess the areas of management. In addition, MAF was supposed to be an initiative that did not add to departmental work. In practice, however, TBP analysts often had to contact departmental officials to get information they were missing. Thus, MAF did, in fact, work with departments from the beginning.

MAF Round III

Important methodology changes were made during Round III to the basic premise concerning the information required for assessment purposes. The decision was made to expand the information base to include things that TBP staff should know about departmental management issues. As a result, the number of areas of management increased to 41 for Round III.

Departments were invited to participate in defining the areas of management, which drew them further into MAF-related work. Formal arrangements were also made for the exchange of MAF-related information between TBP and departments. Each department designated a MAF contact person and the information was to be exchanged between the departmental MAF contact and the TBP program analyst for that department. This “single-point-of-contact” system was meant to regularize the information exchange and make it transparent. This would end the process by which different TBP analysts would obtain MAF-related information from different contacts in one department in an uncoordinated fashion.

Another consolidation innovation was the creation of the first formal MAF database. Its purpose was to facilitate the exchange of MAF information and assessment material within TBP and to consolidate all the information electronically. The information gathered during the first two rounds was added to the database to allow comparisons of Round III views about departmental performance with the previous years’ assessments.

During this round, for the first time, departments were allowed to look at and comment on their draft assessment. The areas of management were again rated during Round III as they had been in Round II -- subjectively.

MAF Round IV

The methodology changes made during Round IV addressed concerns that the assessment burden was too heavy, that the assessments were too large, and that the workload for both TBP and departments had increased significantly following departmental review of the draft assessments. For Round IV, the number of areas of management was reduced to 20 and formal methodologies for the assessment of all 20 areas of management were prepared. The methodologies were provided to departments, and departments had the opportunity to provide TBP with information they thought would be relevant to their assessments. This removed the perception of “subjectivity” associated with the ratings.

The “single-point-of-contact” rule was strengthened through stricter enforcement and through the establishment of a MAF electronic mailbox. As before, departmental MAF contacts could speak directly with their MAF program contact in TBP. In Round IV, departments were asked to review an electronic version of their assessment in the MAF database rather than receiving a paper copy.

MAF Round V

The MAF methodology was strengthened further in Round V through three major innovations. The first was the introduction of a “maturity model” rating system, whose purpose was to bring further objectivity to the rating process. The second was the development of the MAF portal, accessible to authorized MAF users including TBP staff and departmental contacts, to facilitate the information exchange, assessment and review processes and to increase transparency. The third was the implementation of standardized language to help improve the quality and consistency of written assessments.

MAF Round VI

Round VI remained fairly stable with only minor changes in the methodology. This decision was made following the announcement of a five-year evaluation of MAF that would take place during the round and which could result in major changes being implemented during Round VII or Round VIII.

One major change was the elimination of the draft assessment; for Round VI there was a single draft release that contained the ratings. Another important change was to the definition used to assess micro-agencies, reducing the reporting burden on them while maintaining a consistent definition of what constitutes a micro-agency. Other important changes included the testing of a risk-based approach in selected areas of management related to the HR component, and a limitation on the number and size of documents that departments and agencies could upload to the MAF Portal.

Annex G   Bibliography

The following references were used during the conduct of the project as well as in the preparation of this report.

Audit Commission. “Comprehensive Performance Assessment.”

Australian Public Service Commission. “Senior Executive Leadership Capability (SELC) Framework.”

CCAF-FCVI. “Institutional Foundations for Performance Budgeting: The Case of the Government of Canada.” OECD Journal on Budgeting. 2007, Vol. 7, No. 4.

Chartered Institute of Public Finance and Accountancy. The CIPFA FM Model: Statements of Good Practice in Public Financial Management – Getting Started.

Collins, Jim. Good to Great. New York: HarperCollins. 2001.

Conrad, M. “Some Mandarins Merely Going through the Motions of DCRs.” Public Finance, 16 November 2007.

Cresswell, A.M., D. Canestraro, and T.A. Pardo. “A Multi-Dimensional Approach to Digital Government Capability Assessment.” Center for Technology in Government, University at Albany, SUNY, 2008.

Falletta, S. Organizational Diagnostic Models: A Review and Synthesis. Leadersphere, 2005.

Halligan, J. “Accountability in Australia: Control, Paradox and Complexity.” Public Administration Quarterly, Winter 2007.

Harris, L. “Best Value Reviews of Human Resource Services in English Local Government.” Review of Public Personnel Administration, 24(4), 334-347, 2004.

HM Government. Doing the Business: Managing Performance in the Public Sector – An External Perspective. February 2008.

HM Treasury. “HMT DSO Delivery Plan: Professionalising and Modernising the Finance Functions in Government.” July 2008.

HM Treasury. Operational Efficiency Programme: Final Report. April 2009.

HM Treasury. “Management Accountability Framework.” Undated presentation.

HM Treasury. “Risk Management in UK Government.” Undated presentation.

Holkeri, K. and H. Summa. “Evaluation of Public Management Reforms in Finland: From Ad Hoc Studies to Programmatic Approach.” Ministry of Finance, Finland.

Interdepartmental MAF Network. “2009 TBS Evaluation of the Management Accountability Framework: RFP MAF Administration.” 2009.

Interdepartmental MAF Network. “2009 TBS Evaluation of the Management Accountability Framework: Potential Points to Consider.” 2009.

Kaplan, Robert S. and David P. Norton. The Balanced Scorecard: Translating Strategy into Action. Harvard Business Press, 1996.

McNamara, Carter. “Organizational Performance Management.” Authenticity Consulting, 2008.

National Audit Office. Cabinet Office: Assessment of the Capability Review Programme. Report by the Comptroller and Auditor General, HC 123 Session 2008-2009, 5 February 2009.

National Audit Office. The Efficiency Programme: A Second Review of Progress. Report by the Comptroller and Auditor General, HC 156 I & II 2006-2007, 8 February 2007.

National Audit Office. Managing Financial Resources to Deliver Better Public Services. Report by the Comptroller and Auditor General, HC 240 Session 2007-2008, 20 February 2008.

National Audit Office. Managing Financial Resources to Deliver Better Public Services: Survey Results. 2008.

National Audit Office. Value for Money in Public Sector Corporate Services: A Joint Project by the UK Public Sector Audit Agencies. 2007.

Office of the Auditor General of Canada. Report of the Auditor General of Canada to the House of Commons – Chapter 2: Accountability and Ethics in Government. November 2003.

Oliveira, Fatima and Josiane Désilets. “Interdepartmental MAF Network: MAF Round VI Solution Sheet.” Draft document, 26 January 2009.

Platt, Rodney K. “Performance Management.” WorldatWork White Paper. December 2003.

Poage, James L. “Designing Performance Measures for Knowledge Organizations.” Ivey Business Journal, March/April 2002.

Public Audit Forum. Finance Value for Money Indicators Guidance. May 2007.

Public Policy Forum. A Management Accountability Framework for the Federal Public Service: Outcomes Report of Roundtable with Public Administration Experts and Academics. Public Policy Forum Boardroom, Ottawa, Ontario, 16 May 2003.

Reid, Joanne and Victoria Hubbell. “Creating a Performance Culture.” Ivey Business Journal, March/April 2005.

Sims, Harvey. “The Management Accountability Framework – Genesis, Evolution, Current Uses, and Future Prospects: A Paper Prepared for the MAF Directorate.” Sussex Circle, 15 July 2008.

Sims, Harvey. “The Management Accountability Framework: A Paper Prepared for the MAF Directorate.” Sussex Circle, 16 July 2008.

Staes, P. “Study on the Use of the Common Assessment Framework in European Public Administration.” European Institute of Public Administration, 2005.

Staes, P. and N. Thijs. “Quality Management and the European Agenda.” EIPAScope, 2005.

State Services Commission, New Zealand. “The Capability Toolkit: A Tool to Promote and Inform Capability Management.” 2008.

State Services Commission, New Zealand. “Quality Management in Government Responsibility and Accountability: Standards Expected of Chief Public Service Officers.”

Sussex Circle. “MAF Round VI Questions.” 2009.

Sussex Circle. “Management Accountability Framework Limited Review: A Paper Prepared for the MAF Directorate.” 13 June 2007.

Sussex Circle. “Report on Interviews with Deputy Ministers Concerning Management Accountability Framework Round V.” 30 June 2008.

Treasury Board of Canada Secretariat. “Guide to MAF Assessments 2007-2008.” Management Accountability Framework Directorate, 2008.

Treasury Board of Canada Secretariat. Leading Management Practices Handbook. Hampton Inn, Ottawa, 3 October 2008.

Treasury Board of Canada Secretariat. “MAF Post-Mortem: Internal.” 5 June 2008.

Treasury Board of Canada Secretariat. “MAF Process and Timelines.” Internal Training Session with Departments and Agencies, 22 & 26 September 2008.

Treasury Board of Canada Secretariat. “MAF Round VI Draft Ratings: Trends & Issues.” MAF Strategy Forum, 30 January 2009.

Treasury Board of Canada Secretariat. “MAF Round VI September 2008 Presentations, Ratings Descriptions, Methodology Reference and Guidance Documents.” Management Accountability Framework Directorate.

Treasury Board of Canada Secretariat. “MAF Training for TBP Managers and Analysts.” October 2008.

Treasury Board of Canada Secretariat. “MAF VI Methodology Reference Document to Assessing Areas of Management (AoMs) 2008-2009.” Management Accountability Framework Directorate.

Treasury Board of Canada Secretariat. “Management Accountability Framework External Post-Mortem.” 30 May 2008.

Treasury Board of Canada Secretariat. “Management Accountability Framework: Round V Findings.” June 2008.

Treasury Board of Canada Secretariat. “Overview of the Management Accountability Framework.” PricewaterhouseCoopers, December 2008 – March 2009.

Treasury Board of Canada Secretariat. "People Management and MAF VII: A New Approach; Briefing for TBS Priorities and Planning Sector" Presentation dated February 26, 2009

Treasury Board of Canada Secretariat. “Process Issues.” Extended POC Meeting, 3 July 2008.

Treasury Board of Canada Secretariat. “Proposed Changes to Areas of Management Methodology for Round VI (2008-2009).” Presented at Extended POC, 3 July 2008.

Treasury Board of Canada Secretariat. “Responding to MAF Round V Feedback & Post-Mortems.” Extended POC Meeting, 3 July 2008.

Treasury Board of Canada Secretariat. “TBS MAF Application: Round VI Internal Training Sessions.” October 2008.

Treasury Board of Canada Secretariat. “TBS MAF Portal.” Internal Training Session with Departments and Agencies, 22 & 26 September 2008.

The Treasury, New Zealand. Annual Report of the Treasury for the Year Ended 30 June 2008. 2008.

United States Government Accountability Office. “Review of OMB’s Improved Financial Performance Scorecard Process.” Report to the Subcommittee on Government Management, Finance, and Accountability, Committee on Government Reform, House of Representatives, 2006.

 


[1] Five-Year Independent Evaluation of the Management Accountability Framework Request for Proposal.

[2] A complete list of interviewees and consultation participants is included as Annex C.

[3] The MAF Network and SAAN are networks of departmental representatives at various levels aimed at sharing best practices.

[4] “Why is Performance Management so difficult to implement?” Canadian Government Executive, March 2009.

[5] For example, studies conducted by the National Institute of Standards and Technology in the late 1990s indicated that Malcolm Baldrige National Quality Award winners outperformed other companies in terms of total financial return.

[6] Rodney K. Platt. Performance Management White Paper. WorldatWork, December 2003.

[7] The President’s Management Agenda (PMA) was an initiative of the Bush administration. The Obama administration has identified the need for a Chief Performance Officer, but at this point, it is not clear what role the PMA will play under the new administration.

[8] Organisation for Economic Co-operation and Development (OECD). Public Sector Integrity: A Framework for Assessment, p. 86

[9] Assessment of the Capability Review programme – Report by the Comptroller and Auditor General.  National Audit Office.  February 2009.

[10] CAF 2006. CAF Resource Centre. European Institute of Public Administration.

[11] Exceptions include the Czech Republic, Slovakia and Romania, which have made the use of the CAF mandatory in an effort to encourage public sector quality management practices.

[12] For example, the Federal Accountability Act, which included provisions for strengthening accountability within government including designating deputy ministers as accounting officers, was passed in December 2006.

[13] Operational Efficiency Programme: Final Report.  HM Treasury.  April 2009.

[14] TBS slide presentation – “MAF Round VI Draft Ratings – Trends and Issues”. January 30, 2009.

[15] Slide Presentation “People Management and MAF VII:  A New Approach; Briefing for TBS Priorities and Planning Sector” – dated February 26, 2009.

[16] This Annex was prepared by the MAF Directorate of Treasury Board Secretariat after completion of the evaluation, and is included in the report for information purposes.


