Treasury Board of Canada Secretariat

ARCHIVED - Mid-Term Evaluation of the Implementation of the Cabinet Directive on Streamlining Regulation



2. Evaluation Issues and Methodology

This evaluation covers the period from the coming into force of the CDSR in April 2007 to fall 2009. It is a mid-term evaluation and therefore examines early implementation and progress toward immediate outcomes. The evaluation focuses on those Treasury Board of Canada Secretariat activities intended to achieve the immediate outcome of increasing the capacity of departments to conduct activities related to the CDSR.

2.1 Evaluation Issues and Questions

The questions for this evaluation focused on activities associated with the implementation of the CDSR and funded by the allocations described in section 1.2, and were designed to measure how much progress had been made toward achieving the immediate outcomes. The research questions for the evaluation are listed below.[2]

Relevance

R1. Is there a continued need for the CDSR?

  1. Do the principles of the CDSR respond to the identified need for improvement in regulations?

R2. Do the objectives of the CDSR align with federal government priorities and the Treasury Board of Canada Secretariat's strategic outcome?

Performance (Effectiveness)

P1. Is the CDSR on track toward achieving all of its intended outputs? (capacity to implement the CDSR)

P2. To what extent are TBS-RAS and departments and agencies satisfied with the outputs produced?

  1. Are the outputs produced by TBS-RAS responding to the capacity needs of departments and agencies?

P3. Has TBS-RAS experienced any barriers to producing the desired level of outputs?

P4. To what extent is TBS-RAS on track toward achieving expected outcomes within the next two and a half years?

  1. To what extent has there been an increase in the capacity of departments and agencies to meet CDSR requirements?

P5. What barriers exist to measuring a change in the intermediate outcomes within the next two and a half years?

P6. Were there any unintended impacts of the implementation of the CDSR?

Performance (Efficiency and Economy)

P7. Is the CDSR being delivered efficiently and economically to produce desired outputs and outcomes?

P8. Are there any alternative design and delivery approaches that should be considered for the implementation of the CDSR?

  1. Is the centralized approach, with some cost-sharing funds for departments and agencies, an efficient model?

For each evaluation question, one or more indicators were developed, along with appropriate methods and sources of evidence. The relationships between the evaluation questions, the indicators, the methods, and the sources of evidence are shown in Appendix A.

2.2 Data Collection Methods

The CDSR Evaluation Matrix (Appendix A) uses multiple lines of evidence and complementary research methods, both quantitative and qualitative, to ensure the reliability of the information collected. Five main lines of evidence were used: document and literature review, interviews, a survey of departments and agencies, analysis of performance data from existing databases, and a review of financial data. Each data source is described below by line of evidence.

2.2.1 Document and Literature Review

The following types of policy, planning and reporting documents were reviewed and analyzed to assess continued relevance:

  • Speech from the Throne, budgets, policies and legislation
  • Mandate and program authority documents
  • CDSR Treasury Board Submission
  • Treasury Board of Canada Secretariat's Management Resources and Results Structure and its Program Activity Architecture
  • Reports and reviews:
    • Departmental Performance Reports and other progress or performance reports
    • Smart Regulation: A Regulatory Strategy for Canada
    • OECD Reviews of Regulatory Reform: Regulatory Reform in Canada, Government Capacity to Assure High Quality Regulation
    • December 2000 Report of the Auditor General—Chapter 24: Appendix A—Government of Canada Regulatory Policy, November 1999
    • Other documents, including opinions and perceptions of regulated parties on change

Both print and electronic documents were reviewed, using a customized template to extract relevant information from the documents and to organize it according to the indicators and evaluation questions in Appendix A. Appendix B provides a complete list of the documents reviewed.

2.2.2 Interviews

Interviews served as an important source of information for the evaluation, providing qualitative and quantitative input on the relevance, results and effectiveness of the CDSR implementation. Because not all stakeholders could be interviewed, the evaluation team selected a sample to ensure that appropriate interests and organizations were represented. A total of twelve interviews were conducted as part of the evaluation. The key informants were four TBS-RAS employees and eight representatives from departments and agencies, including three low-, three medium- and two high-volume regulatory submitters.

All interviews were conducted in person. Interviewees were contacted to schedule an appropriate time and were sent an interview guide (Appendix C) in advance. The findings of the interviews were compiled and summarized by evaluation question and indicator.

2.2.3 Survey of Departments and Agencies

The evaluation team sent a Web-based survey to the TBS-RAS distribution list of 70 contacts[3] involved in the coordination of regulations for their departments and agencies. The survey was posted online for three weeks, and two reminder emails were sent during this period. Of the 70 contacts that were sent the survey, 34 responses were received, of which 30 were considered valid. The response rate is shown in Table 1.

Table 1. Survey Response Rates
Survey Group                Total Sent   Received   Removed   Total Kept   Response Rate   Confidence Interval
Departments and agencies    70           34         4         30           42.9%           ±13% (95% confidence)
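
For reference, the reported confidence interval is consistent with a conventional margin-of-error calculation that applies a finite population correction, given the small population of 70 contacts. The Python sketch below is an illustration only and is not drawn from the evaluation itself; the worst-case proportion (p = 0.5), the 95% critical value and the use of a finite population correction are assumptions.

    import math

    def margin_of_error(n, population, p=0.5, z=1.96):
        """Margin of error at 95% confidence, with a finite population
        correction for sampling a large share of a small population."""
        standard_error = z * math.sqrt(p * (1 - p) / n)
        fpc = math.sqrt((population - n) / (population - 1))
        return standard_error * fpc

    # 30 valid responses from 70 contacts (Table 1)
    print(f"Response rate: {30 / 70:.1%}")                      # 42.9%
    print(f"Margin of error: ±{margin_of_error(30, 70):.1%}")   # ±13.6%

Under these assumptions the calculation yields roughly ±13.6%, in line with the ±13% reported in Table 1.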

Templates were populated with the survey responses to analyze the data according to the performance indicators and evaluation questions identified in the Evaluation Matrix. Because some organizations were over-represented among the respondents, the data were reweighted so that the results would not be skewed toward the experience of a few departments. Appendix D shows the weights that were applied; as a result, the analysis was conducted using a weighted population of 19 respondents. Furthermore, not all participants answered all survey questions; where this variation is significant, the number of non-respondents is noted in the report. The detailed survey results are provided in Appendix E.
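
To illustrate how such reweighting can reduce 30 raw responses to a smaller weighted population, the following Python sketch assigns each organization a total weight of one, split evenly among its respondents. This scheme and the organization names are hypothetical, for illustration only; the actual weights are those listed in Appendix D.

    from collections import Counter

    def organization_weights(respondent_orgs):
        """Give each organization a total weight of 1, divided evenly
        among its respondents, so no single organization dominates."""
        counts = Counter(respondent_orgs)
        return [1 / counts[org] for org in respondent_orgs]

    # Hypothetical respondents: "Dept A" answered twice, so each of its
    # responses counts for half in the weighted analysis.
    orgs = ["Dept A", "Dept A", "Dept B", "Dept C"]
    weights = organization_weights(orgs)
    print(weights)        # [0.5, 0.5, 1.0, 1.0]
    print(sum(weights))   # 3.0 weighted respondents from 4 raw responses

The same mechanism, with weights proportional to regulatory volume rather than equal per organization, underlies the second analysis described next.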

TBS-RAS requested a second analysis in which the survey responses were weighted to represent the volume of regulatory activity carried out by departments and agencies over the last two years. The assigned weights are shown in Appendix D. Analyzing the survey in this way emphasizes the findings of organizations with a high volume of regulatory activity.

2.2.4 Performance Data

Since a performance measurement system specific to the implementation of the CDSR did not exist, a data collection template was developed to capture data on outputs. Existing administrative data and performance data (e.g., client satisfaction surveys) were also reviewed to assess production of outputs and progress toward immediate outcomes. The performance data results were summarized by evaluation question and indicator.

2.2.5 Financial Data

Financial data were analyzed to determine the trend in planned versus actual resource use. TBS‑RAS also provided estimates of the allocation of resources to each of the activity areas in the CDSR logic model. The financial data, combined with the output and outcome findings, provided the basis for determining efficiency (i.e., outputs relative to resource use) and economy (i.e., outcomes relative to resource use). The financial data results were summarized by evaluation question and indicator.

2.3 Limitations of the Evaluation Methodology

2.3.1 Concurrent Studies

Other studies were being carried out concurrently in TBS-RAS, including the Performance Measurement and Evaluation Best Practices Study and the Management-led Review of the Centre of Regulatory Expertise (CORE). Because the implementation of the CDSR is highly dependent on the Centre's activities, TBS-RAS was aware of the potential for duplication and was committed to coordinating the various projects. To avoid interviewee and survey-respondent fatigue, different interviewees and survey candidates were identified for each study, while ensuring that participants remained representative of the departments and agencies involved in CDSR implementation.

2.3.2 Performance Measurement Information

Because there was minimal reporting in the early part of the pilot project, only limited data were available to address issues of efficiency and economy, or of leveraging (i.e., the contribution of resources made by departments and agencies relative to those provided through the CDSR). It was hoped that the survey would provide an estimate of the level of additional resources provided by departments and agencies to implement the CDSR, but survey respondents were unable to provide this information. It was therefore not possible to quantify the degree of leveraging achieved.

2.3.3 Survey Response Rate

Although the survey response rate (43%) was lower than expected, the survey responses aligned well with the views expressed by interviewees. Furthermore, the respondents were fairly representative of departments with low, medium and high regulatory activity.

2.3.4 Conclusion

Although the evaluation methodology does have some limitations, it was designed to use multiple lines of evidence to draw conclusions about the CDSR, thus strengthening the reliability and validity of the evaluation results. Notwithstanding the limitations, the methodology meets the requirements of the Treasury Board Policy on Evaluation and associated standards.