Establishing the Baseline - Government-Wide Summary and Analysis of IT Project Practices
June 1, 1998
Chief Information Officer Branch
Treasury Board of Canada Secretariat
The authors of this document, David Holmes (Treasury Board of Canada Secretariat, Project Management Office) and Denis Godcharles (Godcharles Goulet Fournier), would like to acknowledge the contributions of the following participants:

Marj Akerley

This section is an introduction to the Enhanced Framework initiative within the federal government and how it will be used to implement and promote best practices in the management and delivery of Information Technology (IT) projects.
The government is committed to delivering its programs and services more efficiently and effectively through the use of IT. Reviews of government IT projects conducted by the Treasury Board of Canada Secretariat (TBS) and the Office of the Auditor General (OAG) have identified issues with the government's management and delivery of IT projects.
To address these issues and enhance the framework for managing and delivering IT projects, a TBS Project Management Office (PMO) was formed. The purpose of the PMO is to provide guidance and support to departments, helping them ensure that the government's IT projects:
In May 1996, the PMO, in conjunction with operating departments, published a document of guiding principles and best practices that address project management issues experienced within the federal government. The resulting document, An Enhanced Framework for the Management of Information Technology Projects,1 provides guidance for improvements to IT project management practices.
One of the directions to be pursued is the promotion and implementation of industry best practices in areas relevant to the Enhanced Framework. Currently promoted practices are detailed in the Enhanced Framework II: Solutions: Putting the Principles to Work,2 which is available through the PMO and on the Internet (www.tbs-sct.gc.ca). In order to expand and enhance this initial set of solutions and further assist departments in their improvement efforts, the PMO needed to establish a baseline of project-related practices.
The purpose of this document is to present the results from a series of workshops that examined the federal government's existing practices in managing and delivering IT projects. Summary results from a government-wide perspective are presented in this document. Individual department results are presented in separate documents.
Sustainable improvements in IT project success rates can only be achieved through a clear understanding of an organization's project results and the practices that led to those results.
Deficiencies associated with the management and delivery of IT projects in the federal government have been documented.3 However, minimal information has been available about either the presence or absence of practices that led to these inadequate results. In order to better direct and guide improvement initiatives, a baseline that addresses practice strengths as well as deficiencies had to be established. A clear understanding of gaps or weaknesses would then enable departments to relate results to practices and thereby improve their ability to successfully manage and deliver IT projects. A continued lack of understanding of these practices and how they affect project results would likely lead to inefficient or inappropriate investments in implementing best practices, and delay improved returns on IT investments.
Finally, there was a need to develop a baseline across the government in order to provide a meaningful reference point for all departments. Currently available practice databases often have few occurrences of public sector organizations and may not reflect the "true environment" of the Canadian public service. This is the first time that this type of baseline has been produced for the federal government.
The resulting baseline has two key components. The first component is the individual departmental baseline. The second is this government-wide summary and analysis of the overall results. Within this context, the different uses of the baseline results are numerous. The departmental baseline will enable those responsible in departments to:
The TBS will use both components of the baseline to:
The baseline is a useful tool that sets the stage for significant improvement and provides a benchmark against which to measure progress.
Although many will have an interest in the baseline, it is targeted at two primary audiences within departments:
This section describes the approach used to establish the baseline.
The Software Engineering Institute's (SEI) Software Capability Maturity Model (SW-CMM)4 and the corresponding CMM Based Appraisal for Internal Process Improvement (CBA-IPI) method were selected as the foundation for the baselining methodology. This method was preferred over other available methods because of its widespread support in the industry and its ability to provide the government with a thorough assessment of the processes currently implemented within departments.
The CBA-IPI approach, however, requires a significant investment of both human and financial resources, and it has a considerable impact on organizations.5 As a result, the approach used respected the principles and practices required by the CBA-IPI, yet minimized the impact on departmental participants. In essence, the methodology was streamlined to:
The streamlined methodology consisted of administering a questionnaire based upon one developed by the Software Productivity Centre (SPC) for their SoftGuideTM product. The SoftGuideTM questions apply to organizations currently operating at Level 1 or 2 of the SEI SW-CMM. The terminology and degree of formality of the questions in SoftGuideTM are more suitable to the size and structure of Canadian government IT organizations than is SEI's Maturity Questionnaire. The SoftGuideTM approach has been used successfully in over 30 assessments.
The SoftGuideTM questionnaire contains 89 questions from the 13 Key Process Areas corresponding to Levels 2 and 3 of the SEI SW-CMM.6 In order to properly address the full scope of the Enhanced Framework, the SoftGuideTM questionnaire was expanded to:
Despite the substantial differences between departments—in size of IT expenditures, in IT management styles, and in IT Service Delivery Models—the questionnaire was applicable to all participating departments. Participants were able to relate to the questions and respond appropriately whether projects were traditional software development, commercial off the shelf (COTS) acquisition, or infrastructure upgrades. This shows that the Key Practice Areas assessed are generic and validates the baseline results as a tool that can provide guidance towards improvement. The final list of questions is provided in Appendix 1.
Twenty of the largest departments in terms of IT-related expenditures were solicited to participate in the baselining process. Their level of IT expenditure was based on 1996/1997 data from a Central Accounts report dated June 5, 1997. All departments responded positively and key representatives from each participated in a half-day workshop conducted for their department. The list of participating departments and participants is provided in Appendix 2.
All the workshops were conducted from November 1997 to March 1998 by the authors of this report. Everyone participating in the workshop was given a brief presentation that described the Enhanced Framework, the SEI SW-CMM, and the assessment process. This presentation preceded the administration of the questionnaire.
The possible responses7 to the questions were as follows:
Rather than strictly observing the Yes/No criteria defined above, participants used common sense to determine when to respond Yes or No. The "80/20" rule was adopted: when 80 percent or more of the projects within a department implemented the practice, participants answered Yes. Such an interpretation of the Yes/No criteria is not unusual in assessment methodologies.
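As a hypothetical illustration of the "80/20" rule, the Yes/No decision could be expressed as follows (the helper name and threshold parameter are illustrative, not part of the methodology as documented):

```python
# Illustrative sketch of the "80/20" interpretation used in the workshops:
# a department answers Yes to a practice question when at least 80% of its
# projects implement the practice.
def practice_response(projects_implementing: int, total_projects: int,
                      threshold: float = 0.8) -> str:
    """Return "Yes" when the practice is implemented widely enough."""
    if total_projects <= 0:
        raise ValueError("total_projects must be positive")
    share = projects_implementing / total_projects
    return "Yes" if share >= threshold else "No"
```

For example, a department where 8 of 10 projects implement a practice would answer Yes, while one where 7 of 10 do would answer No.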
In addition, participants often qualified their responses with "Yes, but…" or "No, but…" followed by an explanation. Sometimes discussion of the perceived need—or lack of need—for improvement in a specific area led to the decision on what to respond. Participants used the Comments section of the questionnaires to qualify their response or to record the results of their discussions.
During each workshop, participants reached an understanding as to what constituted a "project" within a given department. This consensus was necessary in order to provide a sound and consistent basis for responding to the project-related questions. While these definitions were not necessarily consistent across all departments, TBS PMO representatives enforced an acceptable range of definitions to preserve the integrity of the baseline results. This understanding was documented in the workshop record.
All answers were recorded and sent to the participating departments for internal review and validation. These validated results provide the basis for the baseline.
This section provides the workshop results summarized across participating departments, a detailed analysis of these overall results, and guidelines for reading and interpreting them. Where applicable, observations by the authors have been provided.
The purpose of the baseline is to stimulate government departments to improve their ability to successfully manage and deliver IT projects. The results presented in this section must be read and interpreted within this context. While the results provide valuable insight into the government's project management and delivery capability, it must be understood that the population surveyed constituted only a small sample of the entire IT practitioner and manager knowledge base. Readers must be careful to neither jump to hasty conclusions, nor judge the government's IT project management and delivery capabilities solely on the basis of this baseline.
Specific departmental baseline components can be compared against the various summary profiles provided in this report. Several analyses have been conducted and can be used for different comparisons. These comparisons can provide some indication regarding the department's position vis-à-vis other departments.
3.2.1 Overall Results
Overall results are given first, in order to provide an overview of the state of implementation of project practices across the government.
Figure 1 below presents the overall results. Each bar on the graph represents the satisfaction level, expressed as a percentage, for each Key Practice Area. Practices are on the "Y" axis, while satisfaction levels are represented on the "X" axis.
Figure 1. Overall Results
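The report does not spell out the arithmetic behind these satisfaction levels. Assuming a Key Practice Area's level is simply the proportion of "Yes" answers to its questions, averaged across the participating departments, the computation could be sketched as follows (the function name and data are illustrative only):

```python
# Hypothetical sketch: one inner list of "Yes"/"No" answers per department,
# all for a single Key Practice Area.  The satisfaction level is assumed to
# be the per-department proportion of "Yes" answers, averaged over departments.
def kpa_satisfaction(responses_by_department: list[list[str]]) -> float:
    per_department = [
        answers.count("Yes") / len(answers)
        for answers in responses_by_department
    ]
    return sum(per_department) / len(per_department)
```

Under this assumption, two departments answering ["Yes", "Yes", "No", "No"] and ["Yes", "No", "No", "No"] would yield a satisfaction level of 0.375 for that Key Practice Area.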
Observations
The practices that achieved the highest satisfaction levels include:
The practices that achieved the lowest satisfaction levels include:
Analysis of Overall Results
Analysis of the overall results identifies definite strengths in the management and delivery of IT projects, as shown above.
Areas of improvement include:
3.2.2 Weighted Results
The same results are presented using a weighting factor. The weighting factor takes into account the relative size of each department, based on the amount of money it spends on IT relative to the total expenditure of the population of departments examined.
Figure 2 below portrays the weighted results. Each bar on the graph represents the weighted percentage of satisfaction levels for each Key Practice Area. The weighting factor was determined by computing the ratio of a department's IT expenditure over the total IT expenditure of the entire population of departments involved in this study. Weighted results were then obtained by adding the products of each department's satisfaction level and weighting factor. Practices are on the "Y" axis and satisfaction levels are represented on the "X" axis.
Figure 2. Weighted Results
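The weighting computation described above can be sketched in a few lines. The figures used here are illustrative only, not actual departmental expenditures or results:

```python
# Sketch of the expenditure-weighted satisfaction level for one Key Practice
# Area.  Each department's satisfaction level is weighted by its share of the
# total IT expenditure of the surveyed population.
def weighted_satisfaction(it_expenditure: dict[str, float],
                          satisfaction: dict[str, float]) -> float:
    total = sum(it_expenditure.values())
    return sum(
        satisfaction[dept] * it_expenditure[dept] / total
        for dept in it_expenditure
    )
```

For example, with illustrative expenditures of $100M, $50M, and $50M and satisfaction levels of 0.8, 0.5, and 0.5, the weighted result is 0.65, pulled upward by the largest department.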
Observations
The impact of the weighting factors on the overall results was generally positive, increasing the satisfaction levels of the following best practices:
The practices that achieved the lowest satisfaction levels include:
Analysis of Weighted Results
Weighted results generally improve the government's overall performance. This is primarily due to more mature processes found in larger departments.
3.2.3 Overall Results by IT Expenditure Category
The overall results are categorized based on the size of IT expenditure in order to allow readers to compare departmental performance against those in the same IT expenditure range. These results are provided in Table 1 below and in a graphical presentation in Figure 3. For the purpose of this report, four categories were identified:
IT expenditures were confirmed during the results validation process with the departments. Expenditures include all IT related expenditures (software development and maintenance as well as infrastructure capital and O&M expenditures).
Table 1. Overall Results by IT Expenditure Category
Table 1 is depicted below graphically.
Figure 3. Overall Results by IT Expenditure Category
Analysis of Results by IT Expenditure Category
Few trends can be identified from the results by IT Expenditure Category, other than an apparent correlation between the size of a department and its satisfaction level for each practice. In general, the bigger departments (that is, those with larger IT expenditures) are better at implementing and consistently applying processes. Nevertheless, departments in the $50M to $100M category generally appear more satisfied with project practices than those in the over-$100M category. These results can be explained in many ways, including the fundamental fact that departments with larger IT expenditures have the opportunity to repeat management and delivery processes more often and consequently become better at them.
3.2.4 Results by IT Management Model
The overall results are presented with departments characterized as having either a "decentralized" or a "centralized" mode of IT management. Some departments have a central IT organization that essentially does "everything" for its clients. Others have a decentralized IT capability in which responsibility for networks, the computing infrastructure, and applications is shared among several groups. The responses of these two classes of departments were different, and the TBS PMO representatives wanted to allow departments to compare themselves against departments with the same management style as their own.
Figure 4 below provides a graphical representation of the results by IT Management Model. The lower bar of each pair represents the percentage of satisfaction levels for each practice achieved by the centralized departments. The upper bar represents the percentage of satisfaction levels for each practice achieved by the decentralized departments. Practices are on the "Y" axis and satisfaction levels are represented on the "X" axis.
Figure 4. Results by IT Management Model
Analysis of Results by IT Management Model
Results of the baseline categorized by management models reveal an interesting trend. Centralized IT Management Models consistently achieve better satisfaction levels for the 15 Key Process Areas, with the exception of training. This trend may be explained by the fact that departments that centralize the management of IT can better control their practices and the resources implementing the processes. Upon reflection, this is an unsurprising and predictable result. However, there are many valid business reasons for decentralizing the management of IT, and readers are reminded not to jump to hasty conclusions. Departments with a decentralized IT management model may wish to baseline the practices in different areas and use the results to identify best practices and leverage them throughout the department.
3.2.5 Results by IT Service Delivery Model
The results are presented with departments characterized as having either an "in-source" or an "out-source" IT service delivery mode. Some departments have considerable programming staff and do most of their work using internal resources. When using contracted resources, these departments will likely retain overall responsibility for project outcomes. Others use contracted resources to do most of the IT project-oriented work; these departments tend to transfer most of the risks to the contracted resources. Again, the issues associated with each approach are different, and departments may find it useful to compare themselves against others in their own IT service delivery category.
Figure 5 below provides a graphical representation of the results by IT Service Delivery Model. The lower bar of each pair on the graph represents the percentage of satisfaction levels for each practice achieved by the "in-sourcing" departments. The upper bar represents the percentage of satisfaction levels for each practice achieved by the "out-sourcing" departments. Practices are on the "Y" axis; and satisfaction levels are represented on the "X" axis.
Analysis of Results by IT Service Delivery Model
Overall results categorized by IT Service Delivery Model are also revealing with respect to satisfaction levels. In-sourced departments appear to achieve better satisfaction levels across all Key Practice Areas, with the exceptions of requirements management, project tracking and oversight, and configuration management. Once again, this trend may be attributable to the fact that departments that in-source the delivery of IT can better control their practices and the resources implementing the processes.
Figure 5: Results by IT Service Delivery Model
3.2.6 Mapping the Results Against the Enhanced Framework Targets
This section maps the overall results to the objectives set out by the PMO in the Enhanced Framework II: Solutions: Putting the Principles to Work. This document sets improvement targets in terms of the following plateaus:
In order to get a better sense of the overall results against these plateaus, Table 2 maps practice satisfaction levels to each plateau.
Table 2: Mapping the Results Against the Enhanced Framework Targets8
Analysis of the Mapping Results against the Enhanced Framework Targets
An area of real concern, also identified in Section 3.2.1 above, is Project Governance as outlined in the Enhanced Framework. The activities involved are those that get the right project off on the right track: an effective governance structure within the department; a sound business case approach to selecting projects; a definitive project charter and a clear accountability structure; a disciplined approach to training, developing, and selecting project managers; and an effective risk management regime. All are key to the successful management and delivery of IT projects, and it is particularly vital that this foundation be established properly.
Understanding that higher level practices can seldom be improved without the benefit of consistently implemented practices at the lower levels, this analysis provides guidance and direction to focus government-wide improvement initiatives.
3.2.7 Mapping the Results against the OAG findings
This section examines the baseline results against the main issues emerging from the OAG's reviews of systems under development over the past three years (1995 to 1997).9 The latter results can be found on the OAG's web site at www.oag-bvg.gc.ca. OAG main issues are provided in the first column, along with the year they were identified (second column). Corresponding baseline references and satisfaction levels identified in the baseline are provided in the last two columns. Numbers in the third column refer to the questions found in Appendix 1 (PG=Project Governance, SE=Systems Engineering/Software Acquisition, CMM=Capability Maturity Model).
Auditor General's Findings | OAG Ref. | Baseline Ref. | Average Satisfaction Levels |
---|---|---|---|
Smaller chunks for projects | OAG '95 | PG 4.1, PG 4.2, PG 4.3 | .35 |
Effective project sponsorship | OAG '95 | PG 2.1 | .70 |
Clearly defined requirements | OAG '95 | CMM 1 | .69 |
Effective user involvement | OAG '95 | SE 1, CMM 12, CMM 1.1, 1.4, 2.3, 2.5, 3.5 | .75 |
Dedicated resources to projects | OAG '95 | PG 2.4, PG 4.8, CMM 11.8 | .59 |
Taking charge | OAG '96 | PG 2 | .58 |
Define requirements | OAG '96 | CMM 1 | .69 |
Improve software development processes | OAG '96 | CMM 7, CMM 8 | .34 |
Setting priorities | OAG '96 | PG 1 | .56 |
Measuring status of projects | OAG '96 | PG 4.7, PG 4.1, CMM 3 | .44 |
Implementing new government guidelines | OAG '96 | CMM 7, CMM 8 | .34 |
Planning | OAG '97 | CMM 2 | .69 |
Oversight | OAG '97 | PG 4.7, PG 4.1, CMM 3 | .44 |
Quality assurance | OAG '97 | CMM 5 | .40 |
Analysis of the Mapping Results against the OAG findings
Mapping the Baseline results to the OAG findings identifies similar deficiencies in project management and delivery practices. Nonetheless, the baseline report identified possible improvements in the areas of:
a. Project sponsorship;
b. Requirements management;
c. User involvement; and
d. Planning.
It is recommended that departments use the baseline to confirm departmental practice improvement initiatives and to justify future plans for implementing industry best practices. It is further recommended that the steps identified below be performed in the very near future in order to benefit from the momentum already set in motion by the participants:
The process of establishing a baseline of the current state of project practices is only the first step in implementing sustainable improvements and best industry practices as they relate to the management and delivery of IT projects within the federal government. The purpose of the baseline is to stimulate departments to improve their ability to successfully manage and deliver IT projects. This baseline document provides a snapshot of how the government is doing with regards to the implementation of these practices.
Each department should now, as indicated during the workshops that led to these results, develop a detailed action plan to initiate improvement activities. Based on workshop results, it is expected that this initial action plan will contain a subset of departmental plans already developed and approved for the upcoming year. The plan should provide high-level information regarding the background, business justification, and goals of the improvement initiative(s); the key work elements; the roles and responsibilities of stakeholders; key milestones; and resources assigned to the improvement activities. Templates and examples can be provided by the PMO to departments for their respective plans.
Departments should re-assess their organizations and measure progress. The PMO will make available an electronic and improved version of the questionnaire used to determine this baseline. Departments will be able to repeat this process within their organization and, possibly, to extend it to other areas not covered during this first pass. In some cases participants were not able to respond to the questions in a fashion that fairly represented the whole department. This was particularly true in some departments with a decentralized IT management style. It became apparent, in fact, that some departments should have several baselines to represent the different practices in place in different areas, rather than one baseline to represent the whole. Departments should make that determination in consultation with their practice implementation groups.
Details on how to implement industry best practices were provided as part of the workshops in a document entitled SEI IDEALSM Model. This document is also available on the SEI Web Site (www.sei.cmu.edu). TBS PMO is also sponsoring special interest groups that are focusing on the implementation of best practices as discussed in this report. Departments should make use of these tools.
Information Technology is critical to delivering government programs and services more efficiently and effectively. Departments cannot afford project failures. They also cannot afford not to harvest the benefits of the technology they deploy. The workshop results confirm earlier studies and reviews by the TBS, the OAG, and the private sector that there are outstanding problems with project management practices. The baseline results for each department provide a useful insight into necessary improvements areas and an inherent basis for priority setting.
Implementing industry best practices is a significant step towards the operational effectiveness and sound comptrollership currently being strongly promoted within the government. For senior managers, the chance to embrace proven solutions to a widely recognized problem is an opportunity not to be missed. This observation is especially relevant in light of the major "Year 2000" projects now under way.
The overall summary presented in this document and the individual department baseline reports can be a useful tool for setting the stage for significant improvement in the area of IT project management. It can provide a meaningful benchmark against which each department can measure its progress.
1. Treasury Board of Canada Secretariat, An Enhanced Framework for the Management of Information Technology Projects, Ottawa, Ontario, May 28, 1996.
2. Treasury Board of Canada Secretariat, Enhanced Framework II: Solutions: Putting the Principles to Work, Ottawa, Ontario, March 1998.
3. The TBS review, documented in An Enhanced Framework for the Management of Information Technology Projects, identified deficiencies in IT project results. A number of OAG reviews, available on their web site at www.oag-bvg.gc.ca, also identified deficiencies. Private sector-led surveys conducted by the Standish Group in the United States and by KPMG in Canada also identified similar problems with IT project results. These reviews were conducted between 1995 and 1997.
4. Paulk, Mark C.; Curtis, Bill; Chrissis, Mary Beth; Weber, Charles V.; "Capability Maturity Model for Software, Version 1.1, CMU/SEI-93-TR-24"; Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, February 1993.
5. It should be noted that the S:PRIME process, available through the Applied Software Engineering Centre and GrafP Technologies Inc. and used in some departments, was also considered since it is also founded upon the CBA-IPI but is less resource-intensive. It too, however, has more of an impact on organizations than was felt possible to cope with at this time.
6. Brief definitions of the Key Process Areas are provided in Appendix 1.
7. Unless otherwise noted, answers to the questions were based on the participants' knowledge and experience in their current environment.
8. Readers should note the Requirements Management Key Process Area has been moved to facilitate the reading of the mapping results.
9. It should be noted that the processes used by the OAG to determine the main issues with the management and delivery of IT in the government are different than those used by the TBS to establish this baseline. Whereas the OAG follows a targeted approach with a focus on specific initiatives within the government, the TBS approach examines the implementation of the same practices but at the organizational level.
Systems Engineering/Software Acquisition, in the context of this questionnaire, involves transforming an operational need into a description of the system configuration (including acquired systems) that best satisfies the operational need, and integrating the efforts of all disciplines and specialties (software development, infrastructure, etc.) to yield a software product.
The following departments have participated in the Treasury Board of Canada Secretariat exercise:
Department | Represented By |
---|---|
1. Agriculture and Agri-Food Canada | Marj Akerley Ray Blewett Brian Gallant Bruce Gordon Terry Harding |
2. Canadian Heritage | Larry Bristowe Crayden Arcand |
3. Citizenship and Immigration Canada | Christine Payant Don Hobbins |
4. Correctional Service Canada | David From N.D. Funk Doug McMillan Richard Johnston |
5. Environment Canada | Jim Klotz Henry Murphy Chak Lam Wong |
6. Fisheries and Oceans Canada | Réjean Gravel Robert Cosh William Lowthian Dianne Symchych |
7. Foreign Affairs and International Trade | Greta Bossenmaier David Lafranchise Bob Fraser |
8. Health Canada | Dorene Hartling Bob Wilkinson |
9. Human Resources Development Canada | Joan Delavigne Ron Ramsey Michel Gilbert |
10. Indian and Northern Affairs | Serge de Bellefeuille Yves Marleau Peter Oberle |
11. Industry Canada | Tom Racine Nabil Kraya Patty Pomeroy Pierrette Benois-Doris Jenny Steel Jim Westover |
12. Justice Canada | Aaron Caplan Janice Hatt |
13. National Defence | Barry Brock (Colonel) Wayne Harrison Bill Brittain |
14. Natural Resources Canada | Yvon Claude Paul Conway Peter McLean Ken Leblanc Leslie Whitney |
15. PWGSC | Dave Holdham Julia Ginley François Audet Debbie Jones |
16. Revenue Canada | Darlene Bisaillon Richard Chauret Barry Ferrar Gloria Kuffner Katherine Stewart |
17. Royal Canadian Mounted Police | Edward B. Cook Guy Millaire |
18. Statistics Canada | Dick Gross Michael Jeays Mel Turner |
19. Transport Canada | Kevin Collins Robert Lalonde |
20. Veterans Affairs | Heather Parry Frances Walsh Howard Williams |