Audit of Performance Measurement
January 2017
Acronyms
| ADM | Assistant Deputy Minister |
|---|---|
| CIO | Chief Information Officer |
| DCI | Data Collection Instrument |
| DRF | Departmental Results Framework |
| INAC | Indigenous and Northern Affairs Canada |
| MAF | Management Accountability Framework |
| MRRS | Management, Resources and Results Structures |
| PAA | Program Alignment Architecture |
| PMF | Performance Measurement Framework |
| PM | Performance Measurement |
| RPP | Report on Plans and Priorities |
| SO | Strategic Outcomes |
| TB | Treasury Board |
Executive Summary
Background
The Audit of Performance Measurement was included in Indigenous and Northern Affairs Canada's ("INAC" or "the Department") 2016-2017 to 2018-2019 Risk-Based Audit Plan, approved by the Deputy Minister on March 4, 2016. The audit was identified as a high priority because of the importance of monitoring and assessing the results of departmental programs and due to the significant interest and concern across government in this area.
The Mandate Letters to the Ministers of the Government of Canada identify performance measurement as a priority and establish an expectation "to report regularly on your (departmental) progress toward fulfilling our commitments" and "to help develop effective measures that assess the impact of the organizations for which you (Ministers) are answerable".Footnote 1
Performance measurement activities refer to those activities undertaken to support monitoring, measuring, evaluating and reporting on the performance of the Department's programs and services. Like all departments, INAC is managing two major performance measurement transitions: the implementation of the Treasury Board (TB) Policy on Results and the implementation of "Deliverology", which is a collection of results-based management practices to fulfill the Government of Canada's commitment to deliver results in public services.
Audit Objective and Scope
The objective of the audit was to assess the Department's readiness and capacity to implement the changes required under the TB Policy on Results as they relate to performance measurement and to identify relevant performance measurement considerations.
The scope of the audit included assessments of the adequacy and effectiveness of INAC's performance measurement system with a focus on the risk areas identified during the planning phase. The audit focused on the period from March 31, 2015 to August 1, 2016; however, it also included the review of some earlier documentation.
The audit scope included the assessment of compliance with program performance measurement and excluded performance measurement related to internal service delivery (e.g., finance, human resources, etc.). Specifically, employee performance management, time and expense performance, and other areas of performance not directly related to measuring INAC's delivery of programs and services were excluded from the scope of this audit.
Statement of Conformance
The audit conforms to the Internal Auditing Standards for the Government of Canada, as supported by the results of the quality assurance and improvement program.
Positive Observations
The audit identified a number of positive observations related to INAC's performance measurement, grouped by thematic area:
Leadership and Strategic Alignment
- Through facilitation by the Evaluation, Performance Measurement and Review Branch, and in collaboration with the Planning, Research and Statistics Branch, INAC programs have been able to establish and have approved Performance Measurement Strategies (which include performance measurement frameworks and all associated indicators) for all of their programs. Measurement efficiency was improved by consolidating Performance Measurement Strategies from 29 to 24.
- To support the transition to the Policy on Results and the Government's Indigenous agenda, INAC created the Assistant Deputy Minister (ADM) Committee on Results, drafted a Results Charter and Mini-Charters to align to core responsibility areas, designated a Chief Results and Delivery Officer and a Chief Data Officer, and has started to develop the Departmental Results Framework, Program Inventory, and Program Information Profiles (all requirements under the new policy).
- INAC, along with external experts, supports the Evaluation, Performance Measurement and Review Committee and the Advisory Committee to Treasury Board, both of which are leading practices in the federal government.
Clear Accountability
- Governance (roles and responsibilities) for program managers and staff with respect to program-level performance measurement is appropriately defined.
- Progress has been made in establishing accountabilities, roles and responsibilities with respect to performance measurement, which is a factor that will support successful implementation of the Policy on Results.
Client-focused Service
- Departmental knowledge and capacity has improved as a result of efforts to refine INAC's Performance Measurement Framework (PMF) and program-level Performance Measurement Strategies.
- Engagement with Indigenous communities to support co-development of useful performance information has been recognized as a priority (e.g., the Simplified Reporting Initiative has aimed to improve the measurement efficiency of Data Collection Instruments, and planning actions have commenced).
Performance and Results
- The Department has improved PMF indicator clarity, reliability, and measurement over the past two years. INAC ranked in the top quartile in 70% (12 of 17) of the most recent Management Accountability Framework (MAF) performance measurement categories.
Conclusion
The audit found that the Department has initiated relevant actions to support its readiness and capacity to implement the changes required under the new Policy on Results. Opportunities to further aid the transition to, and implementation of, the Policy on Results have been identified. These include: clearer, better-communicated strategic direction; data governance consistent with the requirements of the new policy; an expanded understanding of data needs for planning in Indigenous communities that can be used for performance reporting; and more consistent use of performance information across the Department to achieve strategic objectives.
Considerations
The audit team identified key areas where improvements are required in order to support departmental readiness and capacity to implement the Policy on Results and the Government's Results Agenda. Considerations are outlined below:
Leadership and Strategic Direction
1. As part of the implementation plan for the Policy on Results, the Senior Assistant Deputy Minister of Policy and Strategic Direction, with the Head of Performance Measurement and the Head of Evaluation, could coordinate the development of a series of activities to ensure that the Policy on Results' objectives, requirements and accountabilities are clearly understood, including roles and responsibilities, the implementation roadmap, and the importance of results-based management within the Department.
Clear Accountability
2. Establishing clear accountabilities has been identified as a success factor for the implementation of the Policy on Results. The consideration identified in the previous section, 'Leadership and Strategic Direction', should also be applied to ensure clear accountabilities are in place.
Client-focused Service
3. As engagement across sectors and programs with Indigenous communities progresses, the Senior Assistant Deputy Minister of Policy and Strategic Direction, the Head of Performance Measurement and the Head of Evaluation can assist and provide advice to sectors to ensure that engagement with Indigenous communities includes performance measurement and is aligned with INAC's Departmental Results Framework (DRF), data strategy, and broader implementation of the Policy on Results.
Results and Performance
4. The Chief Results and Delivery Officer could work with program managers, and the Chief Information Officer as required, to develop and implement a sustainable enterprise approach/data strategy to advance INAC's capacity in managing and using performance data and information (e.g., define data, information management and information technology requirements for data collection/storage/accessibility and monitoring program delivery) while ensuring adequate coordination with other INAC data streamlining and/or development initiatives.
1. Background
The Audit of Performance Measurement was included in Indigenous and Northern Affairs Canada's ("INAC" or "the Department") 2016-2017 to 2018-2019 Risk-Based Audit Plan, approved by the Deputy Minister on March 4, 2016. The audit was identified as a high priority because of the importance of monitoring and assessing the results of departmental programs and due to the significant interest across government in this area.
The Mandate Letters to the Ministers of the Government of Canada identify performance measurement as a priority and establish an expectation "to report regularly on your (departmental) progress toward fulfilling our commitments" and "to help develop effective measures that assess the impact of the organizations for which you (Ministers) are answerable".Footnote 2
Performance measurement activities refer to those activities undertaken to support monitoring, measuring, evaluating and reporting on the performance of the Department's programs and services. INAC is currently managing two major performance measurement transitions: the implementation of the Treasury Board (TB) Policy on ResultsFootnote 3 and the implementation of "Deliverology", which is a collection of results-based management practices to fulfill the Government of Canada's commitment to deliver results in public services.
"Deliverology" and the Government's Indigenous Agenda
The Government of Canada indicated through the Cabinet Committee on Agenda, Results and Communications – the Committee responsible for developing the Government's forward agenda, tracking progress, and managing strategic communications – that departments will be responsible for implementing the core tenets of "Deliverology". Pioneered in the UK public sector, "Deliverology" is a performance measurement approach that involves best-in-class performance measurement practices to drive policy outcomes by focusing on key activities most likely to have the greatest impact.Footnote 4
Additionally, the Government of Canada has recently indicated through all Mandate Letters that performance measurement will be an important priority. The Mandate Letter to the Minister of INAC stipulates that "…our work will be informed by performance measurement, evidence, and feedback from Canadians. We will direct our resources to those initiatives that are having the greatest, positive impact on the lives of Canadians, and that will allow us to meet our commitments to them. I expect you to report regularly on our progress toward fulfilling our commitments and to help develop effective measures that assess the impact of the organizations for which you are answerable".Footnote 5 The Government has also made Ministerial Mandate Letters accessible to the public "for the first time in our country's history"Footnote 6.
Key Performance Measurement-related Policy Changes
The TB 2010 Policy on Management, Resources and Results Structures (MRRS) was designed to ensure that the government and Parliament have integrated financial and non-financial program performance information to support allocation and reallocation decisions in individual departments and across the government. The Policy on MRRS required that each department and agency put in place a common framework for the identification of programs and for the collection, use, and reporting of financial and non-financial information relative to those programs. This is further supported by the TB Policy on Evaluation, which was designed to ensure that credible, timely and neutral information on the ongoing relevance and performance of direct program spending is available and used to support evidence-based decision making.
TB released the Policy on Results on July 1, 2016, which replaced the Policy on MRRS and the Policy on Evaluation. Transition to full implementation of the new policy is to be completed in 2017-2018. During the transition period, departments are expected to continue to support requirements under the Policy on MRRS and the Policy on Evaluation.
Relevant similarities between the Policy on MRRS and the Policy on Results
The Policy on MRRS and the Policy on Results are broad management frameworks designed to enable informed decisions about resource allocation and program delivery. Evidence-based performance measurement principles, financial and non-financial data utilization, horizontality (initiatives that cut across programs and departments), program design and management flexibility, and achievement of results for Canadians through consistent monitoring and reporting are attributes of both policies.
Relevant differences between the Policy on MRRS and the Policy on Results
The objective of the Policy on Results is to ensure that, at a broader level, government departments are focused on not just short-term priorities but longer-term mandates. The key differences between the two policies as they relate to performance measurement are defined in the table below.
| | Policy on MRRS | Policy on ResultsFootnote 7 |
|---|---|---|
| Measuring | The Program Alignment Architecture (PAA) organizes program-based spending based on Strategic Outcomes (SOs). The PMF operationalizes SOs into performance indicators by program and sub-program. Performance Measurement Strategies (PMS) define the logic model, profile, theory of change, and set of relevant performance indicators, including targets, baselines, data collection frequency, and data sources for indicators. | Retires the PAA and SOs. The DRF replaces the PMF; it aligns the Department's core responsibilities to results and describes how those results will be assessed. Program Inventories store Program Information Profiles, which describe program performance-related information similar to that captured in PMS. Metadata in the form of data tags provides alignment-related information without requiring a formal structure, as was the case under the PAA. Qualitative indicators are introduced. |
| Evaluating | Evidence-based assessment of the relevance and performance of all programs once every five years (required under the Financial Administration Act and outlined in the now-rescinded Policy on Evaluation). | The Policy on Evaluation is subsumed under the Policy on Results, but evaluation continues as a critical function, both within the Department and through centrally led evaluations. Departments will have more evaluation coverage flexibility. Changes require establishing and chairing a committee of senior officials (a Performance Measurement and Evaluation Committee) to oversee departmental performance measurement and evaluation, and designating a head of performance measurement and a head of evaluation at an appropriate level. |
| Reporting | The Report on Plans and Priorities (RPP) outlines the Department's financial and non-financial goals for the year; program indicators are generally based on indicators contained in the PMF. The Departmental Performance Report (DPR) describes achievement of results based on expectations outlined in the RPP. Departments are required to table both documents. Evaluation Reports summarize findings of evaluations. | The reporting structure for the RPP is not yet defined. Evaluation reports will continue to summarize findings of evaluations. |
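To make the shift from the PAA's formal hierarchy to tag-based alignment more concrete, the sketch below shows one hypothetical way a Program Information Profile could carry alignment information as simple data tags. It is purely illustrative: the field names, program, tags, and the Python representation are assumptions made for this example and do not describe INAC's actual systems or data model.

```python
from dataclasses import dataclass, field

@dataclass
class ProgramInformationProfile:
    """Hypothetical sketch of a Program Inventory record.

    Under the Policy on Results, alignment information can be carried as
    metadata tags rather than through a fixed PAA-style hierarchy.
    """
    program_name: str
    core_responsibility: str                        # replaces the PAA link to a Strategic Outcome
    indicators: list = field(default_factory=list)  # may include qualitative indicators
    tags: set = field(default_factory=set)          # flexible alignment-related metadata


# Illustrative example only; the program and tags are invented for demonstration.
profile = ProgramInformationProfile(
    program_name="Example Education Program",
    core_responsibility="Example core responsibility",
    indicators=[
        "Qualitative: community satisfaction with program delivery",
        "Quantitative: secondary school graduation rate",
    ],
    tags={"horizontal-initiative", "mandate-letter-commitment"},
)

# Tag-based queries replace navigating a formal PAA structure.
print(profile.program_name, "horizontal-initiative" in profile.tags)
```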
The Policy on Results covers a broader spectrum than the Policy on MRRS. For example, it defines broader roles and responsibilities for the Deputy Head, the Performance Measurement and Evaluation Committee (formerly the Departmental Evaluation Committee) and program officials, and calls for the creation of a Head of Performance Measurement.
The Policy on Results and the application of "Deliverology" to the Canadian federal context contain overlapping goals and approaches. Given these factors, the Audit of Performance Measurement describes the current state of performance measurement, drawing on the comprehensive assessment reported in the Department's 2013 Annual Report on the State of Performance Measurement, and identifies gaps relative to the anticipated requirements under the Policy on Results and "Deliverology".
2. Audit Objective and Scope
2.1 Audit Objective
The objective of the audit was to assess the Department's readiness and capacity to implement the changes required under the TB Policy on Results as they relate to performance measurement and to identify relevant performance measurement considerations.
2.2 Audit Scope
The scope of the audit included assessments of the adequacy and effectiveness of INAC's performance measurement system with a focus on the risk areas identified during the planning phase. The audit focused on the period from March 31, 2015 to August 1, 2016 and included the review of some earlier documentation.
The audit scope included the assessment of compliance with program performance measurement and excluded performance measurement related to internal service delivery (e.g., finance, human resources, etc.).
3. Approach and Methodology
The Audit of Performance Measurement was planned and conducted in accordance with the requirements of the TB Policy on Internal Audit and followed the Internal Auditing Standards for the Government of Canada. The audit team examined sufficient, reliable and relevant evidence to provide a reasonable level of assurance in support of the audit conclusion. The audit approach included, but was not limited to:
- Interviews with INAC employees, including staff, managers and senior leadership in regions and at Headquarters; and
- Documentation review including DPRs, RPPs, performance measurement-related findings and recommendations from recent departmental audits and evaluations, business plans, quarterly reports, Performance Measurement Strategies (PM Strategies), PMF indicators, performance measurement-related studies undertaken by INAC, and secondary literature.
The audit relied on data analysis performed by the Department to assess the appropriateness of performance indicators contained in the PMF and reported in the DPR.
The approach used to address the audit objective included the development of audit criteria, against which observations and conclusions were drawn. The audit criteria were informed by attributes of effective performance measurement systems defined in the Department's 2013 Annual Report on the State of Performance Measurement, which provided an assessment of the state of performance measurement in support of evaluation at INAC and ensured compliance with the 2009 TB Policy on Evaluation and Directive on the Evaluation Function. Furthermore, the audit criteria considered key risk areas identified during the planning phase and considered requirements under the Policy on Results, which was published during the conduct phase. The audit criteria are included in Appendix A.
4. Conclusion
The audit found that the Department has initiated relevant actions to support its readiness and capacity to implement the changes required under the new Policy on Results. Opportunities to further aid the transition to, and implementation of, the Policy on Results have been identified. These include: clearer, better-communicated strategic direction; data governance consistent with the requirements of the new policy; an expanded understanding of data needs for planning in Indigenous communities that can be used for performance reporting; and more consistent use of performance information across the Department to achieve strategic objectives.
5. Findings and Considerations
Each audit criterion was assessed by the audit team based on evidence gathered through a number of audit techniques, including the examination and analysis of documentation and the conduct of interviews. Where gaps were identified in the observed practice, they were evaluated for the purpose of developing considerations to aid the implementation of, and transition to, the Policy on Results.
This section describes the findings and considerations identified by the audit.
5.1 Leadership and Strategic Direction
The audit expected to find evidence that senior management actively supports a performance measurement culture through performance measurement-related communications, routine discussion of performance measurement-related information in senior committees, clear prioritization of performance measurement, and a strategy for implementing performance measurement-related changes defined in the Policy on Results and expectations under "Deliverology".
Senior Management has taken steps to support a transition to the Policy on Results and "Deliverology", recognizing the transition presents a unique opportunity for INAC to improve performance measurement effectiveness and to support a performance measurement culture in the Department.
The Assistant Deputy Minister (ADM) Steering Committee on Results and a Director General-level Delivery and Results Committee were established to support implementation of the Policy on Results and the Government's Results Agenda. The audit noted that these senior level committees are working toward developing the core requirements of the Policy on Results. For example, INAC has developed a DRF, Program Inventory, Results Charter and Mini-Charters (expectations under "Deliverology"). The positions of Chief Results and Delivery Officer and Chief Data Officer have been designated.
Leadership challenges related to communications and prioritization of performance measurement were identified. Audit work indicated that performance measurement communications related to program results were infrequent and that support for results-based management in the Department was inconsistent. A review of some committee meeting minutes for the period in question found that performance measurement-related discussions and action items focused on program risk monitoring, updates to the MAF, and updates on specific initiatives. In addition, interviews suggested that prioritization of performance measurement activities is unclear.
"Deliverology" and, to a lesser extent, the Policy on Results are viewed by those interviewed as an opportunity to develop a stronger culture of results-based management in the Department. However, a change management strategy that defines and communicates the transition journey (timing, sequencing, dependencies, milestones, and investment levels of human and financial resources) to full implementation of the Policy on Results and "Deliverology" has yet to be developed, as the new DRF and Program Inventory were still in progress at the time of the audit. It was observed that the Department has put into motion a plan to implement the Policy on Results and that key success factors have been identified, including:
- Leadership, collaboration and accountabilities;
- Communication and engagement;
- Capacities and tools; and
- External engagement with Indigenous peoples.
Consideration
As part of the implementation plan for the Policy on Results, the Senior Assistant Deputy Minister of Policy and Strategic Direction, with the Head of Performance Measurement and the Head of Evaluation, could coordinate the development of a series of activities to ensure that the Policy on Results' objectives, requirements and accountabilities are clearly understood, including roles and responsibilities, the implementation roadmap, and the importance of results-based management within the Department.
5.2 Clear Accountability
The audit expected to find that accountabilities and roles and responsibilities related to performance measurement are clearly defined and well understood by staff.
INAC has worked in recent years to improve the governance of performance measurement. The Department has rationalized departmental program authorities, refined its PMF, and worked to clarify program-level PM Strategies, which included clarification of roles and responsibilities. The Department has developed a Policy on Performance Measurement Strategies that outlines the roles and responsibilities for performance measurement across the Department. Although the business planning process and quarterly reporting were out of scope for this audit, interviews suggest that they have reinforced performance measurement roles and responsibilities.
The audit also reviewed a selection of PM Strategies to assess the adequacy of descriptions of roles and responsibilities. Although program governance and stakeholders sections were consistent features of the PM Strategies reviewed, roles and responsibilities defined in the performance measurement matrix varied; some PM Strategies identify a position or number of positions and some identify a Directorate or Branch as responsible. While differences across PM Strategies related to governance were identified, they reasonably reflect program realities and are consistent with the flexibility afforded to program officers to develop strategies that suit the realities on the ground.
The Government's Indigenous agenda and ongoing work in and with Indigenous communities envisions an accountability structure grounded in a renewed nation-to-nation relationship. Towards this end, the Assembly of First Nations is leading a transformation to a new Indigenous Investment Management Framework, which will support public reporting on community, nation and population development as opposed to program-related outcomes. The Department is committed to working with communities to support this transition.
Although the audit found that roles and responsibilities for program results are generally well defined and understood by staff, senior management recognizes that results are likely to become increasingly the responsibility of Indigenous communities. Senior management also noted that the impact of the process of reconciliation with Indigenous communities on departmental accountability for results and on the implementation of the Policy on Results will need to be clarified.
Consideration
Establishing clear accountabilities has been identified as a success factor for the implementation of the Policy on Results. The consideration identified in the previous section, 'Leadership and Strategic Direction', should also be applied to ensure clear accountabilities are in place.
5.3 Client-focused Service
The audit expected to find evidence of sufficient departmental capacity to execute performance measurement activities including knowledge of resources and available staff as well as evidence that the Department considered and incorporated Indigenous community needs into its performance measurement approach.
Interviews found that knowledge and understanding of performance measurement principles across the Department have improved as a result of work to expand coverage of program-level PM Strategies. Access to Senior Strategic Outcome Advisors (Policy and Strategic Direction results-based management specialists embedded in sectors to provide support and advice to senior management), as well as support from Evaluation, Performance Measurement and Review Branch staff in establishing and updating PM Strategies, were noted as useful enablers.
Capacity challenges underscored in interviews related less to staff knowledge and ability to analyze and report on program performance and more to staff availability to conduct performance measurement activities. Audit work indicated that operational and program administration requirements limit performance measurement capacity, including the capacity to undertake initiatives that respond to changing or evolving needs (whether from the Department or from Indigenous communities).
Consideration and incorporation of community needs into the performance measurement process was also assessed. In response to a series of independent enquiries, the Department has sought to reduce reporting requirements placed on Indigenous communities while continuing to provide Parliament and Canadians with information sufficient to assess program performance.Footnote 8 The Department has initiated consultations and pilot programs with Indigenous communities to support streamlined data collection and reporting obligations. The Simplified Reporting InitiativeFootnote 9 has identified community needs-related challenges that future pilot projects aim to address, and which consider Indigenous reporting capacity and relevance of reporting requirements for communities. The goals of the Simplified Reporting Initiative include:
- Simplify reporting through consultation with Indigenous communities, leveraging best practices and working with programs, regions and internal services to reduce reporting requirements;
- Increase reporting flexibility through web-based solutions and predictable reporting schedule; and,
- Improve data sharing with Indigenous communities through web-based solutions.
In addition to the Simplified Reporting Initiative, INAC has also completed a Data Collection Review and Approval initiative and has successfully reduced the reporting requirements contained in Data Collection Instruments (DCIs), as described in the following section.
Interviews indicated that some programs are developing engagement strategies. However, there is no oversight body to provide direction to ensure that these engagement strategies are well developed and to facilitate collaboration among sectors.
Consideration
As engagement across sectors and programs with Indigenous communities progresses, the Senior Assistant Deputy Minister of Policy and Strategic Direction, the Head of Performance Measurement and the Head of Evaluation can assist and provide advice to sectors to ensure that engagement with Indigenous communities includes performance measurement and is aligned with INAC's Departmental Results Framework, data strategy, and the broader implementation of the Policy on Results.
5.4 Results and Performance
The audit expected to find that INAC had established quality and credible performance information, that it used performance information to make decisions, and that performance is monitored and supported by established systems and processes.
Quality and credible performance information
Developing quality and credible performance information is a critical attribute of effective performance measurement systems. Despite a combination of factors such as program complexity, long-term outcomes, and limited administrative control over the data collection process, INAC has worked to improve the quality and credibility of its performance information. As a result, INAC completed and improved methodologies for its performance indicators, as reflected in TB's 2015-2016 MAF report. Furthermore, the Department's internal reporting and tracking tools were assessed to be fully aligned to the PAA, enabling reporting on program results down to the lowest level programs.
MAF results identified challenges related to indicator data availability. According to MAF, results for the Department's lowest level programs in 2015-2016 were not available, and the Department was unable to provide trend data covering at least two years for its performance indicators.
In response to MAF findings related to quality and credible performance information, the Department undertook a comprehensive internal review of its PMF indicators. The review identified four issues related to INAC's PMF indicators: lack of indicator clarity; expected result or indicator framing that is too high level given actual program activity; measurement or data quality challenges; and alignment concerns between indicators and expected results. Based on this review, the Department is working to reduce the percentage of problematic indicators by the end of 2016-2017. Audit work also noted that continuous efforts are being made to refine the quality and credibility of performance information and how performance information is used.
Findings of audits, evaluations, and horizontal evaluations conducted in the past five years that included reference to performance measurement were also considered. Of the 66 performance measurement-related findings, 20 related to quality and credible performance information, making it the most common challenge category. Data availability challenges, indicator alignment, and inadequate measurement and reporting of program performance were common themes in the findings.
The audit indicated that the Department has demonstrated steady improvement in performance information quality and credibility. Implementation of the Directive on Results will require the involvement of program officials and Deputy Heads for all future requests for funding from TB to confirm that performance information is "valid, reliable, and accurately represented".Footnote 10 However, TB provides insufficient information in its Directive on the criteria to be used to verify the validity, reliability and accuracy of performance information. As the implementation of the new policies and the Indigenous agenda move forward, the Department recognizes the importance of continuing to advance the quality and credibility of performance information, taking into account Indigenous peoples' needs and public reporting requirements.
Use of performance information for decision making
Interviews and documents reviewed indicate that the Department has faced challenges related to the consistent use of performance information. MAF results found that although senior management uses performance information during the year for risk mitigation, priority adjustment and resource allocation decisions, program efficiency and effectiveness information is not consistently used to establish priorities.Footnote 11 As well, an INAC internal assessment of its 2014-2015 PMF indicators found that 31 of 107 performance indicator targets could not be assessed due to measurement challenges.
Underutilization of collected performance information has also been noted as an issue. Based on interviews, full use of program data has been constrained by limited trend data, outdated tools for performing the required analysis, and misalignment between data collection and program planning needs. The audit noted that opportunities exist to use the performance information collected by the Department more efficiently and effectively in making program management and improvement decisions. However, Indigenous community participation in defining the key elements of useful performance information will be critical.
Performance information processes and systems
The Department has completed a number of internal process enhancements designed both to "measure what matters" and to reduce the reporting burden placed on Indigenous communities. As a component of the Simplified Reporting Initiative, between 2013-2014 and 2016-2017, the Department reduced the total number of Program DCIs, INAC's primary method of program-level performance data acquisition, by 21%. Within the DCIs, the Department also eliminated 15% of all data fields. The Department has refined its data elements description approval process, clarified its data collection analysis criteria, strengthened the Data Operations Services challenge function during the approval process, and developed a system that will support pre-population of known DCI data. The Department has also initiated a national pilot with 26 Indigenous communities to support greater Indigenous participation in data management and governance.
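As a simple illustration of the pre-population concept noted above, the sketch below pre-fills a blank data collection form with stable values already known from a prior submission, leaving only new information for the community to report. This is a hypothetical sketch only; the field names, values, and the prepopulate_dci function are assumptions made for illustration and are not drawn from any actual DCI or INAC system.

```python
# Minimal sketch of pre-populating a data collection instrument (DCI)
# from a prior submission. Field names and values are hypothetical.

KNOWN_STABLE_FIELDS = {"community_name", "fiscal_year_end", "contact_email"}

def prepopulate_dci(blank_form: dict, prior_submission: dict) -> dict:
    """Copy known, stable fields from a prior submission into a blank form,
    leaving all other fields for the community to complete."""
    form = dict(blank_form)
    for name in KNOWN_STABLE_FIELDS & prior_submission.keys():
        if not form.get(name):  # only fill fields that are still empty
            form[name] = prior_submission[name]
    return form

blank = {
    "community_name": "",
    "fiscal_year_end": "",
    "contact_email": "",
    "students_enrolled": None,   # must be reported fresh each cycle
}
prior = {
    "community_name": "Example First Nation",
    "fiscal_year_end": "March 31",
    "contact_email": "admin@example.org",
    "students_enrolled": 120,
}

print(prepopulate_dci(blank, prior))
```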
Audit work indicated that the collection of reliable information to report on progress against results defined in the Government's Indigenous agenda is a key departmental priority. Leveraging existing performance information to report on progress against commitments outlined in the Minister's Mandate Letter has been challenging due to a combination of the horizontality of the commitments (involving multiple sectors and cooperation with different departments) and the pre-existing data limitations discussed above. In addition to working to make data more accessible by developing an inventory of program performance data, storage locations, and permissions, the Department will need broad-based cooperation between programs, sectors (including the Chief Information Officer), regions, and Indigenous communities to make data collection and assessment practices suitable to the task of reporting on the Indigenous agenda.
Consideration
The Chief Results and Delivery Officer could work with program managers, and the Chief Information Officer as required, to develop and implement a sustainable enterprise approach/data strategy to advance INAC's capacity in managing and using performance data and information (e.g., define data, information management and information technology requirements for data collection/storage/accessibility and monitoring program delivery) while ensuring adequate coordination with other INAC data streamlining and/or development initiatives.
Appendix A: Audit Criteria
To provide an appropriate level of assurance to meet the audit objective, the following audit criteria, grouped by line of enquiry, were developed.
Leadership and Strategic Direction
1.1 Senior management provides clear and consistent direction to staff in regard to managing for results.
1.2 Program managers and staff understand leadership's direction related to managing for results.
Clear Accountability
2.1 Managers and staff understand their roles and responsibilities related to performance measurement.
2.2 PM strategies define roles and responsibilities of managers and staff as well as accountabilities for results.
2.3 Performance-related recommendations from previous audits and evaluations assign roles and responsibilities for corrective action that are articulated in management action plans.
Client-focused Service
3.1 Performance measurement processes engage Indigenous communities to determine needs and capacity, and incorporate the resulting information appropriately.
Results and Performance
4.1 Performance measurement at INAC includes clearly established baselines and targets, data sources, and acquisition techniques.
4.2 Performance measurement resources are easily accessible.
4.3 Systems and processes for performance measurement data collection, storage, monitoring and verification exist.
4.4 Performance measurement information is used for program management, improvement, decision-making, policy development and reporting.
Appendix B: Relevant Legislation, Policies, Directives and Guidance
The following authoritative sources (i.e., policies and directives) were examined and used as a basis for this audit:
- TB Policy on Management, Resources and Results Structures
- TB Policy on Evaluation
- TB Policy on Results and related Directive
- TB Policy on Internal Audit
- INAC Policy on Performance Measurement Strategies