Audit of the Management Control Framework for Grants and Contributions 2013-14 Focus on Program Control Frameworks and Recipient Reporting

February 2015
Project #: 13-46


 


Acronyms

AANDC – Aboriginal Affairs and Northern Development Canada

ACRS – Asset Condition Reporting System

CFO – Chief Financial Officer

CI – Community Infrastructure

DCI – Data Collection Instruments

EIS – Education Information System

FNCFS – First Nations Child and Family Services

FSO – Funding Services Officers / Field Services Officers

GCIMS – Grants and Contributions Information Management System

G&C – Grants and Contributions

HQ – Headquarters

ICMS – Integrated Capital Management System

MCF – Management Control Framework

PSD – Policy and Strategic Direction

PSE – Post-Secondary Education

PSPP – Post-Secondary Partnerships Program

PSSSP – Post-Secondary Student Support Program

RO – Regional Operations

TB – Treasury Board

TPCOE – Transfer Payments Centre of Expertise

UCEPP – University and College Entrance Preparation Program
 

 

Executive Summary

Background

The Audit and Evaluation Sector of Aboriginal Affairs and Northern Development Canada ("AANDC" or "the Department") identified an Audit of the Management Control Framework for Grants and Contributions in the Department's 2013-14 to 2015-16 Risk-Based Audit Plan, approved by the Deputy Minister on February 27, 2013. A horizontal audit of the management control framework for grants and contributions management has been completed each year since 2010-11. The scope of these audits varies each year based on an assessment of risks during the planning phase of the audit and is designed to focus on a selection of key controls from the Department's Management Control Framework for Grants and Contributions. This year's audit focused on the design, approval and implementation of program control frameworks and on recipient reporting requirements.

AANDC makes funding available to First Nations and other recipients through Grants and Contributions (G&C) for the delivery of programs and services, including education, land management, social development and community infrastructure. Total departmental spending on G&C was $6.4 billion, $6.7 billion and $6.5 billion for the fiscal years 2011-12, 2012-13 and 2013-14, respectively.

AANDC's transfer payment programs are administered in accordance with the Treasury Board Policy on Transfer Payments and Directive on Transfer Payments, which took effect on October 1, 2008. The Policy on Transfer Payments sets the expectation that risk-based approaches be applied to the design of transfer payment programs; the preparation of terms and conditions and funding agreements; and recipient monitoring and auditing. The objective of the Policy and Directive is to ensure that transfer payment programs are managed with integrity, transparency and accountability, taking into account the risks, and that programs are effectively focused on citizens and beneficiaries and are designed to achieve Federal Government priorities and expected results.

In order to meet the expectations of the Policy and Directive, the Chief Financial Officer (CFO) Sector established the Transfer Payments Centre of Expertise, which has put in place the Management Control Framework (MCF) for G&C to enable the Department to effectively manage and monitor G&C programs and to ensure compliance with the Policy and Directive. AANDC's CFO is accountable for the overall management of transfer payment funds and, as such, is the custodian of the MCF for G&C.

The MCF for G&C establishes roles and responsibilities for the delivery of G&C, specifically with respect to program management (the design and implementation of a program) and transfer payment management (operations of a program with recipients). The MCF for G&C represents the departmental expectations of how G&C are to be managed across regions and at headquarters, and includes controls grouped into the following four areas: program design and approval; program monitoring and reporting; funding agreement development; and transfer payment monitoring and reporting.

Audit Objective and Scope

The objectives of the audit were to assess:

  1. the adequacy and effectiveness of departmental processes in supporting the design and approval of risk-based program control frameworks; and
  2. the adequacy and effectiveness of controls governing and supporting the collection and use of recipient reporting.

The scope of the audit covered the period April 1, 2011 to December 31, 2013 and included assessments of:

  • departmental processes which govern the development and approval of program control frameworks, giving particular consideration to whether these frameworks consider program and recipient risks and whether they are designed to achieve maximum integration with other AANDC transfer payment management processes, tools and systems;
  • departmental processes for managing the design, approval and rationalization of recipient reporting requirements, while ensuring that sufficient information is being gathered to support the Department's stewardship and accountability reporting obligations;
  • processes for collecting, reviewing and analyzing recipient reports for a sample of three First Nation-focused programs (see Section 3.1 for an overview of the methodology used to select the sample);
  • processes for using information gathered from the analysis of recipient reports in making risk-informed decisions, including decisions related to: developing future agreements and defining the nature of the relationship with the recipient; providing support to First Nation recipients; monitoring implementation of First Nation programs and fulfillment of agreement obligations; and, increasing or decreasing future reporting obligations; and,
  • regional delivery structures and human resource levels for the administration of grant and contribution programs in AANDC southern regions.

Statement of Conformance

This audit conforms with the Internal Auditing Standards for the Government of Canada, as supported by the results of the quality assurance and improvement program.

Observations

The Department has made progress in recent years in developing national program policies that aim to reduce variability in program delivery and funding approaches and to streamline recipient reporting requirements. Regions are each proactively adjusting their organizational structures and internal role assignments to meet changing program and systems requirements; however, each region is evolving independently of the others, making implementation of HQ-driven program control frameworks and systems challenging. An added complexity is that regions have made different levels of investment in program delivery staff and agreement management staff, contributing to variability in program implementation.

While the three programs examined had introduced some of the elements that would be expected in a strong program control framework, they included few risk-based approaches and generally lacked materials and training to support regions and recipients in implementation. In some instances, a lack of risk-based management practices led to over-control of low risk projects and recipients, while in other situations it manifested as insufficient control.

For the past two years the Department has been, and continues to be, focused on streamlining performance measurement strategies and recipient reporting requirements. Effective for the 2014-15 fiscal year, this work will have resulted in a significant reduction in the number of reports being requested of recipients. While the Department has made progress in reducing recipient reporting requirements, the audit found that the data collected from recipients is not yet being used to its full potential by regions and programs. Programs and regions are working on rolling out several new or improved program information systems aimed at centralizing data collection and supporting performance measurement. While implementation challenges associated with these systems have resulted in unexpected burden for recipients and regions, the systems are leading to clear improvements in the completeness and accuracy of data being collected.

Conclusion

The audit found that the Department does not take a horizontal approach to designing, approving or implementing program control frameworks. While the three programs examined had introduced some of the elements that would be expected of a program control framework, there was considerable opportunity to improve the consistency of approach across programs and the thoroughness of implementation in regions.

Although the audit found that the Department has made progress in reducing recipient reporting requirements for the programs included in the scope of the audit, the data collected from recipients is not being used to its full potential, nor are risk-based approaches being consistently used to target attention at areas of greatest need (e.g. limited risk-based reporting, risk-based compliance reviews and risk-based investments in community development / case management).

While regions are proactively adjusting their organizational structures and internal role assignments to meet changing program and systems requirements, each region is evolving independently of the others, making implementation of HQ-driven program control frameworks and systems challenging. An added complexity is that regions have made different levels of investment in program delivery staff and transfer payment management staff, resulting in varying degrees of program implementation across regions.

Recommendations

The audit team identified areas where management control practices and processes could be improved, resulting in the following four recommendations.

  1. The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should review and clarify options within existing departmental processes, governance structures, accountabilities, responsibilities and authorities for developing and approving program control frameworks and establish a single window approach to communicating program control frameworks to regions and recipients.
  2. The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should establish a function(s) to provide program design and program management expertise to HQ programs that are developing and implementing new and amended program control frameworks. This function could include a blend of existing expertise in program design and regional implementation with expertise in the development of risk-based program management regimes.
  3. The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should improve alignment of program Data Collection Instruments with program Performance Measurement Strategies for their respective programs to further streamline data being collected from recipients. This should include delineating information required for performance measurement from information being collected for possible compliance activities to allow for the application of risk-based reporting regimes (e.g. for projects, programs and recipients).
  4. The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should review key transfer payment management functions to promote greater consistency across regions, including regional organizational structures, classifications, capacity levels and role assignments.

Management Response

Management is in agreement with the findings, has accepted the recommendations included in the report, and has developed a management action plan to address them.

 

 

1. Background

The Audit and Evaluation Sector of Aboriginal Affairs and Northern Development Canada ("AANDC" or "the Department") identified an Audit of the Management Control Framework for Grants and Contributions in the Department's 2013-14 to 2015-16 Risk-Based Audit Plan, approved by the Deputy Minister on February 27, 2013. A horizontal audit of the management control framework for grants and contributions management has been completed each year since 2010-11. The scope of these audits varies each year based on an assessment of risks during the planning phase of the audit and is designed to focus on a selection of key controls from the Department's Management Control Framework for Grants and Contributions. This year's audit focused on the design, approval and implementation of program control frameworks and on recipient reporting requirements.

AANDC makes funding available to First Nations and other recipients through Grants and Contributions (G&C) for the delivery of programs and services, including education, land management, social development and community infrastructure. Total departmental spending on G&C was $6.4 billion, $6.7 billion and $6.5 billion for the fiscal years 2011-12, 2012-13 and 2013-14, respectively. In 2012-13, $4.4 billion of G&C funding was expended on First Nations-focused education, social development and infrastructure programming. The following table outlines the total G&C program spending by Region for the 2012-13 fiscal year:

Table 1: Total Program Spending By Region 2012-13 ($ millions)*
| Program | NU | AT | QC | ON | MB | SK | AB | BC | NT | YT | HQ | Total |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Education | 1 | 65 | 133 | 353 | 310 | 289 | 277 | 255 | 2 | 4 | 8 | 1,697 |
| Elementary and Secondary Education | 1 | 48 | 110 | 268 | 263 | 229 | 233 | 203 | 2 | 2 | 8 | 1,367 |
| Post-Secondary Education | | 17 | 23 | 85 | 47 | 60 | 44 | 52 | | 2 | | 330 |
| Social Development | | 121 | 150 | 285 | 357 | 254 | 315 | 185 | | 25 | | 1,692 |
| Assisted Living | | 7 | 14 | 11 | 29 | 13 | 8 | 11 | | 5 | | 98 |
| Family Violence | | 3 | 4 | 6 | 3 | 3 | 10 | 4 | | 1 | | 34 |
| First Nation Child and Family Service | | 38 | 67 | 131 | 131 | 80 | 137 | 58 | | 8 | | 650** |
| Income Assistance | | 73 | 65 | 126 | 194 | 137 | 152 | 103 | | 9 | | 859 |
| National Child Benefit Re-Investment | | | | 11 | | 21 | 8 | 9 | | 2 | | 51 |
| Community Infrastructure | | 43 | 82 | 244 | 206 | 144 | 141 | 156 | | 10 | 13 | 1,039 |
| Infrastructure Assets & Facilities | | 27 | 33 | 90 | 89 | 42 | 48 | 69 | | 3 | 13 | 414 |
| Education Facilities | | 3 | 13 | 66 | 45 | 34 | 34 | 20 | | | | 215 |
| Housing | | 2 | 9 | 27 | 19 | 20 | 20 | 17 | | 3 | | 117 |
| Renewable Energy and Energy Efficient | | | | 1 | | | | 1 | | | | 2 |
| Water and Wastewater Infrastructure | | 11 | 27 | 60 | 53 | 48 | 39 | 49 | | 4 | | 291 |
| Other AANDC G&C Programs | 3 | 115 | 89 | 137 | 102 | 115 | 80 | 279 | 34 | 111 | 1,234 | 2,299 |
| Total | 4 | 344 | 454 | 1,019 | 975 | 802 | 813 | 875 | 36 | 150 | 1,255 | 6,727 |
* Data compiled from GCIMS System in January 2014.
** FNCFS program spending includes $24M of funds spent on a small number of other social services, such as day care services.
 

AANDC's transfer payment programs are administered in accordance with the Treasury Board Policy on Transfer Payments and Directive on Transfer Payments, which took effect on October 1, 2008. The Policy on Transfer Payments sets the expectation that risk-based approaches be applied to the design of transfer payment programs; the preparation of terms and conditions and funding agreements; and recipient monitoring and auditing. The objective of the Policy and Directive is to ensure that transfer payment programs are managed with integrity, transparency and accountability, taking into account the risks, and that programs are effectively focused on citizens and beneficiaries and are designed to achieve Federal Government priorities and expected results.

In order to meet the expectations of the Policy and Directive, AANDC's Chief Financial Officer (CFO) Sector established the Transfer Payments Centre of Expertise, which has put in place the Management Control Framework (MCF) for G&C to enable the Department to effectively manage and monitor G&C programs and ensure compliance with the Policy and Directive. AANDC's CFO is accountable for the overall management of transfer payment funds and, as such, is the custodian of the MCF.

The MCF establishes roles and responsibilities for the delivery of G&C, specifically with respect to program management (the design and implementation of a program) and transfer payment management (operations of a program with recipients). The MCF for G&C represents the departmental expectations of how G&C are to be managed across regions and at headquarters (HQ), and includes controls grouped into the following four areas:

  • program design and approval;
  • program monitoring and reporting;
  • funding agreement development; and,
  • transfer payment monitoring and reporting.

The audit covered aspects of all four control areas, as program control frameworks are subject to the Department's program design and approval process and include controls related to the other three areas (program monitoring and reporting; risk-based funding agreement development; and risk-based recipient reporting and monitoring).

In support of the Department's administration of G&C programming, AANDC collects reports containing financial and non-financial information from recipients. This information serves a variety of purposes, including supporting performance measurement, compliance activities and the Department's stewardship and accountability reporting obligations.

 

 

2. Audit Objective and Scope

2.1 Audit Objectives

The objectives of the audit were to assess:

  1. the adequacy and effectiveness of departmental processes in supporting the design and approval of risk-based program control frameworks; and,
  2. the adequacy and effectiveness of controls governing and supporting the collection and use of recipient reporting.

2.2 Audit Scope

The scope of the audit covered the period April 1, 2011 to December 31, 2013 and included assessments of:

  • departmental processes which govern the development and approval of program control frameworks, giving particular consideration to whether these frameworks consider program and recipient risks and are designed to achieve maximum integration with other AANDC transfer payment management processes, tools and systems;
  • departmental processes for managing the design, approval and rationalization of recipient reporting requirements, while ensuring that sufficient information is being gathered to support the Department’s stewardship and accountability reporting obligations;
  • processes for collecting, reviewing and analyzing recipient reports for a sample of three First Nation-focused programs (Capital Facilities and Maintenance Program; First Nations Child and Family Services Program; and Post-Secondary Education Programs). See Section 3.1 for an overview of the methodology used to select the sample;
  • processes for using information gathered from the analysis of recipient reports in making risk-informed decisions, including decisions related to: developing future agreements and defining the nature of the relationship with the recipient; providing support to First Nation recipients; monitoring implementation of First Nation programs and fulfillment of agreement obligations; and, increasing or decreasing future reporting obligations; and,
  • regional delivery structures and human resource levels for the administration of grant and contribution programs in AANDC southern regions.
 

 

3. Approach and Methodology

The audit was conducted in accordance with the requirements of the Treasury Board Policy on Internal Audit and followed the Internal Auditing Standards for the Government of Canada. The audit examined sufficient, relevant evidence to provide a reasonable level of assurance in support of the audit conclusion.

The principal audit techniques used included interviews with departmental officials at HQ and in regional offices, examination of documentation, analysis of departmental data, and testing of samples of recipient files and reports.

In order to develop a sampling methodology that addressed the audit criteria, as identified in Appendix A, a sample of programs, regions and recipients was selected for testing. The following outlines the approach used to select samples from each of the three categories.

3.1 Selection of Programs to Audit

The first element of the sampling methodology considers the programs to be selected for testing. Factors in the selection of the programs to be sampled included the following:

  • Size (dollar value) of programs;
  • Recentness of program design/redesign initiatives; and,
  • Coverage across program sectors.

Based on the above analysis, and with the objective of assessing program control frameworks and recipient reporting requirements horizontally across regions and programs, the following programs were selected for on-site field testing at HQ and in the regions:

  • Community Infrastructure Program (Capital Facilities and Maintenance Program);
  • First Nations Child and Family Services Program (FNCFS program); and,
  • Post-Secondary Education Programs (Post-Secondary Partnerships Program ("PSPP"), Post-Secondary Student Support Program ("PSSSP") and the University and College Entrance Preparation Program (UCEPP)).

3.2 Selection of Regions for Site Visits

The second element of the sampling methodology considered the regional offices to be visited. Factors in the selection of the regions to be visited included the following:

  • Size (dollar value) of selected programs (selected above) funded by regional offices; and,
  • Feedback obtained during the planning phase by individuals interviewed at HQ and regions.

Based on the above analysis, and with the objective of assessing program control frameworks and recipient reporting requirements horizontally across regions and programs, the following regional offices were selected for on-site field testing:

  • Ontario (January 10-14, 2014);
  • British Columbia (January 27-31, 2014); and,
  • Manitoba (February 10-13, 2014).

To gain an understanding of how all regions are adapting their organizational structures and staff roles, we expanded our interviews from the three regions visited to also include teleconferences with senior departmental officials in the Alberta, Saskatchewan and Quebec regions. The audit also included some telephone discussions with a selection of officials from First Nation communities, identified in consultation with regional offices.

 

 

4. Conclusion

The audit found that the Department does not take a horizontal approach to designing, approving or implementing program control frameworks. While the three programs examined had introduced some of the elements that would be expected of a program control framework, there was considerable opportunity to improve the consistency of approach across programs and the thoroughness of implementation in regions.

Although the audit found that the Department has made progress in reducing recipient reporting requirements for the programs included in the scope of the audit, the data collected from recipients is not being used to its full potential, nor are risk-based approaches being consistently used to target attention at areas of greatest need (e.g. limited risk-based reporting, risk-based compliance reviews and risk-based investments in community development / case management).

While regions are proactively adjusting their organizational structures and internal role assignments to meet changing program and systems requirements, each region is evolving independently of the others, making implementation of HQ-driven program control frameworks and systems challenging. An added complexity is that regions have made different levels of investment in program delivery staff and transfer payment management staff, resulting in varying degrees of program implementation across regions.

 

 

5. Findings and Recommendations

Based on the evidence gathered through examination of documentation, interviews and analysis, each audit criterion was assessed and concluded upon. Where a significant difference between the audit criterion and the observed practice was found, the risk of the gap was evaluated and used to develop the conclusion and corresponding recommendations for improvement.

5.1 Design of Program Control Frameworks

The audit included examination of the design and approval processes used by three programs which have recently undergone changes to their program control frameworks. The three programs and the changes undertaken in each include:

  • The Community Infrastructure (CI) Program updated its program control framework for major capital investments in 2009, moving to a national First Nations Infrastructure Investment Plan and introducing the Integrated Capital Management System (ICMS) to automate business processes and improve tracking of First Nations' infrastructure inventories and conditions. In 2013, an update was undertaken of the CI sub-program that funds assessments of the condition of First Nation infrastructure assets (Asset Condition Reporting System (ACRS) assessments).
  • First Nations Child and Family Services (FNCFS) Program is one of several social programs funded by AANDC. In 2012-13, the program control framework for all social programs was updated, and some national requirements were instituted for FNCFS, including the introduction of a FNCFS Annual Business Plan and adjustments to the Consolidated Annual Report for all child and family services agencies. FNCFS is also in the process of introducing a new national system to capture program data (FNCFS Information Management System).
  • The Post-Secondary Partnerships Program (PSPP), which funds post-secondary institutions to design and deliver university and college level courses tailored for First Nations and Inuit students, was updated for an April 2014 roll-out. Changes included a streamlined proposal intake and assessment process and a focus on programming in areas with high demand for labour. Also, a new information system and reporting regime was implemented in 2013-14 for the Post-Secondary Student Support Program (PSSSP) and the University and College Entrance Preparation Program (UCEPP). The Education Information System includes, among other things, a database for entering information collected on First Nations students who receive funding to pursue post-secondary education.

In each program control framework, we expected to find policy requirements, processes, implementation aids for regions and recipients, risk-based decision-making frameworks, learning and development strategies and plans, recipient compliance monitoring processes, analysis and reporting to support regional program decisions, and a regime of quality assurance and continuous improvement activities performed by the HQ program. The expected elements of a program control framework are depicted in Exhibit 1.

Exhibit 1 – Expected Elements of Program Control Framework

Description of Exhibit 1

Exhibit 1 depicts a circular flow diagram that outlines the four core elements of a program control framework and the expected program control aspects for each element. The four core elements of program control frameworks are as follows: program design; program implementation; monitoring and measuring performance; and quality assurance and improvement.

Program Design: The expected control aspects for the program design element are: Policy; Directives; and, a Risk-based Program Regime.

Program Implementation: The expected control aspects for the program implementation element are: Guides/Aids for Regions; Guides/Aids for Recipients; and, Learning and Development.

Monitoring and Measuring Performance: The expected control aspects for the monitoring and measuring performance element are: Recipient Compliance Monitoring; and, Regional Program Reporting.

Quality Assurance and Improvement: The expected control aspects for the quality assurance and improvement element are: Program Quality Assurance; and, Continuous Improvement.


 

As detailed in Table 2, we found that all three programs had significant gaps in their program control frameworks. While all three programs have established national-level policies that were subjected to senior-level review and approval, they were generally lacking in most other areas. Community Infrastructure had a reasonably complete control framework for its major capital projects and triennial assessments of on-reserve infrastructure (ACRS), but its operations and maintenance and minor capital projects were not well covered. The FNCFS program has introduced a risk-based compliance monitoring tool; however, we found that it had not been implemented in the regions we visited. FNCFS also funds evaluations of FNCFS agencies on a three-year cycle with a view to supporting continuous improvement. While these reviews were being completed by agencies, we did not see strong examples of the results being used to inform program decisions at the AANDC regional and program levels.

Table 2: State of Program Control Frameworks (PCF)

| PCF Element | FNCFS | Community Infrastructure | PSE |
| --- | --- | --- | --- |
| Policy/Directive | Yes | Yes | Yes |
| Processes | Partial | Partial | Partial |
| Risk-based Program Regime | Partial | No | No |
| Implementation Guides/Aids for Regions | No | Partial | No |
| Recipient Guides/Aids | No | No | No |
| Learning and Development | No | No | No |
| Recipient Compliance Monitoring | No | Partial | No |
| Regional Program Reporting | No | Partial | No |
| Program Quality Assurance | No | No | No |
| Continuous Improvement | Partial | Yes | No |
 

Based on our review of the design, change and approval processes applied to each program and our discussions with departmental officials, we determined that processes and responsibilities for undertaking program changes lack clarity. Firstly, involvement of the Chief Financial Officer (CFO) Sector, Regional Operations (RO) Sector and other corporate functions varied from program to program. Secondly, it remained unclear who was responsible for supporting and challenging the majority of program control frameworks. While the CFO Sector's Transfer Payments Centre of Expertise has core competencies in the areas covered by its financial- and agreement-focused directives, its expertise is, by its own account, not in program design and implementation. Likewise, while the RO Sector's Planning and Business Integration Directorate has some capacity to work with programs undertaking major changes, it lacks the capacity and depth to support horizontal implementation and is not well positioned in the organization to perform a challenge function on new and augmented program control frameworks.

All program and regional managers interviewed expressed the need for a common departmental approach to the development and implementation of program control frameworks. This includes having common elements (e.g. policy, directives, processes, funding approaches, risk-based agreement management regimes, guides, tools, learning, program monitoring, recipient compliance, quality assurance and continuous improvement), a common look and feel, common communication platforms for dissemination to regions and First Nations, and a consistent annual cycle for rolling out changes to regions and First Nations. The audit found that, in the absence of a clear departmental approach, programs concentrated predominantly on developing policies and rules, providing limited support to regions and First Nations when they implement the programs. Some regions had evolved strong processes and guidance, but the approach was inconsistent across the regions visited for the programs included in the scope of our audit.

Through review of program design processes and interviews, we noted that, while programs have traditionally invested most of their salary budgets in policy development expertise, some are moving to increase their program implementation expertise. However, all programs examined in our audit could benefit from shifting resources to hands-on implementation expertise and from making better use of regional expertise to design practical processes, guides, tools and training materials in support of program design and implementation.

5.2 Regional Delivery Structures and Staff Roles

Our audit found that regions have been adapting their delivery structures and internal staff role assignments to address changing program control frameworks, systems, and transfer payment management processes being rolled out by HQ. To gain an understanding of how all regions are adapting their organizational structures and staff roles, we expanded our interviews from the three regions visited to also include teleconferences with senior officials in the Alberta, Saskatchewan and Quebec regions.

We found that each region is adapting its delivery model in different ways, with most regions moving responsibilities traditionally performed by Funding Services Officers (FSOs) to other agreement and program administration personnel (i.e. agreement development, management of recipient budgets and payment schedules, entering recipient reports into systems, assessing recipient reports to determine whether minimum program requirements have been met, and monitoring recipient compliance). In most regions, the FSOs continue to be the first line of communication with First Nations, are responsible for completing general assessments of recipients, work with communities in crisis, work with recipients in developing default remediation plans, and follow-up with recipients when reports are submitted late. In some regions, the FSOs retain some agreement management and program management responsibilities (e.g. CFS in Manitoba, minor capital projects in Ontario, capacity development programs in Quebec, assessment of select program reports in some regions, etc.).

The extent to which regions have established dedicated business units and created positions for agreement and program administration tasks appears to be linked in part to their relative size, with smaller regions having fewer executive and manager positions within which to manoeuvre (i.e. maintaining appropriate spans of control for managers and Directors has an impact on organizational design). Appendix E includes an analysis of where the six types of regional staff are located in their respective regions.

The audit found that the differences in regional delivery structures and staff roles add complexity to the roll-out of program control frameworks, new systems, reporting changes and updates to transfer payment processes (e.g. compliance activities, recipient reinvestment plans for surplus funds on fixed agreements). We found that HQ program managers and other HQ-based functions are not always clear about whom to consult or brief in regions on changes and ongoing implementation issues, often providing briefings to regional program managers when it is the FSOs, heads of data units, or transfer payment management personnel who would benefit most from being consulted or briefed. We also found that position classification categories (e.g. FI vs. AS vs. CR) and/or classification levels (e.g. AS-05 vs. AS-02) varied from region to region for certain roles, making it challenging for programs and the CFO Sector to gauge the level of guidance, training and oversight required by regional staff (see Table 4 later in this section).

To better understand the capacity of regions to implement program control frameworks, we analyzed each region's organizational design, resourcing levels and role assignments, focusing on three program areas: social programs; education programs; and infrastructure programs. For purposes of our analysis, we grouped staff into six categories (Program Officers, Funding Services Officers, Other Transfer Payment Management Personnel, Data Entry Clerks, Community Development Officers, and Compliance Officers) according to the functions they perform. These functions do not necessarily equate to position titles or organizational units that bear similar titles. For example, positions in Education directorates that are focused on data entry or compliance are included in the Data Entry Clerks and Compliance Officers categories, while program analysts, program officers, managers, directors, and administrative staff in education directorates are included in the Education category. This distinction was necessary because organizational design and allocation of responsibilities differ from region to region.

Table 3: Comparison of Regional Staffing Levels for Certain Transfer Payment Management Functions ***

| Function | AT | QC | ON | MB | SK | AB | BC |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Funding Services (FSOs, Managers and Directors) | 10.5 | 11 | 25 | 26 | 24.5 | 22.5 | 28 |
| Transfer Payment Management (Details in Table 4) | 5 | 10 | 13 | 12 | 11 | 10 | 11 |
| Data Entry | 0 | 2 | 15 | 4 | 17 | 6 | 10 |
| Community Development | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
| Compliance | 3 | 3 | 7 | 6 | 0 | 1 | 6 |
| Program Administration (Details in Table 5): | | | | | | | |
| Infrastructure | 5 | 12 | 45 | 20 | 21.5 | 20.5 | 38 |
| Education | 7 | 8 | 10 | 7.5 | 4 | 14 | 5 |
| Social | 11 | 8 | 9 | 6.5 | 10 | 23 | 15 |
| Total | 41.5 | 54 | 124 | 82 | 88 | 97 | 121 |
*** Data employed in our analysis was extracted from the AANDC HR information system, compiled based on the results of our audit, and validated based on information from regional officials. Where a Director had responsibilities in both funding services and programs, their position was distributed equally between the two role groups.
 

Table 3 compares the resourcing levels for each of the program and agreement management functions and highlights that certain regions have invested more resources in some functions than others. Considering that regional program management staff with responsibility for supporting education and social programs are typically assigned multiple programs, we were unable to delineate our data beyond the program cluster (i.e. figures presented in the table for "Education" and "Social" include resources devoted to all education and social programs).

Table 4 below shows the classification categories and levels for transfer payment management and administration personnel who perform agreement management tasks such as agreement development, budget management and quality reviews. Notably, the Alberta region leverages financial professionals to support the financial aspects of agreement development and management, whereas other regions employ administrative professionals and clerical personnel for these functions. Additionally, some regions (e.g. Quebec and Saskatchewan) tend to rely more heavily on positions with higher classification levels.

Table 4: Resource Levels and Position Classifications for Transfer Payment Personnel ****

| Classification Group | AT | QC | ON | MB | SK | AB | BC |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Financial and Audit Classifications | | | | | | 1.5 FI-03, 0.5 FI-02, 6 FI-01 | |
| Program Administration Classifications | 1 PM-06 | 1 PM-06, 3 PM-05 | 1 PM-06, 2 PM-02 | 1 PM-06 | 3 PM-06, 5 PM-05 | | 1 PM-04 |
| Administrative and Clerical Classifications | 3 AS-03, 1 CR-04 | 1 AS-03, 1 AS-02, 4 CR-04 | 3 AS-03, 1 AS-02, 4 AS-01, 2 CR-04 | 2 AS-03, 2 AS-02, 2 AS-01, 5 CR-04 | 1 AS-04, 2 AS-02 | 2 CR-05 | 1 AS-05, 2 AS-04, 1 AS-03, 5 AS-02, 1 CR-04 |
| Total | 5 | 10 | 13 | 12 | 11 | 10 | 11 |
**** For purposes of our analysis, Transfer Payment Management Personnel included individuals with responsibilities such as agreement development, cash flow management and administration of budget adjustments. It excluded compliance officers, data entry clerks, program officers, funding services officers and community development officers. Position data extracted from AANDC HR information system and compiled based on consultations with regional officials.
 

Considering that recipient demographics vary across regional offices, we assessed regional capacity according to three additional factors: the amount of G&C program dollars administered by the region; the number of First Nations and Tribal Councils with whom the region administers funding agreements; and the percentage of recipients (First Nations and Tribal Councils) in the region that are in default of their agreements and are undergoing some form of remedial action. This analysis highlighted additional discrepancies in regional capacity levels and is included in Appendix C.

5.3 Implementation of Program Control Frameworks in Regions

In section 5.1, we presented our findings related to the design of new and amended program control frameworks and highlighted that elements of the frameworks intended to support regional and community level implementation of programs were lacking. In section 5.2, we presented our analysis and findings on regional delivery structures and highlighted how they vary considerably across regions. In this section, we examine how these two findings intersect and impact the effectiveness of the implementation of program control frameworks.

In general, we found that regions have adopted different approaches to implementing the three programs examined in the scope of the audit. Considering that FNCFS programming follows provincial models and standards, comparisons across regions are not meaningful for that program. Accordingly, we focused our comparative analysis on the Community Infrastructure and Post-Secondary Education programs, which are not striving to mirror provincial models and do permit meaningful comparison across regions.

5.3.1 Implementation of the Infrastructure Program Control Framework

Infrastructure programming is managed according to four main program areas: major capital projects (new infrastructure or improvements valued at $1.5 million or more); minor capital projects (new infrastructure or improvements valued at less than $1.5 million); operations and maintenance funding to maintain infrastructure; and triennial assessments of infrastructure conditions (referred to by AANDC as Asset Condition Reporting System assessments). For all three regions visited (Ontario, Manitoba and British Columbia), we found that major capital projects ($1.5 to $10 million) and triennial asset condition assessments were being managed according to national AANDC protocols. Regional approaches for delivering minor capital programming varied considerably from one region to the next, with one region applying rigorous oversight and another providing formula-based funding allocations to all communities with relatively little monitoring of the projects or verification of completion.

Two of the regions visited applied full oversight to all minor capital projects funded (from $100 to $1.5 million) to ensure that funds were being used for the intended purpose, with no consideration of performing less work where a recipient had a proven track record of delivering projects on time and within scope and budget. Conversely, one region performed little oversight of minor capital projects at the time of the audit, again with little to no consideration of recipient and project risk or a recipient's track record of success. In consultation with the HQ program, this same region was planning to move from a formula-based minor capital program to a proposal-based approach in 2014-15. Our discussions with the HQ program, regional staff and First Nations officials suggested that the impact of this new approach on First Nations could have been better considered. First Nations were asked in January 2014 to submit one-page proposals for all potential projects on their five-year infrastructure investment plans for consideration by the Department.

Based on interviews with program officers and managers and our review of a selection of infrastructure project files, we found that slightly different philosophies had evolved in regions about the role of the Department vis-à-vis infrastructure, and sometimes different philosophies existed within a region. These philosophies ranged from a belief that AANDC Capital Officers must monitor project implementation to ensure that recipient funds are being expended as agreed upon with AANDC, to a concern that such oversight might interfere with the First Nation's responsibility to manage its own projects.

Table 5 below shows the breakdown of positions across regions and demonstrates that some regions invest more heavily in engineering and technical experts, while others invest more in project management and administrative capacity.

Table 5: Personnel Levels and Classifications for Infrastructure Management and Staff *****

| Classification Group | AT | QC | ON | MB | SK | AB | BC |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Executive Classifications | | | 2 Staff (EX-01, ENG-06) | 1 Staff | .25 Staff | .5 Staff | 1 Staff |
| Engineering and Technical Classifications | 4 Staff (ENG-05, ENG-04) | 5 Staff (EG-05, EG-03, ENG-04, ENG-03) | 16 Staff (EG-07, EG-06, 2 ENG-05, ENG-04, ENG-02) | 6 Staff (ENG-04) | 10 Staff (EG-06, EG-04, ENG-05, ENG-04, ENG-03) | 11 Staff (EG-06, ENG-06, ENG-04) | 22 Staff (EG-06, EG-04, EG-03, ENG-05, 12 ENG-04, ENG-03, PC-03, PC-02) |
| Program Management Classifications | | 5 Staff (PM-06, PM-05, PM-04, 1 PM-02) | 25 Staff (PM-06, PM-05, 13 PM-04, PM-03, PM-02) | 7 Staff (PM-06, PM-04) | 8 Staff (PM-06, PM-04) | 6 Staff (PM-06, PM-05, PM-04) | 12 Staff (PM-06, PM-05, PM-04, PM-03) |
| Administrative and Clerical Classifications | 1 Staff (AS-03) | 2 Staff (AS-02, CR-04) | 2 Staff (AS-01, CR-04) | 6 Staff (AS-03, AS-02, CR-04, CR-03) | 3.25 Staff (AS-01, 2.25 CR-04) | 3 Staff (AS-02) | 3 Staff (AS-01, CR-05, CR-04) |
| Total | 5 | 12 | 45 | 20 | 21.5 | 20.5 | 38 |
***** Position data extracted from AANDC HR information system and compiled based on consultations with regional officials.
 

The differences in philosophy and delivery approaches across regions reinforce the need for national processes, tools and training to ensure that First Nations are receiving comparable services and opportunities. While the capacity to prepare and implement these elements of the major capital program control framework is strongest in regions, strong coordination and support would be required from the HQ program. One region visited had addressed many of the gaps in the control framework by developing its own processes and guides for AANDC Capital Officers and recipient project managers.

The regions visited all calculated funding levels for infrastructure operations and maintenance based on the number and types of infrastructure assets present in each community and in accordance with maintenance cost schedules. While there was national consistency in the funding approach, there was little to no national direction or guidance on how to manage the implementation of this program component. Accordingly, none of the regions we visited had instituted risk-based monitoring of whether operations and maintenance funding was being spent on the upkeep of key infrastructure assets. We expected to find risk-based practices that placed greater focus on recipients with a history of depleting their infrastructure assets at an accelerated pace when compared against depreciation benchmarks, and less focus on recipients with a proven track record of maintaining their infrastructure assets. We found no evidence of a risk-based approach to funding recipients or to monitoring recipient use of operations and maintenance funding.
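
To illustrate the formula-based funding approach described above, the following minimal sketch computes an operations and maintenance allocation from a community's asset inventory; the asset types and annual rates are hypothetical stand-ins for AANDC's actual maintenance cost schedules.

```python
# Hypothetical maintenance cost schedule: annual O&M rate per unit of each asset type.
MAINTENANCE_COST_SCHEDULE = {
    "school": 120_000,
    "water_treatment_plant": 250_000,
    "road_km": 4_000,
}

def om_allocation(community_assets: dict) -> int:
    """Formula-based O&M funding: sum of (asset count x annual schedule rate)."""
    return sum(
        count * MAINTENANCE_COST_SCHEDULE[asset_type]
        for asset_type, count in community_assets.items()
    )

# Example: a community with one school, one water treatment plant and 25 km of road.
print(om_allocation({"school": 1, "water_treatment_plant": 1, "road_km": 25}))  # 470000
```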

5.3.2 Implementation of the Post-secondary Education Program Control Framework

Post-secondary education program funding is provided to post-secondary institutions through the Post-Secondary Partnerships Program (PSPP) and to students of First Nation communities through the Post-Secondary Student Support Program (PSSSP) and the University and College Entrance Preparation Program (UCEPP).

The audit found that national protocols are being rolled out for the 2014-15 fiscal year to guide allocation decisions for PSPP funding provided to institutions, and that delivery of the student-focused programs has been left largely to the regions. The PSPP is being streamlined by focusing federal investments on developing and updating courses in disciplines that are most likely to improve labour market participation for graduates. Starting in 2014, the PSPP is also moving to a streamlined national intake process with two annual fixed proposal intake dates, March 31 and April 30. We believe that these program changes will help to better align investments with areas of government priority and foster consistency and efficiency across regions. Notwithstanding these improvements, we observed that the implementation of this new program approach was expedited, leaving regional offices and institutions only three months to react and prepare for the new direction. This example reinforces the importance of a clear annual cycle for rolling out major program changes, carried out in a consistent manner with due time allotted for identifying and addressing the implementation risks and challenges of regions and recipients.

The audit found that the student-focused PSSSP and UCEPP programs are administered by First Nations and Tribal Councils, with little in the way of national or regional program control frameworks. The HQ Education Program has developed national protocols on student and expenditure eligibility; however, there are no national protocols or guidelines in place to guide regions in the risk-based monitoring of agreements, in providing support to First Nations with the administration of their programs, or in the management of surplus funds. Further, based on our review of the few national protocols that were in place, we found that they do not align with the government priority of encouraging employment training, as they restrict any student who has previously obtained a university degree from pursuing a college program that could improve their employment prospects. For example, a student who has successfully completed a university degree under the program would not qualify for further funding for a college program, but would qualify for funding for a Master's level program.

We also observed that regions perform no monitoring of post-secondary funding and that there were no national guidelines for managing recipient surpluses. A best practice was noted in one region visited, whereby recipients who had not spent their full allocation in the prior year were required to submit proposals demonstrating that they had enough students to utilize their full allocation during the current fiscal year. If they were unable to demonstrate this, their funding was limited to the greater of their prior-year spending and the amount they demonstrated they could spend through their proposal. Any funding not claimed by these recipients is made available to all other regional First Nations recipients through a call for proposals.
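
As a minimal sketch of this regional practice, the following hypothetical function applies the funding cap described above; the names and figures are illustrative only, not a departmental system.

```python
def adjusted_allocation(base_allocation: float,
                        prior_year_spending: float,
                        demonstrated_amount: float) -> float:
    """Cap funding for recipients who lapsed funds in the prior year.

    Recipients who spent their full allocation keep it; recipients who
    demonstrate (via proposal) that they can use the full allocation also
    keep it; otherwise funding is limited to the greater of prior-year
    spending and the amount demonstrated in the proposal.
    """
    if prior_year_spending >= base_allocation:
        return base_allocation  # no surplus in the prior year
    if demonstrated_amount >= base_allocation:
        return base_allocation  # proposal supports the full allocation
    return max(prior_year_spending, demonstrated_amount)

# Example: a $500K allocation, $350K spent last year, proposal supporting $420K.
print(adjusted_allocation(500_000, 350_000, 420_000))  # 420000
```

The difference between the base allocation and the capped amount is what the region re-offers to other First Nations recipients through its call for proposals.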

The absence of a national program control framework and more current guidelines for PSSSP and UCEPP has led regions to devise their own approaches to program management, resulting in inconsistency across regions. Similar to infrastructure programming, we found that regional staff had evolved different philosophies on how post-secondary funding should be administered. Some regional staff saw it as appropriate to continue funding at historical levels even though First Nations regularly generate surpluses, while others saw it as their duty to ensure that First Nations spend their annual allocated post-secondary funding on post-secondary programming. All regions included in the audit had resigned themselves to not performing compliance verification of PSSSP and UCEPP, citing a lack of resources and/or a shift of compliance resources to other programs.

The findings vis-à-vis regional implementation of the PSSSP and UCEPP program control frameworks and the differing philosophies of regional staff reinforce the importance of national processes for allocating program funds, performing risk-based compliance reviews, supporting First Nations with program administration and managing program surpluses, as well as national training and guidelines for regional staff administering these programs.

5.4 Collection and Use of Recipient Reporting

In addition to assessing the design and approval of new and modified reporting requirements, covered in section 5.1 as part of our assessment of the design and approval of program control frameworks, the audit assessed the Department’s controls for the collection and use of recipient reporting. To this end, we performed sample testing in three regions to examine whether regional personnel are using recipient reports for recipient-level decision making. We also examined the issue from the perspective of the HQ program and regional program managers to determine whether they are supporting regional personnel with the necessary processes, training and systems to enable the effective and efficient use of reporting.

We found that the Department has been actively engaged over the past three fiscal years in evaluating the need for the data being collected from recipients and introducing new program information systems (Integrated Capital Management System (ICMS), Education Information System (EIS) and FNCFS Information Management System (IMS)) to streamline collection and processing of recipient data. Based on information obtained from departmental officials, we understand that the number of Data Collection Instruments collected by AANDC has been reduced considerably for the three program areas in question as a result of a commitment by the Government of Canada to reduce recipient reporting burden.

Table 6 shows that the number of distinct reports requested of recipients, of all reporting intervals, has been reduced by approximately two-thirds between 2011-12 and 2014-15. Note, however, that some of the reports required in prior years were merged over the period in question, so the reduction in the number of reports may be higher than the reduction in data points being collected. We were not able to obtain an analysis of the number of data points being collected year-over-year.

Table 6: Decrease in the Number of Distinct Reports Required of Recipients, 2011-12 to 2014-15**

| Program Cluster | 2011-12 | 2012-13 | 2013-14 | 2014-15 |
| --- | --- | --- | --- | --- |
| Social | 42 | 23 | 20 | 10 |
| Education | 26 | 17 | 13 | 9 |
| Community Infrastructure | 15 | 17 | 17 | 5 |
** Data compiled from GCIMS System in January 2014.
 

At the time of our audit, two of these systems (EIS and the FNCFS IMS) were in the process of implementing and debugging their data collection modules and had not yet implemented advanced data analysis and reporting functionalities. The data collection modules for the third system, ICMS, were fully implemented in all regions visited, and some system-generated reports were available to assist in managing recipient agreements. Based on our review of a sample of reports in each region and discussions with regional and HQ program officials, we found that all three systems are leading to marked improvements in the completeness and accuracy of data collected from recipients and entered into systems. This focus on data completeness and accuracy has translated into an increased demand on many First Nations, who may not have been submitting complete or accurate reports in prior years.

Based on our review of documentation and discussions with regional and HQ program staff, we observed that program information systems focus more on gathering performance information for HQ programs and less on reporting designed for regional program delivery staff (e.g. there are few standard reports for program managers, program advisors and recipients intended to support ongoing management of the program). We identified opportunities in all programs to leverage the information systems and recipient data to improve risk-based decisions regarding the various agreement management approaches. For example, ICMS tracks what infrastructure is located in each community and the results of triennial inspections of that infrastructure; together, this information could be used to assess whether a given community's infrastructure is depreciating at an expected rate and could inform risk-based monitoring, management and investment decisions. As another example, the post-secondary module of EIS collects transactional information that could be used to run automated scripts designed to identify potential anomalies (e.g. students funded from two sources, First Nations communities with inconsistent per-student funding levels, or First Nations who are not reporting sufficient transactional data to account for all of their funding).
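
As an illustration of the kind of automated anomaly script contemplated above, the following minimal sketch flags students who appear in funding records from more than one source; the record layout (student_id, funding_source) is hypothetical, not the actual EIS schema.

```python
from collections import defaultdict

def students_with_multiple_funders(records):
    """Return {student_id: set_of_sources} for students funded from 2+ sources."""
    sources = defaultdict(set)
    for record in records:
        sources[record["student_id"]].add(record["funding_source"])
    return {sid: srcs for sid, srcs in sources.items() if len(srcs) > 1}

# Hypothetical extract of post-secondary funding records.
records = [
    {"student_id": "S001", "funding_source": "First Nation A"},
    {"student_id": "S001", "funding_source": "First Nation B"},  # potential anomaly
    {"student_id": "S002", "funding_source": "First Nation A"},
]
print(students_with_multiple_funders(records))  # {'S001': {...two sources...}}
```

Similar scripts could compare per-student funding levels across communities or reconcile reported transactions against total funding received.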

Our regional fieldwork highlighted that AANDC FSOs and Data Entry Clerks devote considerable amounts of time to following up with recipients on late reports. We extracted information from the AANDC Grants and Contributions Information Management System (GCIMS) on all reports requested of and received from recipients in 2012-13. We found that only 18% of 2012-13 AANDC reports were received on time, 68% were received late, 8% had not been received as of February 1, 2014, and 6% had been cancelled and deemed unobtainable. As detailed in Exhibit 2, these results varied only slightly for the three main program clusters analyzed.

Exhibit 2 – Report Lateness by Program

Description of Exhibit 2

The stacked histogram shown in Exhibit 2 depicts, by program and overall, the percentage of recipient reports received on time, received late, not received and cancelled. The three programs being compared in Exhibit 2 are: Education; Social Development; and Community Infrastructure.

Education

  • Reports received on time: 10% (392 reports)
  • Reports received late: 74% (2,819 reports)
  • Reports not received: 6% (214 reports)
  • Cancelled reports: 10% (410 reports)
  • Total number of reports: 100% (3,835 reports)

Social Development

  • Reports received on time: 22% (2,601)
  • Reports received late: 69% (8,195)
  • Reports not received: 2% (291)
  • Cancelled reports: 7% (791)
  • Total number of reports: 100% (11,878)

Community Infrastructure

  • Reports received on time: 15% (1,149)
  • Reports received late: 64% (4,994)
  • Reports not received: 17% (1,308)
  • Cancelled reports: 4% (353)
  • Total number of reports: 100% (7,804)

All programs

  • Reports received on time: 18% (4,142)
  • Reports received late: 68% (16,008)
  • Reports not received: 8% (1,813)
  • Cancelled reports: 6% (1,554)
  • Total number of reports: 100% (23,517)

 

We also evaluated regional reporting to determine whether regions had different levels of success in collecting reports on time. Exhibit 3 shows that, while some regions fare slightly better than others, late reporting and cancellation of reports are problems across the country.

We performed additional trend and correlation analysis to determine whether reporting lateness was higher or lower when considering a recipient’s general assessment score, Community Wellbeing Index score, and population size, but found no strong correlation with any of these factors.
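
For illustration, a test of this kind can be run on a per-recipient table that pairs each recipient’s late-report rate with the candidate factors. The sketch below uses invented figures; the column names and values are assumptions for the example, not audit data.

  # A minimal sketch of the correlation test; all figures are invented.
  import pandas as pd

  recipients = pd.DataFrame({
      "late_rate":          [0.70, 0.55, 0.82, 0.40, 0.66],
      "general_assessment": [3.1, 2.4, 3.8, 1.9, 2.7],
      "cwb_index":          [58, 71, 49, 77, 63],
      "population":         [450, 1200, 300, 2100, 800],
  })

  # Spearman rank correlation tolerates non-linear but monotonic
  # relationships; values near zero indicate no strong association.
  print(recipients.corr(method="spearman")["late_rate"])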

The high frequency of reports that are cancelled and/or not received, coupled with our testing observations that regional personnel are not consistently using data from reports to make risk-based decisions, highlights the importance of re-evaluating whether the amount and type of information identified for collection is actually needed. At the time of writing our report, the Department was in the process of rolling out streamlined Data Collection Instruments (DCIs) to reduce the number of reports required of First Nations. The reductions are reflected in Table 6, found above in this section.

Report Lateness by Region
Description of Exhibit 3

The stacked bar chart shown in Exhibit 3 depicts, by region, for all recipient reports received within the Education, Social Development and Community Infrastructure programs, the percentage of recipient reports received on time, received late, not received and cancelled.

Nunavut:

  • Reports received on time: 6% (14)
  • Reports received late: 82% (188)
  • Reports not received: 11% (26)
  • Cancelled reports: 1% (1)
  • Total number of reports: 100% (229)

Atlantic:

  • Reports received on time: 11% (199)
  • Reports received late: 64% (1,181)
  • Reports not received: 18% (328)
  • Cancelled reports: 7% (132)
  • Total number of reports: 100% (1,840)

Quebec:

  • Reports received on time: 6% (122)
  • Reports received late: 59% (1,147)
  • Reports not received: 31% (603)
  • Cancelled reports: 4% (85)
  • Total number of reports: 100% (1,957)

Ontario:

  • Reports received on time: 26% (1,853)
  • Reports received late: 62% (4,452)
  • Reports not received: 8% (544)
  • Cancelled reports: 5% (368)
  • Total number of reports: 100% (7,217)

Manitoba:

  • Reports received on time: 8% (272)
  • Reports received late: 80% (2,645)
  • Reports not received: 9% (300)
  • Cancelled reports: 3% (96)
  • Total number of reports: 100% (3,313)

Saskatchewan:

  • Reports received on time: 21% (762)
  • Reports received late: 66% (2,425)
  • Reports not received: 7% (254)
  • Cancelled reports: 6% (233)
  • Total number of reports: 100% (3,674)

Alberta:

  • Reports received on time: 14% (307)
  • Reports received late: 70% (1,495)
  • Reports not received: 9% (184)
  • Cancelled reports: 6% (137)
  • Total number of reports: 100% (2,123)

British Columbia:

  • Reports received on time: 14% (1,105)
  • Reports received late: 70% (5,518)
  • Reports not received: 8% (609)
  • Cancelled reports: 8% (647)
  • Total number of reports: 100% (7,879)

Northwest Territories:

  • Reports received on time: 8% (46)
  • Reports received late: 79% (445)
  • Reports not received: 10% (58)
  • Cancelled reports: 3% (17)
  • Total number of reports: 100% (566)

Yukon:

  • Reports received on time: 11% (57)
  • Reports received late: 66% (364)
  • Reports not received: 8% (44)
  • Cancelled reports: 15% (85)
  • Total number of reports: 100% (550)

 

5.5 Horizontal Findings and Recommendations

On the basis of our findings and analysis presented in sub-sections 5.1 to 5.4, we concluded that five systemic challenges or barriers need to be addressed in order to improve the effectiveness of program control frameworks:

  • requirements and processes governing the design and approval of Program Control Frameworks are unclear;
  • program control frameworks include few risk-based reporting and management regimes;
  • recipient reports do not delineate between information required for performance measurement and information required to administer recipient agreements;
  • gaps in governance processes for new information systems lead to implementation challenges that could have been avoided; and,
  • inconsistencies in regional delivery models and capacity levels lead to inconsistent implementation of program control frameworks.

5.5.1 Development and Approval of Program Control Frameworks

The audit expected to find clear departmental processes and governance structures which support the design and approval of risk-based program control frameworks. Further, we expected to find that:

  1. program control frameworks contain consistent elements from one program to the next;
  2. it would be clear what changes to program control frameworks require approval outside of the program Sector;
  3. adequate support was made available to programs developing program control frameworks;
  4. appropriate challenge functions were in place to support approval, including from regions, the Chief Financial Officer Sector and the Regional Operations Sector;
  5. recipients were being engaged in a timely manner, when possible and appropriate, to identify potential implementation challenges;
  6. a common look and feel was employed for program control frameworks to promote ease of use in regions;
  7. a single window approach to communicating program control frameworks to regions and recipients was in place to promote their adoption in regions; and
  8. all program control frameworks promoted risk-based regimes for reporting, monitoring and other aspects of agreement management.

The audit found that a lack of consistency in the processes employed by programs to redesign program control frameworks, coupled with a significant number of recent and ongoing program and system changes, has resulted in unintended implementation challenges for most AANDC regions and for the few First Nations administrators we interviewed. While the Department has a Management Control Framework for Grants and Contributions that includes requirements for the design and approval of programs, it does not establish clear roles, responsibilities or protocols to govern the process. Further, there is no clear understanding of when changes are significant enough to warrant consultation and/or approval outside of the corresponding program Sector. As a result, each program creates program control frameworks according to its own understanding of what a program control framework should be, and consults and seeks approval in the manner it deems appropriate. For programs undergoing a redesign that requires Treasury Board approval of program terms and conditions, senior executives have ensured that approval occurs at a senior level. That said, it is not clear that Director General level committees (e.g. the Directors General Implementation and Operations Committee) or internal challenge functions have been sufficiently engaged in addressing implementation risks and challenges prior to recommending approval of program control frameworks to senior management committees.

An added challenge is that, while HQ programs have adequate policy expertise, our interviews found that they often lack program implementation expertise, making it difficult for them to design program control frameworks that meet regional implementation needs. The Department would benefit from aggregating some of its program design and implementation expertise into a central function that supports and challenges programs developing or amending program control frameworks and/or introducing changes to recipient reports. Presently, the Transfer Payments Centre of Expertise (TPCOE) within the CFO Sector considers the aspects of program control frameworks that relate to transfer payment policies and directives, but is not equipped to support and challenge the components of program control frameworks that lie outside these policies and directives, for example:

  • developing and maintaining protocols on the elements to be included in program control frameworks;
  • managing the process by which other functions in the Department review and sign-off on program control frameworks prior to tabling at senior management committees for approval;
  • establishing a common look and feel for program control frameworks;
  • providing support and advice to programs that are developing new or amended program control frameworks, including assisting with the writing of documents;
  • assuming a challenge role around the implementation of new recipient reporting requirements;
  • managing the dissemination of new recipient reporting templates to regions to align implementation to an annual cycle;
  • providing support and advice to programs on how to consult with regions and First Nations when implementing major system changes;
  • supporting timely consultation with regional staff and First Nations to identify potential implementation challenges and risks;
  • promoting sharing of best practices among regions and with HQ programs that are developing new or amended program control frameworks;
  • maintaining a portal through which program control frameworks are disseminated to regions and First Nations; and,
  • developing and maintaining competency-based learning regimes or frameworks (possibly using a development track) for certain regional roles (e.g. FSOs, Data Entry Clerks, Compliance Officers, Community Development or Case Management Officers, etc.).
Recommendations:

1. The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should review and clarify options within existing departmental processes, governance structures, accountabilities, responsibilities and authorities for developing and approving program control frameworks and establish a single window approach to communicating program control frameworks to regions and recipients.

2. The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should establish a function(s) to provide program design and program management expertise to HQ programs that are developing and implementing new and amended program control frameworks. This function could include a blend of existing expertise in program design and regional implementation with expertise in the development of risk-based program management regimes.

5.5.2 Risk-based Program Management

The audit found that, for the programs included in the scope of our audit, regions and programs are generally not implementing risk-based reporting and management regimes to target limited departmental resources on projects and recipients of highest risk. In its 2006 report, the Independent Blue Ribbon Panel on Grant and Contribution Programs highlighted the importance of establishing risk-based reporting requirements and management regimes. This was later reinforced through changes to the Treasury Board Policy on Transfer Payments and Directive on Transfer Payments.

Our audit found examples both of excessive control resulting in inefficiency and of insufficient control impairing program effectiveness. Examples of excessive control include:

  • certain regions require and review proposals and completion certificates for all infrastructure projects, even those valued under $1,000 for recipients with a history of meeting their obligations;
  • requiring that all recipients provide full reporting on post-secondary student spending and graduation results when the information could only reasonably be used for compliance monitoring (i.e. it could not reasonably be aggregated for reporting purposes or performance measurement purposes because cost information and student graduation dates do not align with the AANDC funding agreement cycle or the report due date); and,
  • requiring reporting of detailed outputs of all Child and Family Services Agencies, regardless of whether compliance activities are planned.

Examples of insufficient control include:

  • performing no monitoring or compliance testing of infrastructure operations and maintenance spending, even though some First Nation communities have a history of infrastructure degrading at a rate that far exceeds a normal depreciation curve;
  • performing no risk-based monitoring or audits of post-secondary education spending; and,
  • regions performing little or no compliance monitoring of Child and Family Services Agencies, despite the program having developed a risk-based compliance planner and a draft compliance review checklist.

5.5.3 Delineation of Recipient Reporting Requirements

The audit included examination of the Performance Measurement Strategies (PMS) and Data Collection Instruments (DCIs) of the three programs included in the audit scope. We found that most of the data points in the DCIs did not directly support performance indicators in the PMSs. For two of the three programs, infrastructure and education, we noted that most of the data collection strategies for measuring performance indicators in the program PMSs called for obtaining data from alternate sources, many of which had not yet been defined. Since January 2011, social programs have been updating their performance indicators and DCIs to ensure that data is collected from recipients where practical and used to report on program performance. During the period of the audit, a departmental initiative was underway to streamline PMSs and reduce insignificant indicators and data collection requirements.

2012-13 Community Infrastructure, Education and Social Program Essential vs Non-Essential Spending
Description of Exhibit 4

The bar graph shown in Exhibit 4 presents essential and non-essential spending for the Community Infrastructure, Education and Social Programs for 2012-13. The graph shows that the Social Program had the most essential spending of the three programs, while the Education Program had the most non-essential spending.

Community Infrastructure (all amounts in thousands of Canadian dollars)

Essential Spending: 502,000
Non-Essential Spending: 538,000
Total Spending: 1,040,000

Education Program

Essential Spending: 1,030,000
Non-Essential Spending: 669,000
Total Spending: 1,699,000

Social Program

Essential Spending: 1,350,000
Non-Essential Spending: 344,000
Total Spending: 1,694,000


 

We also found that none of the reporting regimes for the three programs examined employed a risk-based approach: all data was requested of all recipients, regardless of their history of meeting obligations under the agreement or their demonstrated capacity. When programs do not delineate performance measurement reporting from program management and compliance reporting, regions are impeded in applying risk-based reporting regimes. Moreover, the heavy burden of entering, reviewing and approving recipient reports leaves regional staff with little time to perform program management and recipient monitoring tasks. It also negatively impacts the role of FSOs by preoccupying their time with chasing down late reports to avoid funding halts, notwithstanding that most core program spending has been deemed essential (see Exhibit 4), which prevents funding from being halted.

Recommendation:

3. The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should improve alignment of program Data Collection Instruments with program Performance Measurement Strategies for their respective programs to further streamline data being collected from recipients. This should include delineating information required for performance measurement from information being collected for possible compliance activities to allow for the application of risk-based reporting regimes (e.g. for projects, programs and recipients).

5.5.4 Regional Delivery Models and Capacity Levels

Over the past two decades, regions have evolved different organizational structures, competencies and capacity levels. Variances are evident in a number of ways, including: different delivery models (matrix-based teams vs. functional units); differences in role assignments between FSOs, program officers and other specialists (data entry, establishing program budgets, compliance, reviewing program reports, etc.); variances in staffing levels for programs and transfer payment management functions; different occupational group and classification levels of staff (e.g. engineers vs. project managers, financial professionals vs. administrative professionals); existence of compliance units and the focus of compliance work; existence of community development staff to work with communities in crisis; and varied investments in program guides, tools and systems.

These differences make roll-out of national program control frameworks very challenging. Firstly, HQ programs have difficulty distinguishing among the various roles of regional personnel, particularly as it relates to managing reporting, compliance monitoring, reviewing program reports, and supporting recipients with program implementation. Secondly, having staff with different competencies and experience levels performing the same activities can make it difficult to determine what training, guidance and other program implementation aids are necessary to support program roll-out. This is compounded by the reality that certain regions have lower relative capacity in certain programs and functions than others.

On the basis of the findings of the audit, we believe that the Department would benefit from promoting greater consistency in regional organizational structures, classifications, capacity levels and role assignments for certain key transfer payment management functions, including:

  • agreement development and tailoring of recipient cash flow and reporting schedules;
  • determination of initial recipient budget allocations and management of mid-year budget adjustments;
  • data entry for key systems including the Education Information System, the Integrated Capital Management System and First Nations Child and Family Services Information Management System;
  • providing program advice and expertise to recipients;
  • leading compliance reviews;
  • serving as program and functional experts on recipient compliance reviews;
  • working with communities in crisis and communities in default of their funding agreements;
  • performing ratio calculations on the basis of annual audited financial statements;
  • calculating recipient surpluses and deficits for purposes of managing unexpended funding; and,
  • reviewing and approving recipient reinvestment plans where a recipient has incurred a surplus that is eligible for reinvestment.
Recommendation:

4. The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should review key transfer payment management functions to promote greater consistency across regions, including regional organizational structures, classifications, capacity levels and role assignments.

 

 

6. Management Action Plan

Note: The following is an update on the status of management's responses/actions to address the recommendations contained in the Audit of the Management Control Framework for Grants and Contributions 2013-14 as at February 20, 2015.

For each recommendation, the action plan below sets out the planned actions, the responsible manager (title / sector) and the planned implementation and completion date.

Recommendation #1: Since the last Audit Committee meeting, there have been further discussions among senior management on the issue of appropriate governance for the oversight of the department's Grants and Contributions (G&C), the first element of the first recommendation of the report. As a result of these discussions, the Deputy Minister has approved the principle that there should be one committee that oversees the four key elements of G&C management: program design, program management, funding agreement development, and transfer payment management and reporting. The Deputy Minister has also decided to launch an overall review of the governance structure of the department; this review will factor in the issue of effective oversight of G&C based on the approved principle. The review is expected to be completed in March 2015, for implementation in 2015-16.

The G&C governance structure, once implemented, will then focus, as one of its first tasks, on the remaining elements of the first recommendation, with a timeframe of July 2015 for completion.

Recommendation #2:
The G&C governance structure, once implemented, will address this recommendation for completion by July 2015.

Recommendation #3:
Completed.

Recommendation #4:
Work underway for completion by August 2016.
Recommendation 1: The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should review and clarify options within existing departmental processes, governance structures, accountabilities, responsibilities and authorities for developing and approving program control frameworks and establish a single window approach to communicating program control frameworks to regions and recipients.

Actions: Under the leadership of the Associate DM, the Chief Financial Officer, in collaboration with the Senior ADM, Regional Operations, the ADM, Northern Affairs Organization and program ADMs, will:

  1. Confirm the terms of reference of the governance structures and accountabilities for the approval of program control frameworks on an ongoing basis, in a consistent manner across the department, and bring them forward to the Operations Committee.
  2. Clearly define what a program control framework is and its objective / intended purpose, and ensure a common understanding.
  3. Create an inventory of existing program control frameworks so that, going forward, adjustments to existing frameworks or the development of new frameworks occur where requested.
  4. Review other government departments' processes (such as those of Health Canada and CIDA) with respect to the design/redesign, approval and implementation of program control frameworks that could be relevant to AANDC.
  5. Establish a common approach for developing and implementing program control frameworks, including:
    1. Common elements; and
    2. Common look and feel.
  6. The CFO will facilitate a single approach on behalf of program ADMs (e.g. a single web-based location) for disseminating program control frameworks to regions and, ultimately, recipients.

Responsible Manager (Title / Sector): Chief Financial Officer; Senior ADM, Regional Operations; ADM, Northern Affairs Organization; program ADMs; DG, Communications (approval: Operations Committee)

Planned Implementation and Completion Date: Q1, 2015-16
Recommendation 2: The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should establish a function(s) to provide program design and program management expertise to HQ programs that are developing and implementing new and amended program control frameworks. This function could include a blend of existing expertise in program design and regional implementation with expertise in the development of risk-based program management regimes.

Actions: The governance structure and framework will:

  1. Create a cross-sectoral departmental 'community of practice' (as required), with appropriate management leadership, for the purpose of providing program design and program management expertise for the development and implementation of new and amended frameworks.
  2. Implement the newly proposed common approach for developing and implementing program control frameworks (including common elements, common look and feel, process mapping, reporting requirements and approval levels/committees).

Responsible Manager (Title / Sector): Chief Financial Officer; Senior ADM, Regional Operations; ADM, Northern Affairs Organization; program ADMs

Planned Implementation and Completion Date: TBC – decision to be taken by the CFO/DM
Recommendation 3: The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should improve alignment of program Data Collection Instruments with program Performance Measurement Strategies for their respective programs to further streamline data being collected from recipients. This should include delineating information required for performance measurement from information being collected for possible compliance activities to allow for the application of risk-based reporting regimes (e.g. for projects, programs and recipients).

Actions: The governance structure will ensure that the resulting program control frameworks reconcile the program Performance Measurement Strategies (PMS), the data requirements for program management and compliance activities, and the requirements of the existing Annual Report project.

Oversight by the governance structure will include a risk-based approach to reducing the reporting burden, and will be implemented in a phased approach commencing January/February 2015.

Moreover, as part of reducing the reporting burden, no changes will be made to the Reporting Guide (Data Collection Instruments) for existing programs after December 15 of each fiscal year, in alignment with the current practice of making the final National Funding Agreement Models available by December 15.

Responsible Manager (Title / Sector): Chief Financial Officer; Senior ADM, Regional Operations; ADM, Northern Affairs Organization; program ADMs; ADM, PSD

Planned Implementation and Completion Date: Q2, 2015-16
Recommendation 4: The Chief Financial Officer and the Senior Assistant Deputy Minister, Regional Operations, in collaboration with the Assistant Deputy Minister, Northern Affairs Organization and program Assistant Deputy Ministers, should review regional organizational structures, classifications, capacity levels and role assignments to promote greater consistency across regions for key transfer payment management functions.

Actions: Within the framework of departmental decision-making, including the costed organization chart exercise, and in an effort to promote greater consistency across regions for key transfer payment functions, the Senior ADM, Regional Operations, Regional Directors General and the ADM, Northern Affairs Organization, with the support of the Chief Financial Officer, program ADMs and the DG, HRWSB, are undertaking a number of projects to bring a more consistent approach to organizational design and classification in regional offices. These departmental projects and initiatives include:

  1. The Corporate Services Review (Q2 2015-16) (HR);
  2. The Compliance Review Project (Q4 2015-16) (Quebec RDG / ESDPP);
  3. Implementation of the Funding Services Review (Q2 2015-16) (BC RDG / CFO)*;
  4. The First Nation AANDC Annual Report (Q4 2015-16) (CFO/CIO + AES + Ontario RDG);
  5. The Block Review (Q3 2016-17) (on hold, Alberta RDG); and
  6. Case Management (Phase 1: Q3 2015-16 and Phase 2: Q4 2015-16) (Atlantic RDG).

* A funding agreement management project undertaken by the BC region on behalf of RO presents a model approach to the management of funding agreements that can be used to validate current business processes and align resources to the most effective delivery of management services related to that funding.

The timing of each of these projects will be coordinated through the Operations Committee and other governance bodies of the department, and opportunities for organizational design will be identified on an ongoing basis.

Responsible Manager (Title / Sector): Senior ADM, Regional Operations; ADM, Northern Affairs Organization; Chief Financial Officer; program ADMs; DG, Human Resources and Workplace Services Branch

Planned Implementation and Completion Date: August 2014 and ongoing for up to two years
 
 

 

Appendix A: Audit Criteria

To provide an appropriate level of assurance on the audit objectives, the following audit criteria were developed:

Criterion #1: Departmental processes supporting the design of risk-based program control frameworks are clear, consistently applied, and appropriately support the achievement of the department's performance and stewardship objectives, while considering the department's capacity to implement and its risk tolerances.

1.1 The authorities, responsibilities and accountabilities of those involved in the design of program control frameworks (PCFs) are clearly defined and understood (e.g. TPCOE, Program Management, Regional Operations, etc.).

1.2 Departmental processes for the design of PCFs consider the feasibility of coordinating and aligning processes, systems and procedures with those of existing TP programs within the department and, to the extent possible, with those of other departments.

1.3 Departmental processes for the design of PCFs ensure that monitoring, reporting and compliance requirements reflect the risks specific to the program, its terms and conditions, the value of funding in relation to administrative costs, and the risk profile of recipients.

1.4 Departmental processes for the design of PCFs seek input from Department stakeholders (e.g. those responsible for program delivery, and those who currently administer other Departmental programs) to ensure that expected outcomes and appropriate policies and procedures, resources, systems and supporting tools are developed to facilitate consistency of program implementation and the achievement of objectives.

1.5 Departmental processes for the design of PCFs ensure that appropriate funding instruments are chosen to respect and achieve a balance between the principles of accountability, cost/benefit, risk management and treatment of program recipients.

1.6 Departmental processes for the design of PCFs consider the capacity, adequacy and availability of departmental resources (HR and financial) to meet the operational requirements necessary to effectively deliver the program and achieve its objectives.

Criterion #2: Departmental transfer payment governance processes and structures adequately support the approval of risk-based program control frameworks.

2.1 Governance processes, structures and delegated authorities for the approval of risk-based program control frameworks are clearly defined and understood.

2.2 Delegated authorities and approval requirements for key elements of the PCF design and approval processes are clearly understood and consistently applied (e.g. funding approaches, new program activities, performance measurement regimes, program administrative regime, etc.).

2.3 Departmental policy and principles for the management of transfer payment programs, including implied risk tolerances, are clear and consistently understood (e.g. by oversight bodies) when approving the various elements of PCFs.

Criterion #3: Departmental processes for designing recipient reporting requirements are adequate to ensure that appropriate performance and financial information is gathered to support the Department in fulfilling its stewardship and accountability obligations, giving due regard for the importance of reducing reporting burden on recipients.

3.1 Departmental processes and authorities for the design and approval of recipient reporting requirements are clear and appropriate.

3.2 Departmental processes for the design of recipient reporting requirements support the development of performance measurement strategies (PMS) – including performance measures and indicators that are specific, measurable, relevant and time-bound – that align with the Department's Performance Measurement Framework (PMF) and the outcomes within program Terms and Conditions.

3.3 Departmental processes for the design of recipient reporting requirements include consideration of input from regions to ensure that information requirements support risk-based decision making on how programs are managed and funding is allocated.

3.4 Departmental processes for the design of recipient reporting requirements include consideration of input from First Nation recipients to ensure (to the extent possible) that due consideration is given to the impact on First Nations and to the alignment of these requirements with the information needs of First Nations.

3.5 Prior to adding new reporting requirements, and in rationalizing existing reporting requirements during program renovation, Departmental processes ensure that programs consider the feasibility of using information already available to the Department, or available publicly.

3.6 Departmental processes ensure that programs employ a risk-informed approach to establishing the need for recipient reporting information as it relates to program compliance reporting and monitoring (degree/extent of reporting and frequency of reporting).

3.7 Departmental processes ensure that programs periodically re-evaluate recipient reporting requirements, including when changes are made to the PMF and when programs are renovated (i.e. other than minor changes, as prescribed by the Policy on Transfer Payments).

Criterion #4: Processes, tools, systems and learning and development activities are in place and adequate to support the efficient and effective collection and extraction of information from recipient reports.

4.1 Recipient reporting requirements are clearly communicated to recipients and to AANDC program delivery staff (i.e. regional and HQ staff responsible for direct interface with recipients for purposes of program delivery).

4.2 Mechanisms (e.g. processes, tools, templates) are established and implemented to support recipients in complying with their reporting requirements.

4.3 Guidance, advice and training are available to program delivery staff and funding services officers (FSOs) to support the consistency and comparability of information across and within regions.

4.4 Guidance and training, where necessary, are available to recipients to support them in understanding and complying with their reporting obligations.

4.5 Information systems employed to house accountability and performance information (financial and non-financial) support analysis and reporting on expected outcomes. Note: this control objective focuses on the alignment of reporting information with program performance indicators and the PMS, rather than on data completeness or accuracy.

Criterion #5: Program performance information collected from recipients is consolidated, analyzed and used by programs, and regions where applicable, to measure achievement of program performance indicators and contributions to strategic outcomes.

5.1 Program Performance Measurement Strategies and other program documentation clearly demonstrate how the performance information collected in recipient reports aligns to and supports performance measurement at the program-level (i.e. required to measure achievement of performance indicators).

5.2 HQ Programs, with support of regions, consolidate and analyze performance information in support of measuring achievement of performance indicators and contributions to Strategic Outcomes.

5.3 Where program delivery models and program outcomes are distinct at the regional level, regions (with support of HQ programs) consolidate and analyze performance information in support of measuring achievement of performance indicators and contributions to Strategic Outcomes.

Criterion #6: Performance and accountability information collected from recipients is analyzed and used to support risk-based decision-making in establishing and managing the recipient relationship.

6.1 Information from recipient reporting is used by FSOs and program officers to evaluate, on a risk-informed basis, recipient compliance with agreement obligations and minimum program requirements.

6.2 Information from recipient financial reporting is analyzed and used by FSOs and program officers to support the management of unexpended funds (e.g. reinvestment plans, recovery and reallocation).

6.3 Information from recipient reporting is analyzed by FSOs and program officers and is used to inform risk-based decision making with respect to the type of funding agreement to be entered into with the recipient, as well as the cash management provisions of the agreement and the nature and frequency of reporting.

6.4 Information from recipient reporting is analyzed by FSOs and program officers and is used to inform capacity development activities and support extended to recipients.

Criterion #7: Performance and accountability information collected from recipients is consolidated, analyzed and used to inform risk-based decisions on how programs are managed and funding is allocated.

7.1 Information from recipient reporting (performance and accountability – financial and non-financial) and other sources is consolidated and analyzed by regional and HQ managers to support risk-based program decision making.

7.2 Information from recipient reporting and other sources is used by regional and HQ managers to inform, and revise as necessary, program delivery approaches.

7.3 Information from recipient reporting and other sources is used by regional and HQ managers to inform decision-making in regard to the learning and development requirements of recipients.

7.4 Information from recipient reporting and other sources is used by regional and HQ managers to identify and address the capacity needs of both recipients and regional field officers/FSOs.

7.5 Information from recipient reporting and other sources is used by regional and HQ managers to reallocate program funds between recipients (and programs).

 

 

Appendix B: Relevant Policies/Directives

The following authoritative sources (i.e. Policies/Directives) were examined and used as a basis for this audit:

 

 

Appendix C: Analysis of the Relative Capacity of Regions

I. Relative Capacity of Regions Based on G&C Program Dollars Administered

When analyzing regional capacity relative to the amount of G&C program dollars regions administer, we found that regions place emphasis on different types of personnel capacity. For example, Exhibit 5 demonstrates that the British Columbia and Ontario regions invest more resources in supporting infrastructure programming than other regions. Similarly, Exhibit 6 below shows that the Atlantic, Quebec and Alberta regions have invested relatively more in Education programming, while Exhibit 7 below shows that the Atlantic, Quebec, Alberta and British Columbia regions invest in higher levels of capacity for social programs. Some of the variances in investments in social program staff are attributable to provincial delivery models, because some provinces and provincial organizations assume all or part of the responsibility for some social programming (i.e., Ontario and British Columbia).
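
The capacity measure used in Exhibits 5 through 9 is a simple normalization of staffing by program dollars administered. A minimal sketch of the calculation follows; the figures in the example are illustrative, not regional data.

  # Positions per $50M of program spending administered.
  def positions_per_50m(positions: float, spending_dollars: float) -> float:
      return positions / (spending_dollars / 50_000_000)

  # e.g. 12 positions administering $80M of program spending:
  print(round(positions_per_50m(12, 80_000_000), 3))  # -> 7.5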

Number of Positions per $50M Infrastructure Spending per Region
Description of Exhibit 5

Exhibit 5 shows the number of positions allocated to the infrastructure program per $50M of infrastructure spending in each region. The bar graph demonstrates that the British Columbia and Ontario regions invest more resources in supporting infrastructure programming than other regions.

Atlantic Region: 5.889
Quebec: 7.292
Ontario: 9.212
Manitoba: 4.854
Saskatchewan: 7.439
Alberta: 7.447
British Columbia: 12.172


 
Number of Positions per $50M Education Spending per Region
Description of Exhibit 6

Exhibit 6 shows the number of positions allocated to the Education program per $50M of education spending in each region. The bar graph demonstrates that the Atlantic, Quebec and Alberta regions invest more resources in supporting Education programming than other regions.

Atlantic Region: 5.438
Quebec: 3.007
Ontario: 1.415
Manitoba: 1.208
Saskatchewan: 0.691
Alberta: 2.526
British Columbia: 0.981


 
Number of Positions per $50M Social Spending per Region
Description of Exhibit 7

Exhibit 7 shows the number of positions allocated to social programs per $50M of social spending in each region. The bar graph demonstrates that the Atlantic and British Columbia regions invest more resources in supporting social programming than other regions.

Atlantic Region: 4.513
Quebec: 2.651
Ontario: 1.578
Manitoba: 0.908
Saskatchewan: 1.977
Alberta: 3.647
British Columbia: 4.044


 

Exhibit 8 below shows that the Quebec and Ontario regions have relatively fewer Funding Services Officers when considering the amount of G&C funding administered, while Exhibit 9 below shows that the Manitoba and Alberta regions have relatively fewer other transfer payment management personnel (Data Operations, Transfer Payment Management, Compliance Officers, and Community Development Officers) on the same basis. An important note is that we did not examine whether the lower capacity levels in these regions resulted from the region opting to invest in other programs not included in our analysis (e.g. lands management, estates and trusts management and economic development); therefore, we make no inferences about whether, on the whole, a region’s capacity is higher or lower than that of other regions.

Funding Services Positions per $50M Grants and Contributions (2012-13)
Description of Exhibit 8

Exhibit 8 shows the number of Funding Services Officer positions per $50M of Grants and Contributions funding administered in each region. The bar graph shows that the Quebec and Ontario regions have relatively fewer Funding Services Officers than other regions when considering the amount of G&C funding administered.

Atlantic Region: 1.53
Quebec: 1.21
Ontario: 1.23
Manitoba: 1.33
Saskatchewan: 1.53
Alberta: 1.38
British Columbia: 1.6


 
Transfer Payments Positions per $50M Grants and Contributions (2012-13)
Description of Exhibit 9

Exhibit 9 shows the number of transfer payment management positions per $50M of Grants and Contributions funding administered in each region. The bar graph shows that the Manitoba and Alberta regions have relatively fewer other transfer payment management personnel (Data Operations, Transfer Payment Management, Compliance Officers, and Community Development Officers) than other regions when considering the amount of G&C funding administered.

Atlantic Region: 1.16
Quebec: 1.65
Ontario: 1.72
Manitoba: 1.13
Saskatchewan: 1.75
Alberta: 1.04
British Columbia: 2.00


 

II. Relative Capacity of Regions Based on Proportion of Recipients in Default of Funding Agreements

One observation made consistently by regional staff is that more attention from FSOs and program officers is required when working with communities that are in default of their funding agreements with AANDC (as determined under the AANDC Default Prevention and Management Policy).

To better understand capacity from this perspective, we compared the percentage of First Nations and Tribal Councils in default in each region to the average number of staff (see Footnote 5) available to support each community (both in default and not in default). We expected to find that regions with a greater frequency of First Nations in default (Ontario, Manitoba, Quebec and Saskatchewan) would have more resources dedicated to supporting these First Nations. However, we found little positive correlation between the proportion of recipients in default in a region and the region’s personnel capacity to support these First Nations (see Exhibit 10).
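
The comparison can be reproduced from the per-region figures reported under Exhibit 10 below. The sketch that follows is our illustration of the calculation, not the audit team’s tooling; a Pearson coefficient near or below zero supports the finding of little positive correlation.

  # Staff per community vs. percentage of recipients in default,
  # using the per-region figures reported under Exhibit 10.
  import pandas as pd

  regions = pd.DataFrame({
      "staff_per_community": [1.297, 1.583, 0.900, 1.206,
                              1.087, 1.970, 0.761],
      "pct_in_default":      [19, 30, 30, 57, 34, 9, 6],
  }, index=["Atlantic", "Quebec", "Ontario", "Manitoba",
            "Saskatchewan", "Alberta", "British Columbia"])

  print(regions["staff_per_community"].corr(regions["pct_in_default"]))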

Number of Staff/FN&TC vs Percentage of FN&TC under Intervention
Description of Exhibit 10

Exhibit 10 compares the percentage of First Nations and Tribal Councils in default in each region to the average number of staff available to support each community (both in default and not in default), to verify whether regions with a higher proportion of First Nations in default have more staff. The graph demonstrates that there is little positive correlation between the proportion of recipients in default in a region and the region’s personnel capacity to support these First Nations.

Atlantic:

  • Number of Staff for First Nations: 1.297
  • Percent of First Nation under intervention: 19%

Quebec:

  • Number of Staff for First Nations: 1.583
  • Percent of First Nation under intervention: 30%

Ontario:

  • Number of Staff for First Nations: 0.9
  • Percent of First Nation under intervention: 30%

Manitoba:

  • Number of Staff for First Nations: 1.206
  • Percent of First Nation under intervention: 57%

Saskatchewan:

  • Number of Staff for First Nations: 1.087
  • Percent of First Nation under intervention: 34%

Alberta:

  • Number of Staff for First Nations: 1.970
  • Percent of First Nation under intervention: 9%

British Columbia:

  • Number of Staff for First Nations: 0.761
  • Percent of First Nation under intervention: 6%

 

III. Relative Capacity of Regions Considering Number of First Nations and Tribal Councils Served and Regional Staffing Levels

Regions with large numbers of smaller First Nations (i.e. small on-reserve community populations) sometimes observe that they have a disproportionately high number of agreements and reports to administer and review. We believe that this argument has merit for programs that are funded directly to all or most communities (e.g. education and infrastructure operation and maintenance), but is less applicable to programs where only a portion of communities are funded, such as FNCFS programming and major capital infrastructure projects.

Number of Education Staff per FN & TC
Description of Exhibit 11

Exhibit 11 shows the number of education staff per First Nation in each region. The bar graph demonstrates that some regions (Alberta, Atlantic and Quebec) have invested considerably more salary dollars in education program staff per First Nation than others.

Atlantic: 0.189
Quebec: 0.236
Ontario: 0.071
Manitoba: 0.110
Saskatchewan: 0.056
Alberta: 0.274
British Columbia: 0.024

 

Exhibit 11 shows that some regions (Alberta, Atlantic and Quebec) have invested considerably more salary dollars in education program staff per First Nation than others. An interesting note is that the regions with the lowest number of First Nations served and the lowest amount of G&C dollars administered have the highest relative capacity in their education program units, suggesting that regions maintain a base capacity needed to administer education programs.

 

 

Appendix D: Breakdown of Scoped Programs Terminology

The following three programs were examined and scoped into the audit. A breakdown of the sub-programs under each is as follows:

  1. Community Infrastructure Program (Capital Facilities and Maintenance Program)
    • Other Community Infrastructure and Activities (Infrastructure Assets and Facilities)
    • Education Facilities
    • Housing
    • Renewable Energy and Energy Efficiency
    • Water and Wastewater Infrastructure
    • Emergency Management Assistance (infrastructure activities only)
  2. Post-Secondary Education Programs
    • University and College Entrance Preparation Program
    • Post-Secondary Student Support Program
    • Post-Secondary Partnerships Program
  3. First Nation Child and Family Services Program (a component of the Social Development Program)

Additional programs were considered for purposes of analyzing regional delivery structures and recipient reporting as follows:

  1. Elementary and Secondary Education
  2. Social Development Program
    • Assisted Living Program
    • Family Violence Prevention Program
    • Income Assistance Program
    • National Child Benefit Re-Investment Program
 

 

Appendix E: Analysis of Regional Role Assignments

The table set out below describes where program and agreement management personnel are located within each of AANDC’s southern region organizational structures. The purpose of the table is to demonstrate that regional delivery structures vary considerably across regions.

Atlantic Region
  • Compliance Officers: Grouped with TP Staff within Funding Services
  • Data Entry Clerks: None (done by Program Officers)
  • Program Officers: Program Teams under A/RDG and Director Programs & Partnerships
  • Transfer Payment Management Staff (TP Staff): Agreement Services Team within Funding Services
  • Funding Services Officers (FSOs): FSO Team and Community Services Team within Funding Services
  • Community Development Officers: None

Quebec Region
  • Compliance Officers: Dedicated Compliance Team within Corporate Services
  • Data Entry Clerks: None (done by Program Officers/Assistants)
  • Program Officers: Program Teams under Director Funding Services and Director Programs & Partnerships
  • Transfer Payment Management Staff: Transfer Payment Team within Funding Services
  • Funding Services Officers: 1 FSO Team within Funding Services
  • Community Development Officers: None

Ontario Region
  • Compliance Officers: Grouped with Program Officers within Programs & Partnerships
  • Data Entry Clerks: Grouped with Program Officers and Funding Services within Funding Services and Programs & Partnerships
  • Program Officers: Program Teams under Director Engineering, Director Funding Services and Director Programs
  • Transfer Payment Management Staff: Transfer Payment Team within Funding Services
  • Funding Services Officers: 2 FSO Teams within Funding Services
  • Community Development Officers: None

Manitoba Region
  • Compliance Officers: Grouped with FSOs in 3 Zones within Funding Services
  • Data Entry Clerks: Dedicated Data Unit within Funding Services
  • Program Officers: Program Teams under Director Infrastructure and Housing and Director Programs & Partnerships
  • Transfer Payment Management Staff: Transfer Payment Team within Funding Services
  • Funding Services Officers: 3 FSO Teams within Funding Services
  • Community Development Officers: None

Saskatchewan Region
  • Compliance Officers: Grouped with Program Officers within Funding Services
  • Data Entry Clerks: Grouped with Program Officers within Funding Services and Field Services and Programs
  • Program Officers: Program Teams under Director Funding Services and Programs
  • Transfer Payment Management Staff: Grouped with FSOs in 3 Field Operations Zones
  • Funding Services Officers: Grouped with Transfer Payment Officers in 3 Field Operations Zones
  • Community Development Officers: None

Alberta Region
  • Compliance Officers: None (done by Program Officers)
  • Data Entry Clerks: Grouped with Program Officers under Directors of Treaty Areas
  • Program Officers: Program Teams under 3 Directors of Treaty Areas (i.e. one program cluster per Director)
  • Transfer Payment Management Staff: Band Audit and Transfer Payment Teams within Corporate & Funding Services
  • Funding Services Officers: 3 FSO Teams under Directors of Treaty Areas
  • Community Development Officers: None

British Columbia Region
  • Compliance Officers: Grouped with Program Officers within Programs & Partnerships
  • Data Entry Clerks: Dedicated Data Unit within Funding Services
  • Program Officers: Program Teams under Director Engineering and Director Programs and Partnerships
  • Transfer Payment Management Staff: Resource Services Team within Funding Services
  • Funding Services Officers: 2 FSO Teams within Funding Services
  • Community Development Officers: Dedicated Team under Director Community Development
 
 
 
 
