Evaluation of the Multiculturalism Program

2. Methodology

The evaluation followed the scope and methodology set out in an evaluation plan developed during a planning phase undertaken between September 2010 and March 2011, in consultation with representatives from all areas of the Multiculturalism Program.

2.1 Evaluation issues and questions

The evaluation of the Multiculturalism Program was designed to address three broad themes: relevance, design and delivery, and performance. In keeping with the requirements of the Directive on the Evaluation Function (Treasury Board of Canada Secretariat, 2009), program relevance was assessed in terms of: (1) continued need; (2) consistency with respect to federal roles and responsibilities; and (3) alignment with government and departmental objectives and priorities. Program performance was assessed by examining program results in terms of: (4) effectiveness; and (5) efficiency and economy (Table 2-1). See Appendix B for the complete evaluation matrix, which includes specific indicators and methodologies for each evaluation question.

Table 2-1: Summary of evaluation issues and questions

Evaluation Issue | Evaluation Question | Section Reference #Footnote 10
Relevance | Is there a need for Multiculturalism programming in Canada? | 3.1.1
Relevance | Is Multiculturalism in Canada the role and responsibility of the federal government? | 3.1.2
Relevance | Is the Multiculturalism Program consistent with departmental and government-wide priorities? | 3.1.3
Design and Delivery | How has the delivery of the Multiculturalism Program been adapted to meet the new program objectives? | 3.2.1
Design and Delivery | Has an appropriate and effective governance structure for the Multiculturalism Program been put in place? | 3.2.2
Design and Delivery | Design and delivery issues.Footnote 11 | 3.2.3
Design and Delivery | Has a performance measurement strategy that adequately supports the management and evaluation of the Multiculturalism Program been implemented? | 3.2.4
Performance | Have program participants increased awareness of core democratic values, Canadian history, institutions, and ethnocultural and/or religious diversity? | 3.3.1
Performance | Have project and event participants increased their civic memory and pride in Canada and respect for its core democratic values? | 3.3.1
Performance | Have project participants increased their intercultural/interfaith understanding? | 3.3.1
Performance | Are federal and targeted institutions more aware of how to meet the needs of a diverse society? | 3.3.2
Performance | Are federal and targeted institutions’ programs, policies, and services responsive to the needs of a diverse society? | 3.3.2 / 3.3.3
Performance | Have international best practices on approaches to diversity been shared with relevant stakeholders and integrated into the design and management of the Multiculturalism Program? | 3.3.3
Performance | Is the delivery of the Multiculturalism Program efficient? Are there alternative, more cost-effective approaches to achieve the intended results? | 3.4.1

2.2 Evaluation scope

The scope of the evaluation included activities undertaken, and outputs produced, between fiscal years 2008-09 and 2010-11. Consequently, the grants and contributions (Gs&Cs) projects reviewed for the evaluation included projects funded under the old program objectives as well as projects funded under the new objectives.

2.3 Data collection methods

The evaluation used multiple lines of evidence and complementary research methods to strengthen the information and data collected. Following the completion of data collection, each line of evidence was analyzed separately using an evidence matrix organized by evaluation question and indicator. A two-day brainstorming session was then held with project team members to examine the findings from each line of evidence and to develop overall findings and conclusions. Each of the methods is described in more detail below.
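As a purely illustrative sketch (not a tool used by the evaluation), the evidence matrix described above can be thought of as a structure keyed by evaluation question and indicator, with one cell per line of evidence. The question, indicator, and finding text below are hypothetical placeholders.

```python
# Illustrative sketch only: an evidence matrix keyed by evaluation question
# and indicator, with one entry per line of evidence. Labels are hypothetical.
from collections import defaultdict

# matrix[(question, indicator)][line_of_evidence] -> list of summarized findings
evidence_matrix = defaultdict(lambda: defaultdict(list))

def record_finding(question, indicator, line_of_evidence, finding):
    """Store a summarized finding under its question, indicator and line of evidence."""
    evidence_matrix[(question, indicator)][line_of_evidence].append(finding)

record_finding("3.1.1 Continued need", "Stakeholder views on need",
               "Interviews", "Most interviewees saw a continued need for programming.")
record_finding("3.1.1 Continued need", "Stakeholder views on need",
               "Document review", "Background documents point to growing diversity.")

# Triangulation step: pull together all findings for one question/indicator cell.
for line, findings in evidence_matrix[("3.1.1 Continued need", "Stakeholder views on need")].items():
    print(line, "->", findings)
```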

2.3.1 Interviews

A total of 47 interviews were completed for the evaluation (Table 2-2). Interviews were undertaken with six key stakeholder groups and were designed to respond to all of the evaluation questions in the evaluation matrix, covering program relevance, design and delivery, and performance. The questions for academics focussed primarily on relevance and performance, while respondents from other categories were asked questions covering the full spectrum of evaluation issues.

Table 2-2: Summary of interviews completed

Interview group | Number of interviews
CIC Senior Management (Regions and National Headquarters) | 9
CIC Managers/Representatives of the Multiculturalism Program | 11
CIC Regional Multiculturalism Program Officers | 10
Federal Institutions | 4
Provincial/territorial representatives | 5
Academics/experts | 8
Total | 47

Interviews were conducted both in person and by telephone. Different interview guides were developed for each stakeholder group, and the interview questions were aligned with the evaluation questions identified in the matrix (see Appendix C for the interview guides). Interviewees were provided with a copy of the relevant guide in advance of their interview. The results of the interviews were summarized in an interview notes template and were then coded and analyzed to determine key themes. Where interview information is used in the report, it is presented using the scale shown in Table 2-3. Note that in some cases (i.e., where the number of interviewees was too small or where the question yielded more descriptive information), the responses were not coded and a summary approach to analyzing the information was used.

Table 2-3: Scale for the presentation of interview results

All: Findings reflect the views and opinions of 100% of the interviewees.
Majority/Most: Findings reflect the views and opinions of at least 75% but less than 100% of interviewees.
Many: Findings reflect the views and opinions of at least 50% but less than 75% of interviewees.
Some: Findings reflect the views and opinions of at least 25% but less than 50% of interviewees.
A few: Findings reflect the views and opinions of at least two respondents but less than 25% of interviewees.
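To make the thresholds in Table 2-3 concrete, the following is a minimal sketch of how a coded interview theme could be mapped to the report's descriptors. It assumes a simple count of interviewees expressing a given view and is not part of the evaluation's actual coding procedure.

```python
# Illustrative sketch only: applying the reporting scale in Table 2-3 to a
# coded interview theme. Thresholds mirror the table; names are hypothetical.
def describe_support(n_agreeing: int, n_interviewees: int) -> str:
    """Map the share of interviewees expressing a view to the report's descriptor."""
    share = n_agreeing / n_interviewees
    if share == 1.0:
        return "All"
    if share >= 0.75:
        return "Majority/Most"
    if share >= 0.50:
        return "Many"
    if share >= 0.25:
        return "Some"
    if n_agreeing >= 2:
        return "A few"
    return "One or none"  # below the scale; would not be reported with a descriptor

# Example: 6 of 11 program managers raised a given theme -> "Many"
print(describe_support(6, 11))
```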

2.3.2 Project and event feedback forms

In conjunction with program representatives, the evaluation team developed feedback forms to be completed by project participants, event participants and event funding recipients (see Appendix D for the feedback forms). The feedback forms included questions to gather participants' opinions on how they were affected by the project or event, in line with the expected program outcomes.Footnote 12 The feedback forms are intended to serve as an on-going performance monitoring tool for the program.

At the time of analysis, 39 responses had been received from event participants, seven from event funding recipients, and 75 from project participants. By examining the responses against the project and event descriptions, it was possible to determine which project or event each response related to. Only two projects yielded a sufficient number of responses to include in the analysis: Multiculturalism and Media (41 responses) and the Citizenship Challenge (19 responses).

2.3.3 Project evaluations

As a requirement of project funding, recipients must complete a project evaluation at the end of the project. There were 32 project evaluations available for review, all from projects funded under the old (pre-2010) program objectives. All 32 evaluations were reviewed to determine the types of information provided and whether this information could be used to examine the expected outcomes of the program. A sample of five evaluations was selected for further review in order to examine information related to project outcomes.

2.3.4 Telephone survey with project funding recipients and non-recipients

Nine funding recipients and nine applicants who did not receive funding (i.e., non-recipients) were surveyed by telephone to gather views on the need for the program and the impacts of CIC funding, as well as the impacts of the projects themselves (see Appendix E for the telephone survey questionnaire). The survey population was drawn from a total population of 45 non-funded applicants and 77 projects that were active at the time (i.e., the project file was open). Both funded and non-funded applicants were selected to ensure regional representation and a mix of funding amounts (i.e., high, medium and low). Note that the telephone survey was not meant to be representative of the entire population of projects, and the number and type of respondents depended on the availability and willingness of organizations to participate.

Table 2-4: Number of telephone surveys completed, by region

Region | # of Active Projects | # of Recipients Surveyed | # of Non-Recipients (2009-10, 2010-11) | # of Non-Recipients Surveyed
British Columbia / Yukon | 7 | 1 | 7 | 2
Prairies / Northwest Territories | 17 | 1 | 14 | 2
Ontario | 22 | 4 | 10 | 1
Quebec | 11 | 1 | 7 | 1
Atlantic | 3 | 0 | 4 | 2
National Headquarters | 17 | 2 | 1 | 1
Total | 77 | 9 | 43 | 9
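The sketch below illustrates, under assumed record fields and funding-band cut-offs (hypothetical, not drawn from program files), the kind of purposive selection described above: a small number of applicants per region, spread across low, medium and high funding amounts.

```python
# Illustrative sketch only: purposive selection of telephone-survey candidates
# so that each region is represented and funding amounts are mixed.
# Record fields and band cut-offs are hypothetical, not taken from program files.
import random

def funding_band(amount: float) -> str:
    """Bucket a funding amount into a hypothetical low/medium/high band."""
    if amount < 50_000:
        return "low"
    if amount < 250_000:
        return "medium"
    return "high"

def select_candidates(applicants, per_region=2, seed=1):
    """Pick up to `per_region` applicants per region, preferring distinct funding bands."""
    rng = random.Random(seed)
    selected = []
    for region in sorted({a["region"] for a in applicants}):
        pool = [a for a in applicants if a["region"] == region]
        rng.shuffle(pool)
        picks, bands_used = [], set()
        # First pass: take at most one applicant per funding band.
        for a in pool:
            band = funding_band(a["amount"])
            if band not in bands_used and len(picks) < per_region:
                picks.append(a)
                bands_used.add(band)
        # Second pass: top up from the remaining pool if needed.
        for a in pool:
            if len(picks) >= per_region:
                break
            if a not in picks:
                picks.append(a)
        selected.extend(picks)
    return selected

sample = select_candidates([
    {"org": "A", "region": "Ontario", "amount": 30_000},
    {"org": "B", "region": "Ontario", "amount": 300_000},
    {"org": "C", "region": "Atlantic", "amount": 80_000},
])
print([(a["org"], a["region"]) for a in sample])
```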

2.3.5 Administrative data review

Many different types of administrative data were reviewed to obtain information on the operations of the program. Information from the Grants and Contributions Information Management System (GCIMS) and program documents were reviewed to obtain output information such as: number of funded projects and events; number of MCN and FPTORMI meetings held; and the number of submissions to the various public education and promotion initiatives.

Financial information gathered from the program was also analyzed to establish the overall costs for the program and examine how the funding was allocated by region.

Additionally, a typology of funded projects was developed using information from the project Request for Approval Forms (RAF). The projects included in the typology were those that were 'active' at the time of the evaluation (i.e., on-going, or closed or to be closed in fiscal year 2010-11). Note that the projects covered six fiscal years (from 2006-07 to 2011-12). The active projects were separated into two groups: group 1 projects were funded under the continuous intake process, which was in place under the old program objectives, and group 2 projects were funded under the new CFP process. This typology was then used to examine the differences between the two groups (e.g., the types of activities funded and the target groups).
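As a simple illustration of this grouping (the field name `intake_process` and its values are hypothetical placeholders, not actual RAF fields), the split between group 1 and group 2 amounts to a classification on the intake process under which each active project was funded:

```python
# Illustrative sketch only: splitting 'active' projects into the two typology
# groups described above, based on a hypothetical intake-process field.
def typology_group(project: dict) -> str:
    """Assign a project to group 1 (continuous intake, old objectives)
    or group 2 (call-for-proposals intake, new objectives)."""
    return "group 1" if project["intake_process"] == "continuous" else "group 2"

projects = [
    {"name": "Project X", "intake_process": "continuous", "activity": "workshops"},
    {"name": "Project Y", "intake_process": "CFP", "activity": "youth forum"},
]

# Compare the two groups, e.g. by the types of activities funded.
by_group: dict[str, list[str]] = {}
for p in projects:
    by_group.setdefault(typology_group(p), []).append(p["activity"])
print(by_group)
```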

2.3.6 Multiculturalism Champions Network questionnaire

The MCN meets approximately twice per year. The October 2011 meeting provided an opportunity to gather input for the evaluation from the Multiculturalism Champions. More specifically, the evaluation gathered information on the impact of the MCN on federal institutions and on the usefulness of the Network. To this end, a questionnaire was designed and administered at the October MCN meeting (see Appendix F for the questionnaire). The questionnaire was completed by 34 MCN meeting participants, representing approximately one-third of the Network membership.

2.3.7 Literature review

A literature review was conducted to examine the evaluation questions related to program need and the role of the federal government. The review was conducted by an external academic expert. The research included academic and technical journals, publicly available information from various governments (Canadian and foreign), conference proceedings, and articles by think-tanks and/or non-governmental organizations. The review also considered alternative approaches to multiculturalism programming.

2.3.8 Document review

A review of over 40 relevant program documents was conducted to provide background and context to inform an assessment of the relevance, and design and delivery of the Multiculturalism Program. Documents such as legislation (e.g., The Multiculturalism Act, provincial/territorial legislation), Speeches from the Throne and budget speeches, and policy and strategic documents were reviewed for contextual background and for information on CIC and GoC priorities. Additionally, third party reports (e.g., Management Review, Audit report), the call for proposals, funding guidelines, contribution agreements, and promotional materials for the public education and promotion initiatives were reviewed to provide an understanding of the program operations (see Appendix G for a list of documents reviewed for the evaluation).

2.4 Limitations and considerations

There are four key limitations that should be considered when reviewing the evaluation results. These limitations, their possible impacts on the analysis, and mitigation steps are discussed below.

  1. There are inherent challenges associated with measuring the outcomes of social programs such as the Multiculturalism Program, particularly given the complex nature of the subject matter and the fact that multiculturalism can be defined in many different ways. Attribution of program outcomes is also a challenge, as other factors may also have influenced the observed impacts.
  2. Little on-going performance measurement is in place to gather information on project outcomes. Although all funding recipients are required to submit an evaluation at the end of their project, they report on the achievement of project objectives, which cannot easily be linked to either the Multiculturalism Program objectives or outcomes.
  3. Information gathered with respect to outcomes cannot be considered representative of all program participants: responses on the feedback forms were limited to two projects, with a small number of responses for each; the MCN questionnaire gathered responses from about one-third of all Multiculturalism Champions (or their delegates); and the telephone survey is not representative of all funded recipients or non-funded applicants. Because of these limitations in representativeness, it was not possible to use the outcome data to draw conclusions about the outcomes of the program.
  4. The evaluation was conducted one year after the implementation of the new program objectives; therefore, only a few projects funded under the new objectives had been completed.

To address these challenges, the evaluation used multiple data collection methods to gather information on program outcomes.
