Evaluation of the Strategic Plan for Settlement and Language Training under the Canada-Ontario Immigration Agreement (COIA)

2. Evaluation methodology

This section presents the evaluation methodology, including its scope, lines of evidence, and strengths and limitations. For a more detailed discussion of the methodology, including each of the lines of evidence, see the Methodology Appendix.

2.1. Evaluation scope

Timeframe: The evaluation covered the period from the inception of the Strategic Plan in fiscal year 2006/07 through 2009/10. Since fiscal year 2005/06 was largely devoted to the development of the Strategic Plan and was not affected by its directions, information for 2005/06 is included only as a baseline for assessing the value added of the Strategic Plan.

Funding: While the stated purpose of the Strategic Plan was to guide the new investment under the COIA, the Strategic Plan in fact guided all ISAP, LINC, Host and ELT projects delivered in Ontario between 2006/07 and 2009/10. As it is not possible to separate the projects supported by base funding from those supported by the new funds, it was assumed that, without the COIA, resources would have remained constant at the 2005/06 level. Information on resources for 2005/06 is therefore presented only as a basis for comparison.

CIC Programs: CIC's main programs (ISAP, LINC, ELT and Host) were included in the scope. Excluded were programs delivered in the province that were supported by other national initiatives (e.g., Canada’s Action Plan Against Racism) and national projects (e.g., Canadian Orientation Abroad).

MCI Programs: Discussion of the MCI programs is only partial and focuses mostly on the programs supported by COIA funding (MIIOs), co-funded projects (Ontario BTP/ELT) and the analysis of linkages between CIC and MCI programs (e.g., language). It does not extend to a detailed examination of any MCI program. Examples of MCI programs are provided as they relate to specific strategies under the Strategic Plan, both to present a more comprehensive picture of the programs delivered to support newcomers and to provide background information for the analysis of harmonization efforts between the two levels of government.

Outcome Levels: The logic model for the Strategic Plan (provided in the Methodology Appendix) was used to define the evaluation questions. The evaluation focused on an assessment of the degree to which immediate outcomes were achieved through the implementation of the four Strategies, rather than an assessment of outcomes of the programs as they relate to clients. [Note 11]

2.2. Lines of evidence

The evaluation relied on multiple lines of evidence to thoroughly address the questions and indicators as specified in the evaluation matrix (see the Methodology Appendix). Lines of evidence for this evaluation are briefly described below, while a more detailed description of the methodology for each of the lines of evidence, including a description of processes and challenges, is provided in the Methodology Appendix.

  • Preliminary interviews (n=4): CIC, in collaboration with MCI, developed a methodology to assess the Strategic Plan (including the evaluation matrix) prior to engaging consultants to conduct the evaluation. Once engaged, the consultants conducted four interviews with CIC and MCI representatives to confirm and clarify the scope of the evaluation and identify potential opportunities and challenges.
  • Document review: Federal and provincial policy documents, CIC and MCI program documentation, documentation related to the work of the Settlement and Language Training Steering Committee and its two Working Groups, recent evaluation reports on settlement and language training programs, and selected publicly-available research and literature were reviewed (see the Methodology Appendix). The document review informed all evaluation questions.
  • Key informant interviews (n=34): In-depth interviews were conducted with CIC representatives at National Headquarters (NHQ) and in the regional and local CIC offices in Ontario, with MCI senior management and program staff, with municipal representatives and with representatives of SPOs. The interviews informed all evaluation questions.
  • Case studies (n=10): Case studies of specific COIA-funded projects or services delivered by SPOs were conducted. Cases were selected to ensure that Strategies 1 through 3 were covered (Strategy 4 was covered through specific questions in each of the cases), that both large urban and smaller centres were included, that examples of projects focused on specific target groups were represented, and that examples of new approaches were covered, including projects co-funded by MCI and CIC, online initiatives, itinerant services and partnership initiatives. Each case study included a review of project documents and interviews with the organization's management and staff; half also included focus groups with clients and/or other stakeholders. Case studies were primarily used to assess performance and design and delivery.
  • Survey: An online survey of Service Provider Organizations (SPOs) that received funding from CIC was conducted. One survey was sent to each organization to prevent multiple responses; most were addressed to executive directors or, for larger organizations, to the managers in charge of the programs. Of the 350 surveys sent, 143 completed responses were received (for a 43% rate of completion). While the survey did not include organizations funded exclusively by MCI, half of the respondents (50%) received funding from both CIC and MCI. The survey was organized around questions relating to the four Strategies and to design and delivery. Among the respondents, 84 identified themselves as settlement service providers and answered the questions on Strategy 1, and 76 identified themselves as language service providers and answered the questions on Strategy 2. The remaining respondents were mainstream organizations (defined as organizations that provide services to a broad range of clients, including newcomers, e.g., hospitals) (60) and organizations that provide indirect services (e.g., research, capacity building) (55). [Note 12] All respondents replied to the questions on Strategies 3 and 4 as well as on design and delivery. Survey data were used to assess performance, and design and delivery.
  • Administrative data analysis:
    • Project profile: Each project was identified through a unique record and defined in terms of its objectives, the activities delivered, the delivery agent (i.e., the SPO) and its contribution agreement(s). Administrative data from various sources were combined into a single file for analysis. This exercise resulted in a database of 982 projects delivered in Ontario between 2005/06 and 2009/10, representing 99.5% of the Grants and Contributions (G&C) expenditures (Vote 5) in the Region over that period. A project may deliver more than one activity and may be a multi-year investment (mostly under Strategies 1 and 2) or a one-year investment (mostly under Strategies 3 and 4). Because many projects deliver more than one activity (e.g., individual and group activities under ISAP), the main unit of analysis in this report is the activity rather than the project. This information was used to assess performance: a comparison of the number of activities delivered in 2005/06 versus 2009/10 served as an approximation of the value added of the new funding. The metric is based on volumes only, as budgets and expenditures cannot be allocated to individual activities.
    • Client information: Three databases were used to compile client information where available. Information on the ISAP, Host and LINC programs was obtained from CIC’s Immigration-Contribution Accountability Measurement System (iCAMS) [Note 13] database, while data on specific ISAP activities, such as Settlement Workers in Schools, Library Settlement Partnership and Settlement Workers in LINC, were obtained from the Online Tracking Information System (OTIS). Data on clients of the Enhanced Language Training program were available only for 2009/10, through the History of Assessments, Referrals and Training (HART) system. Client information was used to assess performance.
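The consolidation of administrative records into a single project profile, described above, can be sketched as follows. This is an illustrative sketch only: the record structure and field names (project_id, strategy, activities) are assumptions for the example, not the actual CIC schema.

```python
# Illustrative sketch: combine per-source administrative extracts into one
# project profile keyed on a unique project record. Field names are
# hypothetical, not the actual CIC database schema.

def build_project_profile(extracts):
    """Merge lists of records from several sources into one dict keyed by project_id."""
    profile = {}
    for source in extracts:
        for record in source:
            entry = profile.setdefault(record["project_id"], {"activities": set()})
            if "strategy" in record:
                # Keep the first strategy code seen for the project.
                entry.setdefault("strategy", record["strategy"])
            entry["activities"].update(record.get("activities", []))
    return profile

def count_activities(profile):
    """Count distinct activities, the report's main unit of analysis."""
    return sum(len(p["activities"]) for p in profile.values())

# Two invented extracts; the same project (P1) appears in both sources.
extracts = [
    [{"project_id": "P1", "strategy": 1, "activities": ["individual", "group"]}],
    [{"project_id": "P1", "activities": ["group"]},
     {"project_id": "P2", "strategy": 2, "activities": ["class"]}],
]
profile = build_project_profile(extracts)
```

Counting activities rather than projects mirrors the report's choice of the activity as the main unit of analysis, since a single project may deliver several activities.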

2.3. Strengths and limitations

2.3.1. Strengths

  • Multiple data sources: Multiple data sources were used to inform various questions, thereby ensuring triangulation of data, and increasing confidence in the findings.
  • Baseline: One of the main strengths of this evaluation is the existence of a baseline. Fiscal year 2005/06 was used to provide a basis of comparison for progress made under the Strategic Plan. Throughout the report, information on the number of projects and expenditures is provided for both the baseline year and the most recent year under review. Therefore, the value added of the additional funding, in terms of the number of activities delivered, can be assessed by comparing the two years.
  • Consultations and feedback: Feedback from CIC and MCI stakeholders was sought at various stages of the evaluation, including planning, the design of all data collection instruments, preliminary analysis and the drafting of the report. Program staff in the Regional Office were also consulted during the development of the project profile to ensure that the information was as complete and accurate as possible. The feedback received throughout the process was used to ensure that the evaluation responded to stakeholders’ needs and that all relevant information was taken into account.
  • Instrument design: Recognizing that interviewees and survey respondents, particularly municipal and SPO representatives, might lack awareness of the Strategic Plan, the instruments were designed to provide participants with the context needed to make their input relevant. The instruments were tailored to the various groups of participants and included several appendices of contextual and background information to help respondents provide relevant and reliable information. The data collection instruments are included in the Methodology Appendix.
  • Survey responses: The response rate and the overlap between characteristics of the population of SPOs and survey respondents give a fairly high level of confidence in the survey findings (see the Methodology Appendix for the comparison of the SPO population with characteristics of respondents). The response rate of 43% is sufficient for the purposes of this evaluation as the profile of the respondents is similar to that of the SPO population in Ontario.
  • Newcomer demographics: To provide information on the basic characteristics of newcomers coming to Ontario, CIC Facts & Figures publications for Ontario were used. While the numbers are based on the intended province of destination rather than the actual province of residence at the time of the evaluation, triangulating this data with client data suggests that around 92% of current clients had initially identified Ontario as their province of destination. As a result, the information gives a reasonably accurate demographic profile of newcomers in Ontario.
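The baseline comparison described above reduces to a per-activity delta between the two fiscal years. A minimal sketch, in which the activity labels and counts are invented purely for illustration:

```python
# Minimal sketch of the baseline comparison: the value added of the new
# funding is approximated as the change in activity counts between the
# baseline year (2005/06) and the most recent year (2009/10).
# Activity labels and counts below are invented for illustration.

def value_added(baseline_counts, recent_counts):
    """Return the per-activity change, recent minus baseline (volumes only)."""
    activities = set(baseline_counts) | set(recent_counts)
    return {a: recent_counts.get(a, 0) - baseline_counts.get(a, 0)
            for a in activities}

counts_2005_06 = {"ISAP individual": 40, "LINC classes": 25}
counts_2009_10 = {"ISAP individual": 65, "LINC classes": 30, "ELT": 12}
delta = value_added(counts_2005_06, counts_2009_10)
```

Because budgets cannot be allocated to activities, the delta is expressed in volumes only, matching the approximation used in the report.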

2.3.2. Limitations

Below are the challenges encountered during the conduct of the evaluation that affected the interpretation of the data:

  • Project information: The lack of a central, complete and comprehensive database providing detailed project information by strategy presented a challenge. To develop such a database, data from various CIC databases and administrative documents were combined, and projects were then coded by strategy using a predetermined coding scheme. Combining data from the various administrative databases proved difficult and significantly affected the timeline of the evaluation. While triangulation was used to validate information where possible (see the Methodology Appendix for details on the process employed), some under- or over-estimation of the frequency of delivery of specific activities is possible.
  • Client information: While it was possible, to a large extent, to determine the number of clients by program, determining the number of clients by strategy and the number of unique clients accessing all relevant services was a challenge. This is due to the lack of linkages between the various databases used to collect the information (e.g., ISAP A data are collected in iCAMS, while data for the SWIS component of ISAP are collected in OTIS). Therefore, while the number of clients served by each individual program is largely known, the total number of clients by Strategy is not. In addition, while all SPOs report the number of clients for the LINC program, not all SPOs report for the ISAP A and Host programs, so the number of clients benefiting from those programs is potentially underestimated.
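The client-count limitation above can be made concrete: without a shared client identifier across systems, per-program totals can be summed, but unique clients cannot be deduplicated. A minimal sketch, in which the client identifiers and program labels are invented for illustration:

```python
# Illustrative sketch of the client-count limitation: per-program totals
# are known, but without a shared identifier across systems, summing them
# overstates the number of unique clients. Identifiers are invented.

icams_clients = {"c1", "c2", "c3"}   # e.g., clients recorded in one system
otis_clients = {"c2", "c3", "c4"}    # e.g., clients recorded in another

# Summing per-program totals double-counts clients served by both programs.
per_program_total = len(icams_clients) + len(otis_clients)

# Deduplication is only possible if the records can be linked.
unique_if_linked = len(icams_clients | otis_clients)
```

The gap between the two figures (here, 6 versus 4) is exactly what cannot be measured when the databases are not linked, which is why the evaluation reports clients by program rather than unique clients by Strategy.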

The limitations stated above were taken into consideration during the design of the data collection tools and during the data collection and analysis phases of the project. The under- or over-representation of the frequency of some project activities, and gaps in client information, had some impact on assessing the full extent of the changes to the programs and their reach, and consequently the value added of the Strategic Plan. Counts of specific services therefore reflect the minimum change recorded rather than a complete picture of the change. This affects the measured magnitude of the trends, not their general direction. As a result, these limitations do not have a significant impact on the overall findings, conclusions and recommendations of this report.


Footnotes

  • [11] Outcome evaluations of the ISAP, Host and LINC programs were recently conducted.
  • [12] Some of the settlement service providers may also provide language services.
  • [13] LINC client data are collected in the History of Assessments, Referrals and Training (HART) system and are subsequently downloaded to iCAMS.
