ARCHIVED – An Examination of the Canadian Language Benchmark Data from the Citizenship Language Survey

Summary statement and recommendations

Summary statement

An examination of data of the type employed in this study has the potential to yield useful, although incomplete, information about the language learning outcomes of adult immigrants. Our analysis has identified a number of ways in which the research design and data collection procedures can be improved. The data presented here reveal a variety of factors that predict participant CLBA scores. In fact, a regression model that included ten factors accounted for over 41% of the variance in the scores. Given the complex array of influences on language learning, this is an unusually high proportion of explained variance. Among the statistically significant predictors were language training, education in Canada, age at immigration, immigration class, and city of residence. Perhaps the most noteworthy finding of the study was the effect of mother tongue. Members of both the East Asian (with the exception of Tagalog speakers) and Southeast Asian language categories appear to have been disadvantaged relative to other language groups. This finding points to the need for language training targeted to listening and speaking skills for members of these language groups, especially since they comprise the largest cohorts of newcomers to Canada.

Although the analysis revealed a negative relationship between language training in Canada and CLBA scores, it must be recognized that there is no basis for believing that this is a causal relationship. Rather, those individuals who accessed language training most likely did so because they entered Canada with limited or no official language skills. This expectation is confirmed by the fact that LINC training in particular, which is designed for beginners and low-proficiency learners, showed the largest negative contribution to CLBA scores. The finding of a negative relationship between age at immigration and CLBA scores supports previous empirical evidence that ultimate attainment in a second language is affected by developmental factors, such that younger learners have an advantage. Another finding that is not altogether unexpected is the difference in CLBA scores across the three immigrant classes and the resulting significant contribution of immigration class to the multiple regression model. The higher scores for members of the independent class may be explained by selection criteria: independent immigrants are chosen, in part, on the basis of their knowledge of an official language, as well as their formal education. Family class immigrants are not expected to meet the same stringent requirements, and because refugees, who received the lowest CLBA scores overall, are no longer selected on the basis of adaptability, prior language knowledge and formal education are not a consideration for that class.
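To make the reported result concrete, the sketch below shows how a multiple regression of this kind could be fitted and how the proportion of explained variance would be obtained. It is illustrative only: the column names (clba_score, age_at_immigration, immigration_class, and so on) and the input file are hypothetical stand-ins for the survey variables, not the actual data set.

```python
# Illustrative sketch only: fit an ordinary least squares model of CLBA scores
# on a mix of continuous and categorical predictors and report the proportion
# of variance explained. All column names and the file name are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("clba_survey.csv")  # hypothetical file

# C(...) treats a predictor as categorical, creating dummy-coded contrasts.
model = smf.ols(
    "clba_score ~ age_at_immigration + education_in_canada"
    " + C(immigration_class) + C(mother_tongue_group)"
    " + C(city_of_residence) + C(language_training)",
    data=df,
).fit()

print(model.summary())                              # coefficients and significance tests
print(f"Variance explained: {model.rsquared:.2%}")  # e.g., roughly 41% in the pilot data
```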

Recommendations

The following recommendations are based on our analysis of the limitations of the pilot study. We recommend that they be implemented in any further collection of data that CIC may undertake. These suggestions will lead, we believe, to more useful information for assessing the effectiveness of language training than is contained in the current data set.

1. Define a clear, sufficiently focused set of goals at the outset of the study.

The current study, as noted above, attempted to address several distinct issues at the expense of an adequate focus. For example, there is no reason to believe a priori that citizenship test scores and other citizenship related questions would contribute to the overall goal of the study to determine predictors of second language ability. A sharper focus on factors known to affect second language acquisition will lead to more meaningful outcomes in future studies. Although it may be tempting in work of this type to collect as much data as possible from participants, this approach can be counterproductive in that lengthy interviews increase the likelihood of assessor and participant fatigue as well as erroneous and missing data.

To ensure a maximally useful final data set, participant selection procedures must be consistent across the entire study. Evidence from “CIC Language Surveys: Sample Development and Data Management” (Government Consulting Services) suggests that this was not the case in the current data set. It is also clear that the questions in the survey were not finalized at the start of the study; after data collection was completed in Toronto and Vancouver, significant changes were made to the survey.

2. Develop and implement a clear training protocol for assessors and coordinators.

Any future data collection should entail face-to-face training workshops for assessors and others associated with the project. In addition, a training handbook should be prepared in which potential problems are anticipated and dealt with. It would be practical to employ a full-time trainer/coordinator to oversee the project. This individual could conduct the training sessions in all participating cities, observe assessors in their initial interviews, and debrief them regularly. The investment in training would add value, not only by eliminating the need for extensive and expensive data cleaning, but also by resolving problems that might otherwise result in missing or uninterpretable data.

3. Streamline and improve consistency of data entry procedures.

An essential step in a large quantitative study of this type is the development of a codebook for data entry. To reduce the margin of error, and to facilitate analysis, all variables should be coded numerically. If future data are collected in the same manner, data entry should be centralized to ensure consistent recording. However, an alternative approach, which we believe to be superior, is the implementation of web-based data input requiring users to select appropriate responses from menus of choices. This type of data recording will minimize the potential for error and missing data. The assessors could enter the data directly, eliminating the need for separate data entry personnel.

In the future, occupations should be coded according to the National Occupational Classification (NOC) protocol.

Response codes should be included to indicate whether the respondent did not understand the question, did not know the answer, or found the question not applicable. Any of these codes would be more informative than a blank response.
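As an illustration of the kind of codebook entry and input validation we have in mind, the sketch below defines numeric codes for a single variable, including distinct codes for the three kinds of non-response just mentioned. The specific code values, labels, and variable name are hypothetical.

```python
# Illustrative codebook entry: every variable is stored numerically, and the
# different kinds of non-response are distinguished rather than left blank.
# Code values, labels, and the variable name are hypothetical.
IMMIGRATION_CLASS_CODES = {
    1: "independent",
    2: "family",
    3: "refugee",
    97: "did not understand the question",
    98: "did not know",
    99: "not applicable",
}

def validate_code(value: int, codebook: dict[int, str]) -> int:
    """Accept only values defined in the codebook (no free text, no blanks)."""
    if value not in codebook:
        raise ValueError(f"{value} is not a valid code; expected one of {sorted(codebook)}")
    return value

# A web-based entry form would present only these labelled choices in a menu,
# so assessors could not enter an unlisted value or leave the field empty.
record = {"immigration_class": validate_code(2, IMMIGRATION_CLASS_CODES)}
print(record)
```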

4. Collect more information on factors known to be relevant to second language acquisition.

The single most important factor not included in this study was the CLBA score of participants when they were first assessed in Canada. Without this information, it is impossible to quantify the effects of language training or any linguistic progress that the participants may have made. Many factors influence language development, but without a sense of the participants’ starting points, the relative contribution of each cannot be evaluated. If data sources from the original assessments are not available, participants should be asked to report their first CLBA scores.
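The sketch below illustrates, under hypothetical column names (initial_clba, current_clba, training_hours), how a baseline score would be used: either to compute a simple gain score, or to enter the initial assessment as a covariate so that the estimated effect of training reflects progress made in Canada rather than proficiency brought on arrival.

```python
# Illustrative sketch: quantifying progress requires the initial CLBA score.
# Column names and the file name are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("clba_survey.csv")  # hypothetical file

# Simple gain score: change since the first assessment.
df["clba_gain"] = df["current_clba"] - df["initial_clba"]
print(df["clba_gain"].describe())  # distribution of gains

# Covariate approach: with initial_clba in the model, the coefficient on
# training_hours estimates the effect of training net of starting proficiency.
model = smf.ols("current_clba ~ initial_clba + training_hours", data=df).fit()
print(model.params)
```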

Two additional variables that are known to be related to language learning are formal education and language training in the home country. Without knowing what resources the individual brings with him/her on arrival, it is very difficult to account for patterns and degree of language development.  It is well established that formal education is positively correlated with second language proficiency. With regard to language training in the country of origin, it would be useful to know what type of instruction the participants received both to understand what skills they have already acquired and to identify potential gaps (e.g., oral/listening skills, pragmatics, reading/writing skills, grammar). The fact that many English as a foreign language programs focus almost exclusively on reading and writing development often leads to limited oral skills in individuals who may have a relatively good command of vocabulary and grammar.

Although occupation in the home country is not usually directly related to language acquisition, such information would be useful because participants’ personal linguistic goals and motivation may be tied to re-entering their previous occupation. The linguistic requirements of those occupations may, in turn, influence participants’ ultimate attainment in their second language.

More detailed information about language used at work should be collected. Although it is interesting to know which language is used most frequently, without clarification of the actual types of language use, such data offer little insight into opportunities for ongoing linguistic development. Participants should be asked to specify the extent to which they use routinized, formulaic language (e.g., a waitress taking orders in a restaurant) versus conversational language (e.g., open-ended exchanges with clients and coworkers). In addition, participants should report the extent to which they are required to read and write at work, as well as the complexity of the tasks they are expected to perform. Some of the language-at-work questions could be posed in scalar fashion, for instance, the percentage of time an official language is spoken (0%, 10%, 20% … 100%). Furthermore, participants’ use of an official language could be probed more extensively to elicit information about language use at home and in social interactions outside the workplace. The latter, in particular, could lead to a better understanding of degree of integration.
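One way to implement the scalar items suggested above is sketched below: each question is defined with an explicit response scale and accepts only values on that scale, rather than free text. The item name, wording, and scale are illustrative.

```python
# Illustrative scalar survey item for language use at work.
# The item name, prompt, and scale are hypothetical.
from dataclasses import dataclass

@dataclass
class ScalarItem:
    name: str
    prompt: str
    minimum: int
    maximum: int
    step: int

    def validate(self, response: int) -> int:
        """Accept only responses on the defined scale (e.g., 0, 10, ..., 100)."""
        on_scale = (self.minimum <= response <= self.maximum
                    and (response - self.minimum) % self.step == 0)
        if not on_scale:
            raise ValueError(f"{response} is not a valid response to {self.name}")
        return response

pct_official_at_work = ScalarItem(
    name="pct_official_language_at_work",
    prompt="What percentage of your time at work do you spend using an official language?",
    minimum=0,
    maximum=100,
    step=10,
)

print(pct_official_at_work.validate(70))  # a valid response on the 0-100 scale
```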

While the three-way breakdown of immigration class in the current study was informative, more detailed specification of the independent and refugee classes may assist in the development of targeted language programming that would better address the needs of particular cohorts within the three classes. In the refugee class, for example, the language learning needs of the various categories of refugees may differ.

Length of language training should be specified in hours. Any increase in the time required to respond to this question would be offset by the increase in the usefulness of the answer.

Participants should be asked directly about the usefulness of their language training experiences. This could be achieved, for example, by having them respond to a question such as "How helpful have you found the language training you received?" on a scale from 1 (not at all helpful) to 5 (extremely helpful). They should also be asked to rate the degree of emphasis placed on particular language skills in the program(s) they attended (listening, speaking/pronunciation, reading, writing, grammar, vocabulary development).

Separate rather than combined assessments of listening and speaking should be used. Furthermore, the speaking component of the test should be digitally recorded for additional analysis. An informal review of the assessors’ comments indicated that a large number of observations related to the participants’ pronunciation. These comments could not be used in the current analysis because they were not standardized, but recordings would permit a thorough evaluation of pronunciation along with other speech variables such as fluency. With appropriate analysis, data supplementing the CLBA test could be an extremely valuable source of information for researchers and curriculum developers. Recordings would also allow for large-scale cross-validation of this aspect of the CLBA assessment tool.

The CLBA test itself could be supplemented with an additional task, in which participants are asked to repeat a small number of utterances after a model (to be digitally recorded). From a methodological standpoint, this would permit an easy comparison across all participants because the content would be identical. Whether or not the participants could complete the task would also offer confirming evidence of their language proficiency.

5. Consider additional, complementary ways of evaluating the effectiveness of language training.

Although a survey of this type can help pinpoint factors that influence second language learning, as well as ways in which current federally funded language training can be improved, it provides only a limited perspective on these concerns. To gain a better understanding of the full range of issues that affect second language attainment, the language programs themselves should also be studied, including curricula, classroom practices, qualifications of instructors, quality and appropriateness of assessment tools, and relevance of course content to learners’ goals. Without taking these factors into consideration, it would be difficult to make effective improvements to existing programs. One approach to addressing program efficacy would be to study innovative programs that report a high success rate, as measured by CLBA improvement.

6. Miscellaneous.

We suggest limiting the assessor comments category to genuinely unusual information about the participants. Many of the remarks in the current study merely repeated information that had already been collected, or recorded idiosyncratic details of little analytical use.
