
Executive Summaries | November 11, 2024
AI at the service of Canadian immigration: Chinook, a controversial tool
The use of artificial intelligence and technological tools in decision-making processes by Immigration, Refugees and Citizenship Canada (IRCC)
Canada is facing ever-growing demand for immigration. For several years now, this has translated into a high volume of applications and, consequently, significant processing delays, while only a limited number of IRCC officers are available to process them. Today, there are reportedly over two million applications across all categories in processing. Combined, these factors mean that officers often have only a few minutes to review the forms and documents submitted by visa applicants and reach a decision on an immigration application.
In response to this situation, and in a pressing drive for efficiency, IRCC has designed and deployed new technological tools aimed primarily at helping decision-makers reach conclusions more quickly. IRCC has acknowledged using the Chinook tool since 2018 to process temporary residence applications. The tool was developed to improve the efficiency of application processing, but its development and its ability to perform certain analytical functions have raised a number of ethical questions.
How Chinook works
Initially built on Microsoft Excel and now being deployed on a cloud platform, Chinook extracts information stored in the Global Case Management System (GCMS) and presents it in a more user-friendly and accessible manner. It pulls specific information according to certain indicators and gives the officers responsible for rendering decisions a different visual representation of the file.
In itself, Chinook is not a pure artificial intelligence tool, but rather a form of interface that draws decision-makers’ attention to certain elements of a case, without necessarily providing the context or nuance that a more in-depth analysis of the case might reveal. It performs certain analytical functions based on criteria or indicators. For example, Chinook can use keywords to highlight certain favourable considerations, but it can also generate standardized refusal reasons. The tool does not self-generate indicators; they are designated by IRCC.
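By way of illustration only, the following minimal Python sketch shows how an indicator-based layer of this kind could work in principle. The keyword lists, refusal templates, and field names are assumptions made for this example; they are not IRCC's actual indicators, wording, or code.

```python
# Hypothetical illustration only: a minimal sketch of an indicator-based
# flagging layer like the one described above. The indicator lists, refusal
# templates, and logic are assumptions for this example, not IRCC's system.

FAVOURABLE_KEYWORDS = {"previous visa approved", "property ownership", "stable employment"}
REFUSAL_TEMPLATES = {
    "purpose_of_visit": "Not satisfied the applicant will leave Canada at the end of the stay.",
    "financial": "Insufficient evidence of funds for the intended stay.",
}

def flag_application(notes: str) -> dict:
    """Highlight officer-designated indicators found in an application's notes."""
    text = notes.lower()
    matched = sorted(k for k in FAVOURABLE_KEYWORDS if k in text)
    return {"favourable_indicators": matched}

def draft_refusal_reasons(selected_grounds: list) -> list:
    """Assemble standardized refusal wording from the grounds an officer selects."""
    return [REFUSAL_TEMPLATES[g] for g in selected_grounds if g in REFUSAL_TEMPLATES]

if __name__ == "__main__":
    notes = "Applicant reports stable employment; previous visa approved in 2019."
    print(flag_application(notes))            # surfaces favourable indicators
    print(draft_refusal_reasons(["financial"]))  # standardized refusal text
```

Even in this simplified form, the sketch shows the point made above: the tool surfaces fragments of a file and pre-drafted wording, while the context and nuance of the full application remain for the officer to assess.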
Alongside Chinook, IRCC also uses a number of advanced analytics and automation systems that operate independently of it. These systems can categorize applications by their level of “complexity,” thereby indicating to decision-makers which cases require particular attention.
For instance, to improve process efficiency, these systems can identify applications eligible for a facilitated process, such as those from visitor visa applicants who have had a similar application approved within the past ten years. They also identify complex applications requiring more in-depth analysis. As with Chinook, IRCC points out that the system itself makes no decision: it refers the file to an officer, who determines whether the applicant is admissible to Canada and makes the final decision.
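Again purely for illustration, a minimal sketch of rule-based triage of this kind might look as follows. The tiers, criteria, and data fields are assumptions made for the example; the actual criteria used by IRCC's systems are not public at this level of detail.

```python
# Hypothetical illustration only: a minimal sketch of rule-based triage of the
# kind described above. Tiers, thresholds, and field names are assumptions.

from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class Application:
    category: str                          # e.g. "visitor_visa"
    last_similar_approval: Optional[date]  # date of a prior approved, similar application
    complexity_flags: set = field(default_factory=set)

def triage(app: Application, today: date) -> str:
    """Assign a processing tier; the final decision always rests with an officer."""
    recently_approved = (
        app.last_similar_approval is not None
        and (today - app.last_similar_approval).days <= 10 * 365
    )
    if app.category == "visitor_visa" and recently_approved and not app.complexity_flags:
        return "facilitated"   # eligible for streamlined review
    if app.complexity_flags:
        return "complex"       # routed for more in-depth analysis
    return "routine"

print(triage(Application("visitor_visa", date(2019, 6, 1)), date(2024, 11, 11)))  # facilitated
```

As the article notes, output of this kind is at most a routing suggestion; the admissibility determination itself remains with the officer.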
Ethical and legal issues
The Federal Court has already considered the use of Chinook on several occasions. It has recognized that relying on Chinook in the decision-making process does not, in itself, render a decision unreasonable or amount to a breach of procedural fairness, provided that a person, and not the system acting on its own, rendered the decision, and that the decision is not otherwise unreasonable (Haghshenas v. Canada). Human supervision remains absolutely essential.
The use of automatically generated refusal letters has also been analyzed, and the Court again concluded that this does not, in itself, constitute a breach of procedural fairness. However, the Court established that such a tool must not replace a complete analysis of the case by the decision-making officer, including all forms, attestations and documents, the study plan, and other elements (Khosravi v. Canada). The Court stressed that officers must not rely solely on the incomplete information surfaced by Chinook and must consult the entire file in order to make a reasonable decision. To date, the Court maintains that the substance of the decision must prevail over the means used to reach it.
Potential risks and biases
Although this is the current stance, in a context where immigration officers have on average only a few minutes to analyze a file and render a decision, do the criteria used to sort and analyze applications not carry certain risks of bias? Could the use of these technologies influence, one way or the other, the discretion available to an officer? By classifying cases according to their level of “complexity,” does the system not take part in the initial stage of the decision-making process, or suggest a form of recommendation that may itself reflect biases? A number of questions remain.
To ensure transparency and fairness, all decisions made with the help of Chinook must be documented and audited. The Standing Committee on Citizenship and Immigration has also recommended that IRCC strengthen its governance oversight, including comparative analyses and an algorithmic impact assessment.
It is crucial that visa applicants understand how these technologies are used so that they can submit applications that are as complete and accurate as possible. It is equally important that officers receive regular training on identifying possible biases, to ensure a fair, transparent, and effective decision-making process.
While these technologies are not, strictly speaking, artificial intelligence tools, their use should not replace human decision-making, and it must remain possible to understand and analyze the role they play in each decision rendered. It will be interesting to follow the evolution of these systems, of their use, and of the case law on procedural fairness and on any discrimination these evolving tools could generate.