November 11, 2024
Cited article by The Guardian
HRRC strongly opposes the use of AI tools that erode migrant rights and lead to unfair outcomes. HRRC also notes it is crucial to ensure that AI systems are transparent and fair and that they uphold the principles of justice.
News Brief
A UK Home Office artificial intelligence system that proposes law enforcement action against adult and child migrants could make it too easy for officials to rubber-stamp automated, life-changing decisions. The system is one of several AI programmes that UK public authorities are deploying as officials seek greater speed and efficiency. The Home Office disclosures show that the Identify and Prioritise Immigration Cases (IPIC) tool is fed an array of personal information about people who are the subject of potential enforcement action, including biometric data and data about criminal convictions. IPIC has been in widespread operation since 2019-20.
As new details of the AI-powered immigration enforcement system emerged, critics called it a “robo-caseworker” that could “encode injustices” because an algorithm is involved in shaping decisions. Migrant rights campaigners called on the Home Office to withdraw the system, claiming it was being used to create cruelty. Jonah Mendelsohn, a lawyer at Privacy International, said the Home Office tool could affect the lives of hundreds of thousands of people. Fizza Qureshi, the chief executive of the Migrants’ Rights Network, also called for the tool to be withdrawn and raised concerns that the AI system could lead to racial bias.