Digital rights group Foxglove described the system as effectively being ‘speedy boarding for white people’ © Guy Corbishley/Alamy

The Home Office has scrapped an opaque algorithm used for visa applications after a legal challenge by campaigners who said it brought “entrenched bias and racism” into the immigration system.

The decision-making tool has been in use since 2015, but the Home Office has refused to say how it operates, and its existence was only revealed last summer by the Financial Times.

The Joint Council for the Welfare of Immigrants and digital rights group Foxglove subsequently launched a judicial review of the system. This week, before the case reached court, the Home Office announced it would stop using the algorithm in order to head off the challenge.

In their legal submissions, the two campaign groups argued that the technology — which grades each visa applicant as green, amber or red according to their level of risk — amounted to racial discrimination, because applicants from a secret list of “suspect nationalities” were more likely to have their visas refused. Foxglove described the system as effectively being “speedy boarding for white people”.

The Home Office acknowledged it was “reviewing how the visa application streaming tool operates” but said it did not accept the allegations in the judicial review. The algorithm will be suspended from August 7 and officials will work on a redesigned system which will be in place this autumn.

Chai Patel, legal policy director of JCWI, said the streaming tool “took decades of institutionally racist practices, such as targeting particular nationalities for immigration raids, and turned them into software.”

He added that while JCWI would be withdrawing its claim against the Home Office in the short term, it would be “closely scrutinising” any new system the department constructs. “If it looks like it repeats the mistakes of the old system, and causes any sort of race discrimination, we’re ready to reopen the case,” Mr Patel said.

Martha Dark, one of the directors of Foxglove, said the use of the algorithm had “profound consequences” for visa applicants, from “whether you are able to reunite with your fiancé or family, to whether you can attend an academic course or a conference”.

She called for “proper public consultation about how these decisions are made”.

The streaming tool had attracted criticism from David Bolt, chief inspector of borders and immigration, who warned the Home Office in February that it needed to do more to “demystify” the use of this technology.

“The more cryptic the Home Office is seen to be about the way visa decisions are made, the more it will fuel concerns about bias and poor practice,” he said.

“The department’s reputation and the staff who work in this area would be better served if its first instinct were to be open and engaging rather than seemingly reluctant to reveal more than it absolutely has to.”

Nick Thomas-Symonds, Labour’s shadow home secretary, said it was “good news” that the use of the streaming tool would be stopped, but it was “disappointing, though sadly not surprising, that it took legal challenge for the Home Office to act”.



Copyright The Financial Times Limited 2020. All rights reserved.
