Uber is facing legal action for alleged indirect racial discrimination against a driver who claims he was sacked after facial recognition software used by the company failed to recognise him.
In an employment tribunal claim filed this week, the Black driver, who has asked not to be named, alleges that Uber's British subsidiary deactivated his account after failing to recognise him in two separate photographs, leaving him unable to work.
The Independent Workers' Union of Great Britain (IWGB), which filed the claim on the driver's behalf, told Euronews Next that it had been able to verify at least 35 similar dismissals among its members since the start of the COVID-19 pandemic, but warned that "hundreds if not thousands more" could be affected.
The IWGB is calling for Uber to stop using its "racist algorithm" and reinstate drivers unfairly dismissed as a result of the software's alleged mistakes.
In a statement, Uber said that its facial recognition software was "designed to protect the safety and security of everyone who uses the Uber app by helping ensure the correct driver is behind the wheel".
The company said that the system includes "robust human review to make sure that this algorithm is not making decisions about someone's livelihood in a vacuum, without oversight".
In the employment tribunal claim seen by Euronews Next, the driver, who worked for Uber from 2016 until being dismissed last April, alleges that he was never offered the option of a manual photo check.
Uber has used Real-Time ID Check in the UK since April 2020, after London transport regulator TfL raised concerns about the safety of the company's passengers.
The Microsoft-made software works by comparing a selfie taken by the driver as they start work to a photo the company has on file. Uber says all drivers can opt for either automated checks via an algorithm or manual checks by humans.
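At its core, automated face verification of this kind typically reduces each photo to a numeric "embedding" and passes the check only if the two embeddings are similar enough. The sketch below is purely illustrative and is not Uber's or Microsoft's actual system: the embeddings, the cosine-similarity measure and the threshold value are all assumptions chosen to show why a below-threshold score triggers a failed check.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors, from -1 to 1."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(selfie_embedding, reference_embedding, threshold=0.8):
    """Pass the check only if the selfie is close enough to the photo on file.

    The threshold is a hypothetical value; real systems tune it,
    and a failed check is meant to fall back to human review.
    """
    return cosine_similarity(selfie_embedding, reference_embedding) >= threshold

# Mock embeddings - real systems derive these from a trained face model.
on_file = [0.9, 0.1, 0.4]
good_selfie = [0.88, 0.12, 0.41]   # similar vector: same person, good photo
poor_selfie = [0.1, 0.9, 0.2]      # dissimilar vector: model failed to match

print(verify(good_selfie, on_file))  # True
print(verify(poor_selfie, on_file))  # False - should route to manual review
```

The point of the sketch is that the pass/fail decision hinges entirely on how well the model's embeddings capture a given face: if a model produces less reliable embeddings for darker-skinned faces, as the studies cited below found, those drivers will fail the automated check more often through no fault of their own.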
A help page on Uber's website claims that in the event Real-Time ID Check cannot verify a driver's photo, both images will be sent to a "specialist team" who will manually verify the driver's identity.
Euronews Next asked Uber if it handles the specialist identity checks itself, but the company did not respond by the time of publication.
Racial bias has long been an issue highlighted by studies of facial recognition technologies.
A 2018 paper by computer scientists Joy Buolamwini and Timnit Gebru found that facial recognition technologies they studied - including Microsoft's - performed better with lighter skin types.
Every system they reviewed performed worst on darker-skinned female faces, a result echoed by an independent 2019 study of facial recognition algorithms by the US National Institute of Standards and Technology.
Microsoft president Brad Smith wrote in a 2018 blog post that "especially in its current state of development, certain uses of facial recognition technology increase the risk of decisions and, more generally, outcomes that are biased and, in some cases, in violation of laws prohibiting discrimination".
The risk of bias is particularly relevant in the case of Uber drivers in the UK.
A December 2020 TfL survey of private hire drivers in London found that over three-quarters of respondents who gave an answer were Black or Black British, Asian or Asian British, or of mixed race.
"Hundreds of drivers and couriers who served through the pandemic have lost their jobs without any due process or evidence of wrongdoing and this reflects the larger culture at Uber which treats its majority-BAME workers as disposable," said Henry Chango Lopez, general secretary of the IWGB.
"Uber must urgently scrap this racist algorithm and reinstate all the drivers it has unfairly terminated".
Black Lives Matter UK, which is also supporting the case, said: "The gig economy which already creates immense precarity for Black key workers is now further exacerbated by this software that prevents them from working at all, purely based on the colour of their skin. Racist practices such as these must come to an end".