As Toronto police look to supercharge their facial recognition capabilities, documents obtained by Ricochet Media reveal that a slew of controversial biometrics companies have formed a queue, each offering to provide them with the necessary technology.

Last December, the Toronto Police Service (TPS) published a request for proposal (RFP) seeking to upgrade the facial recognition system it currently employs. Documents obtained by Ricochet in a freedom of information request reveal that companies like Idemia responded to a similar request for information (RFI) two years prior, offering their facial recognition system as a suitable upgrade. Idemia currently provides its technology to neighboring Peel and York police and has been accused of racial bias.

“Torontonians should see this as a real threat,” says Beverly Bain of No Pride in Policing, an advocacy group focused on defunding Canadian police forces. “These machines are designed to target those who protest, including pro-Palestinian protesters. The idea is to weed those people out, to arrest them, and to punish them for protesting against genocide.” 

‘Significant advancements’

Facial recognition systems attempt to verify a person’s identity by analyzing a picture or video of their face, mapping its distinctive features — like the distance between their eyes or the texture of their skin — and comparing them to a database of known faces. 
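The comparison step described above can be illustrated with a toy sketch. This is not any vendor’s actual algorithm: real systems use neural networks to turn a face image into a numeric “embedding” vector, and the vectors, names, and threshold below are invented for illustration.

```python
import math

def cosine_similarity(a, b):
    # Measures how closely two feature vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, database, threshold=0.9):
    """Return the name of the most similar known face, or None if no
    entry clears the similarity threshold."""
    best_name, best_score = None, threshold
    for name, vector in database.items():
        score = cosine_similarity(probe, vector)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Invented "known faces" standing in for a mugshot database.
database = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
}
probe = [0.88, 0.12, 0.31]  # feature vector extracted from a new image
print(best_match(probe, database))  # prints person_a
```

In a real deployment the threshold choice matters enormously: set it too low and the system produces false matches of the kind described later in this article.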

Since 2018, Toronto police’s forensic identification unit has employed a facial recognition technology called NeoFace Reveal to match images obtained during investigations against the service’s mugshot database. 


“However, significant advancements in the FR [facial recognition] solution industry have occurred since the program’s inception,” reads TPS’ full RFP, posted and made available for download on contract-awarding website MERX. “The TPS now seeks to leverage these developments to enhance FR [facial recognition] processes, to improve accuracy and the ability to evaluate images.” 

Beverly Bain, an organizer of the No Pride in Policing Coalition, and professor of Women, Gender and Sexuality Studies in the Department of History at the University of Toronto. Photo via CBC

In 2023, Toronto police put out a similar request for information (RFI) on MERX, seeking out presentations from companies for “innovations” in facial recognition technologies — specifically software that could match images against photos of people stored in their booking database, much like the software it currently employs and is looking to upgrade. 

Ricochet obtained a copy of the full RFI, including amendments, as well as some of the documents submitted by vendors in response, through a freedom of information request.

The documents offer insight into the specific upgrades Toronto police are looking for in their new system. 

In an amendment to the original RFI, TPS said it was looking for systems capable of intaking and processing between 8,000 and 10,000 images yearly. “When no investigative lead is provided, the software should have an unsolved database that would continuously compare probe images against new known images/videos ingested into the TPS booking database,” reads Toronto police’s initial RFI. 
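The “unsolved database” the RFI describes amounts to a standing watchlist of unmatched images. A hedged sketch of that workflow, with all names and the comparison function invented for illustration, might look like this:

```python
unsolved = []  # probe images that found no match yet
known = {}     # name -> feature vector from booking photos

def matches(a, b, threshold=0.95):
    # Toy comparison: real systems compare learned embeddings.
    return sum(abs(x - y) for x, y in zip(a, b)) < (1 - threshold)

def search(probe):
    """One-time search; unmatched probes are retained as 'unsolved'."""
    for name, vec in known.items():
        if matches(probe, vec):
            return name
    unsolved.append(probe)
    return None

def ingest_booking_photo(name, vec):
    """Each new booking photo is automatically re-run against every
    unsolved probe, with no analyst initiating the search."""
    known[name] = vec
    hits = [p for p in unsolved if matches(p, vec)]
    for p in hits:
        unsolved.remove(p)
    return hits
```

The key difference from the current system is the last function: matching would no longer require an analyst to launch a search, because every new booking photo would trigger comparisons on its own.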

In a statement to Ricochet, Toronto police’s Chief Information Officer Colin Stairs confirmed that NeoFace processes approximately 4,000 searches per year and that its searches are not automated.

‘It clearly doesn’t work’

In total, Ricochet has learned that 11 “involved vendors” expressed an interest in providing Toronto police with new technology: TCG Digital, Cumberland Strategies, Facia AI Ltd., Genvis, Idemia, IMDS, NEC Corporation of America, Rank One Computing, Securaglobe Solutions, Shufti Pro, and Tech5 USA. 

NEC provides Toronto police with its current facial recognition technology, NeoFace. Genvis’ facial recognition tech was piloted in Australia in the early days of the COVID-19 pandemic to police people in quarantine, raising privacy concerns at a time when many countries opposed such surveillance. 


However, it is Idemia’s facial recognition technology — currently employed by neighboring Peel and York police — that has attracted the most media attention. In 2019, investigators in Woodbridge, New Jersey, misidentified Nijeer Parks using Idemia’s facial recognition technology. Parks, a Black man, was arrested for theft and assault on a police officer and spent 10 days in jail for a crime he didn’t commit. “There’s clear evidence that it doesn’t work,” Parks told CBC. 

Idemia is not the only company to face such accusations: in 2018, researchers at MIT and Stanford University found that three leading AI facial-analysis programs exhibited significant bias, performing worse for darker-skinned individuals and women.  

Idemia already supplies Toronto police with its Automated Fingerprint Identification System (AFIS), a computerized fingerprint search and storage system, as well as technology used in the collection of mugshots. Idemia did not respond to a request for comment. 

‘Walking license plates’

The use of facial recognition technologies by police has long been controversial due to privacy concerns, fears of mass surveillance, and racial bias in accuracy. 

Toronto police were themselves embroiled in a scandal in 2020 when it was discovered that some of their officers had used Clearview AI — a facial recognition technology that uses billions of images scraped from public websites like Facebook and Instagram to try to identify suspects, victims and witnesses in criminal investigations. Toronto police initially denied using the technology before admitting a month later that officers had uploaded over 2,800 photos to the U.S. company’s platform. When the news broke, the Canadian Civil Liberties Association (CCLA) decried Toronto police’s use of Clearview AI as “a remarkable violation of public trust.” 


Today, Anaïs Bussières McNicoll, director of the CCLA’s Fundamental Freedoms Program, says police use of facial recognition technologies remains a problem because of how they tilt the power dynamic in police-civilian interactions further towards police. “They risk stripping people of their anonymity, reducing them to walking license plates,” she said. 

In an initial statement to Ricochet, Toronto police said that while they anticipated “improvements in the available software” with this procurement, “it will be used in the same way as the current system, with no changes to processes or scope.” 

When asked about the planned unsolved database, which would operate continuously, Chief Information Officer Stairs justified its implementation by noting that Toronto police’s automated fingerprint identification system (AFIS) already has a similar feature. 

Canadian Civil Liberties Association’s Fundamental Freedoms program director Anaïs Bussières McNicoll. The CCLA called TPS’s use of Clearview AI “a remarkable violation of public trust.”

“Implementing this feature for facial recognition would align its functionality with AFIS, ensuring consistency across investigative tools. Importantly, this does not change the fundamental parameters of how FRS [facial recognition] is used, nor does it alter the requirement for a human analyst to verify and approve any potential leads before they are distributed. Additionally, image retention policies remain fully compliant with existing procedures,” said Stairs. He declined to provide the names of the companies that had responded to Toronto police’s recent request for proposal.

Kristen Thomasen, an associate law professor at the University of Windsor who specializes in automated technologies and issues of surveillance, says, however, that the technology Toronto police currently use is already concerning. 

“Being booked doesn’t necessarily mean someone has committed any crimes, it just means you’ve come into contact with the police,” says Thomasen. “It’s concerning, even if there is an [AI] policy in place, if this is now running constantly, spitting out results. It could lead to state scrutiny on people who hadn’t done anything wrong but were just picked up and had a photograph taken.” 

Mission creep

What is most concerning to Thomasen about police use of facial recognition technologies is their potential for mission creep — the slow expansion of a police force’s mission beyond its original scope once a new technology is adopted. 

“Once you have the technology, it becomes very likely that new ways of using it are integrated into practice over time, especially when they know the public isn’t maybe as strongly informed about the use of that technology,” said Thomasen. 

She cites the growing trend of U.S. police forces relying solely on unproven facial recognition results to make arrests — ignoring their own policies that demand supplementary evidence — resulting in false arrests.

Over the past decade, there has been a rise globally in law enforcement using facial recognition technology. Advocates warn that the technology is increasingly being used as a surveillance tool on protest movements and activists, disproportionately targeting people of colour. Photo via DepositPhotos

“There is a tendency among people to defer to a computer result, which is natural,” explains Thomasen. “But I think that if you’re integrating that kind of technology into a high stakes set of circumstances where there’s a massive power imbalance, it becomes very difficult to guard against that end result.”

Bain agrees: “Police are not trained to think. The analysts are not thinkers. The analyst goes by what information the technological equipment spits out. And the AI equipment is already biased from the beginning.” 

Included in the service’s request is an evaluation key, which Toronto police will use to score proposals. The key gives high weight to solutions that can accurately match individuals wearing masks, hats, and hoods — items often worn by demonstrators at protests to conceal their identity. McNicoll says the use of facial recognition technologies by police to monitor protests is a perpetual concern. 

“Using this kind of technology could potentially reveal the political leanings of individuals or allow police to profile different individuals,” says McNicoll. “This could have a chilling effect on people’s ability or decision to gather, to express their opinions and engage in behavior that is absolutely necessary and vital for a healthy democracy.” 


In 2020, at the height of the Black Lives Matter protests that followed the death of George Floyd, at least six U.S. federal agencies — including the Federal Bureau of Investigation — employed facial recognition technology to identify protesters, according to a report from the U.S. Government Accountability Office. The same year, the New York City Police Department tracked down a BLM activist at his home using similar technology.

Stairs told Ricochet in a statement that there has been no change in the scope or parameters of how TPS uses facial recognition technology since its implementation. He added that police do not use facial recognition for live surveillance or proactive searches for wanted individuals.

“The core purpose of our facial recognition system remains the same: to support criminal investigations by comparing legally obtained images against a controlled database. Increased usage of the system over time simply reflects the natural adoption of an established investigative tool — just as fingerprinting and DNA analysis became more widely used following their introduction,” he said. 

“It is strictly an after-the-fact investigative tool, used only when an individual is already the subject of a criminal investigation. This is fundamentally no different from a police officer manually reviewing mugshots to find a match — the technology merely improves efficiency.” 

Whichever solution Toronto police purchase will be implemented within a year of the contract being awarded, “if not sooner,” reads the service’s RFP. According to MERX, solicitations for proposals closed on February 14, 2025.