Australian Federal Police officers have made more than 100 searches using Clearview AI, a controversial facial recognition company that identifies members of the public by matching them with more than 3 billion photos scraped from social media.
Documents released under freedom of information law indicate the AFP used the artificial intelligence technology on behalf of foreign law enforcement agencies.
Last week the Office of the Australian Information Commissioner and the UK’s Information Commissioner’s Office launched a joint investigation “into the personal information handling practices of Clearview AI”.
The FOI documents reveal that nine members of the AFP’s Australian Centre to Counter Child Exploitation (ACCCE) registered for a free trial of the software between December 2, 2019, and January 22, 2020, when they were directed to cease.
Of the 113 documents relevant to Clearview AI, 109 were released. One was excluded and 35 were redacted because they “contained information communicated by a foreign government to the AFP for law enforcement purposes” or “relate to information provided by a foreign government”.
This would suggest the AFP may have used Clearview to conduct investigations on behalf of foreign law enforcement agencies.
The New York company, co-founded by Australian Hoan Ton-That, has been used by more than 2200 agencies, companies and individuals, including ICE and government-related agencies in Saudi Arabia and the UAE.
A report produced for the Office of the Australian Information Commissioner says: “The number of searches of Clearview’s database of images undertaken by staff … are unavailable due to the restriction on access to the Clearview platform.”
However, internal emails admit the AFP “has run more than 100 searches”. The searches included persons of interest both inside and outside Australia.
The ACCCE and child protection triage unit officers used Clearview in investigations into child abuse materials and “to attempt to identify and locate a suspected victim of imminent sexual assault”, an executive briefing said.
Individual police officers first received emails from Ton-That in November 2019. “Search a lot,” they read.
“Your Clearview account has unlimited searches. Don’t stop at one. It’s a numbers game. Our database is always expanding and you never know when a photo will turn up a lead.
“Refer your colleagues … If you think your colleagues might want to try Clearview out for themselves, just send their names and email addresses to help@clearview.ai and we’ll sign them up too.”
In 2009 Ton-That was sought by San Francisco police for creating ViddyHo, an alleged phishing scam that stole Gmail users’ login credentials and spammed their contacts. “We had a bug in our code,” his website claimed after the site was shut down and re-registered on a new domain.
An officer referring their colleague to Clearview wrote: “Hey, have you heard about this app? … The user uploads an image to the app which then runs across a database of images scraped from the internet to find a match … Bit creepy but very cool.”
Another officer told a colleague on December 9, 2019: “Just out of interest I ran [a suspect’s] mugshot through the trial Clearview FR system and got a hit on his Instagram account. The FR system looks very cool.”
There is no evidence a security or privacy assessment was conducted before police began using Clearview AI.
The AFP acknowledged using the software in April in response to a question on notice from shadow attorney-general Mark Dreyfus. It had previously denied using it in response to several FOI requests, suggesting AFP leadership may not have been appropriately informed about the ACCCE’s activities.
“Maybe someone should tell the media we are using it!” an officer wrote on January 21 in emails discussing reporting on Clearview AI.
Dreyfus described Clearview AI as a “deeply problematic service”.
“The Home Affairs minister must explain whether the use of Clearview without legal authorisation has jeopardised AFP investigations into child exploitation,” he said.
“The use by AFP officers of private services to conduct official AFP investigations in the absence of any formal agreement or assessment as to the system’s integrity or security is concerning.”
An early investor in Clearview AI was Peter Thiel, the Silicon Valley billionaire and member of Trump’s transition team. Thiel co-founded Palantir, a data-mining and analysis company used by law enforcement and immigration agencies around the world — including the Australian Criminal Intelligence Commission, the Australian Defence Force and AUSTRAC.