“Once somebody has your specific iris picture, you’ll never have the possibility to stay anonymous.”
That’s Michael Will, head of Bavaria’s data regulator, speaking to The Wall Street Journal (WSJ) Sunday (Aug. 18) about concerns over Worldcoin’s iris-scanning technology.
As that report noted, Germany is far from alone in its concern, with more than a dozen jurisdictions having either suspended Worldcoin’s operations or launched probes into the company, founded by OpenAI CEO Sam Altman.
As PYMNTS wrote last year, the Worldcoin project’s stated goal is to capture participants’ biometrics with its proprietary Orb to provide them with a “proof of personhood” — as well as the company’s digital currency — something that the company said “will become more important as increasingly powerful AI [artificial intelligence] models become available.”
In other words — and as the WSJ report pointed out — it’s a case of Altman using the technology from one of his companies to protect people from the potential downside of his other firm.
So far, the project has drawn a mix of support — 6 million people have signed up — and skepticism. For example, regulators in Hong Kong raided the company's offices there in May, while Spain and Portugal have both blocked its use. The Bavarian regulator is expected to release a decision in its investigation soon.
Regulators' concerns are multifaceted, the report said. How does the company train its algorithms? What steps does it take to avoid scanning children? What does it do with user data? Several authorities have also accused Worldcoin of telling operators of its Orb scanning device to encourage users to turn over iris images.
According to the report, privacy advocates say these images could be used to construct a global biometric database with little oversight.
Damien Kieran, Worldcoin's chief privacy officer, told the WSJ that any venture like this one was bound to attract scrutiny, and that the project was working with regulators to address concerns.
For now, Worldcoin has paused its image-sharing option for users while it develops a new process, Kieran said, adding that its training materials don't ask operators to induce users to share biometric data.
“We’ve built a technology that by default is privacy-enhancing,” Kieran said in an interview. “We don’t collect data to harvest it. We don’t sell data. In fact, we couldn’t sell it, because we don’t know who the data belongs to.”