Worldcoin: Why the iris is the most valuable biometric data

The Spanish Data Protection Agency (AEPD) took an unprecedented decision this Wednesday. For the next three months, Worldcoin’s orbs, which since July have been peering into the irises of some 400,000 Spaniards to validate their accounts and reward them with a batch of cryptocurrencies now worth around 80 euros, will no longer be able to operate. The data collected so far by Worldcoin, a company linked to Sam Altman, the godfather of ChatGPT, is blocked and therefore cannot be processed or shared until an international investigation decides whether it is legal for a private company to collect this type of data.

This is the first time that the AEPD has taken precautionary measures. The director of the agency, Mar España, underlined their exceptional nature: “We acted urgently because the situation demanded it. Our decision is justified to avoid potentially irreparable damage. Not taking it would have deprived people of the protection to which they are entitled.”

Why such sudden speed in halting the collection of high-resolution photographs of users’ irises? “Because a state of social alarm has been generated. I think the queues that formed in shopping centers and the fact that there were cryptocurrencies involved forced the AEPD to act quickly,” says Borja Adsuara, a consultant and expert in digital law, who is concerned that the debate is “not focusing on what is important: the problem is not whether they give you money for your iris, but whether this data is processed correctly.”

The value of biometric data

There are many types of personal data. The ones most used in everyday procedures are first and last names, addresses and telephone numbers. All of them can be used to identify a specific individual, but they share another characteristic: the person concerned can change them.

Other personal data, on the other hand, stays with us for life. This is so-called biometric data: data that refers to the unique characteristics of each person, whether physiological, physical or behavioral. This type of information can be encoded and often remains unchanged over time. We have the same DNA from birth until death. The same goes for fingerprints (unless we burn them off). The face changes over the years (we gain weight, we age, we lose hair), but there are algorithms capable of establishing unique patterns – for example, by measuring the distance between the eyes, or between the eyes and the nose or mouth – that allow people to be recognized with a high success rate sustained over time.
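As a rough illustration of that idea (a deliberately simplified sketch, not the algorithm of any real facial recognition system; the landmark coordinates below are invented), such a pattern can be thought of as a short list of distances between facial landmarks:

```python
# Illustrative only: a facial "pattern" built from distances between
# landmark points, as described above. Coordinates are made-up examples.
import math

landmarks = {
    "left_eye": (30.0, 40.0),
    "right_eye": (70.0, 40.0),
    "nose": (50.0, 60.0),
    "mouth": (50.0, 80.0),
}

def distance(a, b):
    """Euclidean distance between two landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

# The "pattern": distances between the eyes, the eyes and the nose,
# and the nose and the mouth. Real systems use many more landmarks
# and normalize for pose, scale and lighting.
pattern = [
    distance(landmarks["left_eye"], landmarks["right_eye"]),
    distance(landmarks["left_eye"], landmarks["nose"]),
    distance(landmarks["right_eye"], landmarks["nose"]),
    distance(landmarks["nose"], landmarks["mouth"]),
]

print(pattern)  # e.g. [40.0, 28.28..., 28.28..., 20.0]
```

Two photographs of the same face, taken years apart, should produce very similar lists, which is why such patterns can identify people over time.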

The iris is, among the different types of biometric data, the one that most precisely identifies a person, according to David Arroyo, principal researcher in the Cybersecurity and Privacy Protection group at the CSIC, who warns that “whoever steals your iris, or rather the alphanumeric pattern with which this biometric trait is stored, can impersonate you in many places. Iris scanning is much more accurate than facial recognition. It is used less because the necessary sensor is more expensive and adjusting these systems is more complex.”

Queues to have their irises photographed and to register with Worldcoin at a small stand at the Avenida de América transport interchange (Madrid). The images are taken by subcontracted employees using the Orb, the silver sphere. The company’s only sign reads: “The global economy belongs to everyone.” Pablo Mongé

In addition to its value as a personal identifier, an iris scan can provide a great deal of other information, both physiological and behavioral. “Through your gaze and the way your pupil dilates, you can tell what someone likes, what scares them, what interests them, and even certain cognitive characteristics, such as whether they have Parkinson’s disease,” explains Carissa Véliz, professor of philosophy at the University of Oxford and author of the book Privacy Is Power.

Iris scanning is generally limited to high security environments, as an additional means of identification to access certain facilities. “This allows for very robust authentication, but it leads to a lot of privacy issues, because the iris is something that is directly and unequivocally linked to a specific person,” Arroyo explains.

Special treatment

The particularities of biometric data make its legal treatment stricter than that of other data. “European legislation considers it a special category of data. It can be captured either when Spanish legislation expressly allows it in certain cases, or when there is consent,” says Ricard Martínez, director of the Privacy and Digital Transformation Chair at the University of Valencia. “In principle, Spanish regulations say that when it comes to health and biometric data, you must be able to consent. But that doesn’t mean everything is possible. You could have the consent of the data subject and still pursue an illegal or disproportionate activity, or violate a fundamental right. It’s more complicated than it seems.”

Proportionate use of this data is essential. In 2021, the AEPD fined Mercadona 3.5 million euros (reduced to 2.5 million after the company accepted voluntary payment) for using cameras equipped with facial recognition systems in 48 of its stores. The company argued that it installed the technology to detect people subject to restraining orders barring them from its establishments. The agency considered that the objective pursued, identifying convicted persons, did not justify collecting the facial prints of every customer entering the chain’s supermarkets.

Returning to the case of Worldcoin: the orbs scan the iris and convert the image into an alphanumeric code. This pattern is what identifies the user. “The problem is not that Worldcoin collected this data from 400,000 people, but that they make all these databases and images available to other algorithms without saying exactly why,” explains Jorge García Herrero, a data protection lawyer specializing in this regulation.
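To give a sense of how such a code works (a deliberately simplified sketch: Worldcoin has not published the exact details of its pipeline here, and real iris codes, such as Daugman-style ones, contain thousands of bits), the identifier behaves like a bit string derived from the iris image, and two captures are matched by counting how many bits differ:

```python
# Simplified illustration of comparing two iris codes. The bit strings
# are invented examples; real codes are derived from the iris texture
# and are thousands of bits long.

def hamming_distance(code_a: str, code_b: str) -> float:
    """Fraction of positions where two equal-length bit strings differ."""
    assert len(code_a) == len(code_b)
    differing = sum(1 for a, b in zip(code_a, code_b) if a != b)
    return differing / len(code_a)

stored_code  = "1011001110001011"  # code enrolled by the orb (example)
new_capture  = "1011001010001011"  # a later scan of the same eye (example)
other_person = "1100101101011001"  # a different iris (example)

# Same eye: only a small fraction of the bits differ.
print(hamming_distance(stored_code, new_capture))   # 0.0625
# Different eyes: roughly half the bits differ.
print(hamming_distance(stored_code, other_person))  # 0.5
```

A threshold on that fraction decides whether two scans are treated as coming from the same eye, which is also why a leaked code is so hard to revoke: unlike a password, the underlying iris cannot be changed.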

An American soldier scans the iris of an Afghan man south of Kandahar. Chris Hondros (Getty Images)

The great danger of biometric data is that it can be used for illegitimate purposes. In China, for example, facial recognition systems are used to monitor and persecute the Uyghurs. It is suspected that when the Taliban regained control of Afghanistan in 2021, they turned to biometric identification technologies, such as iris scanning, to detect and suppress collaborators of the former regime. Biometrics is an unrivaled tool for anyone looking to repress, and, of course, biometric data can also be used to impersonate people.

What if I don’t care about privacy?

“I’m an ordinary citizen, Google already has all my data, I don’t think the eye contributes much,” said a young man who was about to have his iris scanned at the La Vaguada shopping center in Madrid. This is a recurring argument. Carissa Véliz, of the University of Oxford, considers it fallacious. “We tend to think that when something is personal, it’s individual, but when you share your personal data, in reality you also put others at risk, as we saw in the case of Cambridge Analytica,” she explains, in reference to the scandal involving the consulting firm, which accessed the personal information of 50 million Facebook users to create profiles of American voters and target them with personalized electoral advertisements.

“You may not care about your privacy, but I consider it not a right but an obligation, because you can endanger everyone around you,” says David Arroyo, of the CSIC. “This type of data is then used to characterize other people, and with it more sophisticated attacks, such as phishing or disinformation campaigns, are launched,” he emphasizes. Even if the right to rectification is later exercised and the biometric data collected is deleted, it will already have been used to train the tool, that is, to make it more effective.

What worries experts about the Worldcoin affair is that it helps to normalize a double-edged technology: iris reading. “If we let it establish itself as a legitimate form of verification, everyone will end up using it,” says Véliz. “I am very upset that the use of facial recognition to unlock phones has been normalized. I think it has made people see this technology as something natural. Hopefully the same thing won’t happen with iris reading.”
