Current developments on biometric recognition – Official Blog of UNIO

Maria Inês Costa (PhD Candidate at the School of Law of the University of Minho. FCT research scholarship holder – UI/BD/154522/2023)

In Portugal, more than 300,000 people have already “sold” their iris scan to the Worldcoin Foundation, which in return offers them cryptocurrency. In March 2024, the Portuguese data protection authority (hereinafter, the CNPD) decided to suspend the company’s collection of iris and facial biometric data for 90 days in order to protect the right to the protection of personal data, especially of minors, following in the footsteps of Spain, which also temporarily banned the company’s activities for privacy reasons.[1]

In a statement, the CNPD explains that the company has already been informed of this temporary suspension, which will last until the investigation is completed and a final decision is made on the matter. The adoption of this urgent provisional measure comes in the wake of “dozens of reports” received by the CNPD in the last month, which report the collection of data from minors without the authorisation of their parents or other legal representatives, as well as deficiencies in the information provided to data subjects and the impossibility of deleting data or revoking consent.[2] In the CNPD’s press release, one can read that “[g]iven the current circumstances, in which there is unlawful processing of the biometric data of minors, combined with potential infringements of other GDPR rules, the CNPD considered that the risk to citizens’ fundamental rights is high, justifying an urgent intervention to prevent serious or irreparable harm.”[3]

In its detailed suspension decision, one is provided with important information, namely the fact that, through complaints, it was revealed that some data subjects only became aware of the risks involved in the processing of their data due to media exposure of the matter, and that these risks had never been properly explained to them; moreover, they were allegedly not provided with information on the processing carried out, namely on the data actually collected and for what purposes, nor on how to exercise the rights provided for in the law on the protection of personal data; and, as reported by the media, a number of citizens authorise this data collection and subsequent processing because they are economically vulnerable and/or are not fully aware of the aims and implications of their participation in the Worldcoin project.[4]

And what is this project all about? Co-founded by OpenAI CEO Sam Altman, the Worldcoin Foundation outlines the goals of scanning people’s iris and facial biometrics in its white paper entitled “A New Identity and Financial Network”.[5] According to the document, “[i]f successful, Worldcoin could considerably increase economic opportunity, scale a reliable solution for distinguishing humans from AI online while preserving privacy, enable global democratic processes, and show a potential path to AI-funded UBI.[6] Worldcoin consists of a privacy-preserving digital identity network (World ID) built on proof of personhood and, where laws allow, a digital currency (WLD).”

For the company, in a world where AI is becoming more and more powerful, there is an urgent need for “proof of personhood”, and the most viable way to issue it is through custom biometric hardware – the Orb. The Orb captures high-quality iris images with more than an order of magnitude higher resolution compared to iris recognition standards, through which a ‘World ID’ is created – a “digital identity solution enabling users to prove their uniqueness and humanity anonymously […]”,[7] according to Worldcoin.

Although the objective of Worldcoin considers the actions carried out are preserving of privateness, latest complaints and bans supply a contrasting perspective. Certainly, Eileen Guo and Adi Renaldi from the MIT Know-how Overview revealed, in April 2022, a protracted article exposing lots of the firm’s weaknesses and challenges it presents. As an example, they interviewed Iyus Ruswandi, an area Indonesian who was tempted to “promote” his iris to Worldcoin Indonesia, in December 2021. The representatives would gather the scans, in return for “free money (usually native forex in addition to Worldcoin tokens) to Airpods to guarantees of future wealth […] What they weren’t offering was a lot info on their actual intentions.”[8] Within the writer’s interview with Ruswandi, he said that representatives even had to assist residents arrange emails and go browsing to the net, main him to mirror on why Worldcoin was concentrating on low-income communities within the first place, moderately than crypto fans or communities.[9]

From the accounts gathered so far, it is possible to see how this practice has affected vulnerable communities, from populations who do not have access to the most up-to-date digital literacy, to children, who cannot validly consent to this kind of practice. But there is also an inequality of information between those who have sold their irises and the company, simply because the latter has apparently not been fully transparent about its operations. In this context, it is relevant to refer to recital 20 of the EU AI Act, which determines that “[i]n order to obtain the greatest benefits from AI systems while protecting fundamental rights, health and safety and to enable democratic control, AI literacy should equip providers, deployers and affected persons with the necessary notions to make informed decisions regarding AI systems. Those notions may vary with regard to the relevant context and can include […], in the case of affected persons, the knowledge necessary to understand how decisions taken with the assistance of AI will have an impact on them […]”.[10] (Author’s bold). And although this reference to digital literacy in this recital involves different groups of people, it is true that Article 4 of the AI Act states that “[p]roviders and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf […]”, putting the emphasis on those who work closely with the technology.

The scenario examined in this text is particularly worrying, especially since we are dealing with biometric data, which “can allow the authentication, identification or categorisation of natural persons and the recognition of emotions of natural persons”.[11] As discussed in the European Parliament’s 2021 study “Biometric Recognition and Behavioural Detection”, to uniquely identify natural persons, “strong”[12] biometric identifiers have to be captured, converted into digital data and ultimately into a standardised template. These identifiers can be captured by appropriate physical scanners, with the active conscious cooperation of the individual, remotely without such cooperation, or with the help of other existing data. Thus, capturing biometric identifiers means converting a person’s unique physical characteristics into digital data, leading to the “datafication” of human beings.

Because the features that uniquely identify a person are part of a person’s body, their collection and use interfere with a person’s personal autonomy and dignity, the report stresses. Once a biometric template has been created and stored in a reference database, anyone in possession of that template can identify and locate that person anywhere in the world, putting that person at serious risk of being tracked and monitored. It can also be used to identify the individual for an unlimited number of purposes and situations.[13] In fact, while the risk of fraud and the difficulties posed by poor data quality or missing data are reduced by models that use “strong” biometrics, these “also increase ethical concerns, as they enable more efficient public surveillance and can be used for the creation of elaborate profiles”.[14]

According to Article 5(1)(h) of the AI Act, the use of “real-time” remote biometric identification systems[15] in publicly accessible spaces for the purposes of law enforcement is prohibited, unless and to the extent that such use is strictly necessary for purposes defined in the Regulation.[16] As for the use of “post” biometric identification systems (in deferred time),[17] this is considered a high-risk practice rather than prohibited like the one described above, although the result is the same – massive identification of subjects without their consent or knowledge, something which is intrusive in nature. Much criticism has been directed at the fact that the latter is not completely prohibited, and that the former includes exceptions to its prohibition.[18] This highlights how the practice of biometric identification carries a very high risk of threatening fundamental rights and safeguards,[19] and ultimately democracy itself.

Now, when we consider the operations of companies whose main activity is biometric identification, and the highly questionable purposes for which all this sensitive data may be used, we step onto very dangerous territory. In this regard, it is relevant to consider Alfonso Ballesteros’ insights in his article “Digitocracy: Ruling and Being Ruled”: “[d]igitocracy[20] seems to be a new form of government […] a new way to rule an unprecedented number of people smartly and efficiently. […] Rulers are no more modern technocratic humanists than mere rational entrepreneurs seeking to make money. They are postmodern entrepreneurs [who] have been able to hybridise their economic interests with new postmodern ideas; namely, those that blur the distinctions between artefacts and humans, and a declared pretension to be acting for the good of humanity.”[21]

Nowadays, this is a matter for the most careful consideration, since without strong enough safeguards we will be increasingly subject to vested interests making use of our most sensitive information, and that could lead us down a path of no return. Thus, in the face of an offer of “proof of personhood”, we should be concerned as to whether this is weakening our very own autonomy and dignity – in essence, our humanness – or whether, on the contrary, it will enhance and protect our life in coexistence with technology.

[1] See Elizabeth Howcroft, “Portugal orders Sam Altman’s Worldcoin to halt data collection”, Reuters, 26 March 2024. See also Expresso, “Worldcoin: Comissão de Proteção de Dados suspende recolha de dados da íris”, 26 March 2024.

[2] CNPD, “CNPD suspende recolha de dados biométricos”, 26 March 2024.

[3] The full text of the press release is available at:

[4] CNPD, “DELIBERAÇÃO/2024/137”, AVG/2023/1205, 4.

[5] Worldcoin Foundation, “A New Identity and Financial Network”, Worldcoin Whitepaper.

[6] UBI stands for universal basic income.

[7] Worldcoin Foundation, “World ID – The protocol to bring privacy-preserving global proof of personhood to the internet”.

[8] Eileen Guo and Adi Renaldi, “Humans and technology – Deception, exploited workers, and cash handouts: how Worldcoin recruited its first half a million test users”, MIT Technology Review, 6 April 2022.

[9] Eileen Guo and Adi Renaldi, “Humans and technology – Deception, exploited workers, and cash handouts: how Worldcoin recruited its first half a million test users”.

[10] European Parliament legislative resolution of 13 March 2024 on the proposal for a regulation of the European Parliament and of the Council laying down harmonised rules on Artificial Intelligence (Artificial Intelligence Act) and amending certain Union legislative acts, P9_TA(2024)0138 (COM(2021)0206 – C9-0146/2021 – 2021/0106(COD)). (Hereinafter, EU AI Act or AI Act).

[11] EU AI Act, Recital 14.

[12] These are, according to the study, fingerprint, iris, or retina.

[13] European Parliament, “Biometric Recognition and Behavioural Detection – Assessing the ethical aspects of biometric recognition and behavioural detection techniques with a focus on their current and future use in public spaces”, Study requested by the JURI and PETI committees, Policy Department for Citizens’ Rights and Constitutional Affairs, Directorate-General for Internal Policies, PE 696.968, August 2021, 44.

[14] European Parliament, “Biometric Recognition and Behavioural Detection – Assessing the ethical aspects of biometric recognition and behavioural detection techniques with a focus on their current and future use in public spaces”, 14.

[15] EU AI Act, Recital 17: “[…] ‘Real-time’ systems involve the use of ‘live’ or ‘near-live’ material, such as video footage, generated by a camera or other device with similar functionality. […]”

[16] EU AI Act, Recital 33: “Those situations involve the search for certain victims of crime including missing persons; certain threats to the life or to the physical safety of natural persons or of a terrorist attack; and the localisation or identification of perpetrators or suspects of the criminal offences listed in an annex to this Regulation, where those criminal offences are punishable in the Member State concerned by a custodial sentence or a detention order for a maximum period of at least four years, in accordance with the law of that Member State. Such a threshold for the custodial sentence or detention order in accordance with national law contributes to ensuring that the offence should be serious enough to potentially justify the use of ‘real-time’ remote biometric identification systems.”

[17] EU AI Act, Recital 17: “[…] In the case of ‘post’ systems, in contrast, the biometric data have already been captured and the comparison and identification occur only after a significant delay. This involves material, such as pictures or video footage generated by closed circuit television cameras or private devices, which has been generated before the use of the system in respect of the natural persons concerned.”

[18] See Patrick Breyer, Sergey Lagodinsky and Kim van Sparrentak, “Protecting privacy: biometric mass surveillance and the AI Act”, The Greens/EFA in the European Parliament, 6 March 2024.

[19] For instance, “[…] AI systems identifying or inferring emotions or intentions of natural persons on the basis of their biometric data may lead to discriminatory outcomes and can be intrusive to the rights and freedoms of the concerned persons. Considering the imbalance of power in the context of work or education, combined with the intrusive nature of these systems, such systems could lead to detrimental or unfavourable treatment of certain natural persons or whole groups thereof.” – Recital 44, EU AI Act.

[20] “Digitalisation as a form of government”.

[21] Alfonso Ballesteros, “Digitocracy: Ruling and Being Ruled”, Philosophies 5, 9 (2020): 11.

Picture credit: by Wojtek Pacześ on
