Article 13(2)(c) explicitly mandates that controllers inform subjects of the existence of the right to withdraw consent at any time. Yet the CNPD found that for a significant period, Worldcoin provided "insufficient information on certain options, preventing them from deleting their data." Portuguese regulators discovered that the mechanism for data deletion was, in practice, inaccessible to users.
Verified Against Public and Audited Records
Long-Form Investigative Review
Reading time: ~35 min
File ID: EHGN-REVIEW-32788
Legality of biometric iris data collection under GDPR in Spain and Portugal
This resource intensity demonstrates the high cost of retrofitting privacy into a system designed for centralized collection. Yet the expenditure serves a clear purpose: eliminating the centralized "honey pot" that regulators identified as the system's gravest risk.
Primary Risk: Legal / Regulatory Exposure
Jurisdiction: European Union
Public Monitoring: The AEPD monitors this specifically to prevent the "coercive" nature of the transaction from undermining the validity of consent.
Report Summary
Under the GDPR, pseudonymous data remains personal data because it can be linked back to a natural person if the specific technical keys or side-channel data (such as the phone number or email used for the wallet) are combined with it. The BayLDA's order for World to implement a compliant deletion method confirms that the regulator still views the SMPC shares as "data" that belongs to the user, not system metadata. The organization transitioned from a centralized database of iris hashes, a structure that regulators viewed as a "honey pot" of sensitive biometric data, to a Secure Multi-Party Computation (SMPC) system.
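The regulator's reasoning can be illustrated with a minimal XOR-based secret-sharing sketch. This is a simplification, not Worldcoin's actual protocol (the TACEO-built system operates on secret-shared iris codes with far more machinery): each share in isolation looks like random noise, yet whoever holds all shares can reconstruct the original code, which is why the shares remain "personal data" under the GDPR.

```python
import secrets

def split(code: bytes, n: int = 3) -> list[bytes]:
    """Split `code` into n XOR shares; any n-1 shares alone look like noise."""
    shares = [bytearray(secrets.token_bytes(len(code))) for _ in range(n - 1)]
    last = bytearray(code)
    for share in shares:
        for i, b in enumerate(share):
            last[i] ^= b
    return [bytes(s) for s in shares] + [bytes(last)]

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the original code."""
    out = bytearray(len(shares[0]))
    for share in shares:
        for i, b in enumerate(share):
            out[i] ^= b
    return bytes(out)

iris_code = b"\x1f\x8a\x42\x07" * 4   # stand-in for a real iris code
parts = split(iris_code)              # distributed to separate parties
assert combine(parts) == iris_code    # together, the code is recoverable
```

The reconstruction property is the whole point of the sketch: secret sharing hides the code from any single party, but it does not sever the link between the shares and the data subject.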
Key Data Points
The architectural history of Worldcoin bifurcates sharply in March 2024. In a direct tactical response to these existential regulatory threats, Worldcoin executed a rapid infrastructure pivot known as "Personal Custody." On March 22, 2024, sixteen days after the Spanish ban, Worldcoin announced it would no longer store iris codes by default. Announced as open-source on May 15, 2024, and developed in collaboration with TACEO, this system allows the network to verify uniqueness without reconstructing the original iris code. The new SMPC system requires approximately 1,152 cores and 3.6 terabytes of memory to operate, consuming significantly more compute than the centralized architecture it replaced.
Investigative Review of Worldcoin
Why it matters:
The Spanish Data Protection Agency (AEPD) invoked Article 66 of the GDPR to halt biometric data collection by Tools for Humanity Corporation in Spain.
The AEPD's emergency intervention highlighted violations related to lack of user information, minors' involvement, consent withdrawal, and high-risk data processing.
AEPD's Emergency Intervention: The Article 66 GDPR Precedent in Spain
The March 6 Directive: A Regulatory Guillotine
On March 6, 2024, the Spanish Data Protection Agency (AEPD) executed a rare and decisive legal maneuver against Tools for Humanity Corporation, the German-registered entity behind Worldcoin. The regulator invoked Article 66 of the General Data Protection Regulation (GDPR), a provision reserved for “exceptional circumstances” requiring immediate intervention to protect the rights and freedoms of individuals. This order mandated the immediate cessation of all biometric data collection within Spanish territory and blocked the processing of data already harvested from approximately 400,000 Spanish residents. The AEPD’s intervention represents a significant rupture in the standard enforcement architecture of the European Union. Normally, the “One-Stop-Shop” mechanism dictates that a company operating across multiple EU member states answers primarily to the Data Protection Authority (DPA) where its main establishment is located. For Tools for Humanity, this is the Bavarian State Office for Data Protection Supervision (BayLDA) in Germany. By bypassing the BayLDA and acting unilaterally, the Spanish regulator signaled that the risks posed by Worldcoin’s iris-scanning “Orbs” were too severe to await the conclusion of the German investigation, which had been ongoing since November 2022 without a final resolution. AEPD Director Mar España justified the emergency measure by citing a “state of alarm” generated in Spanish society. The agency received multiple complaints detailing specific infringements that required urgent containment. These were not abstract technical violations but direct threats to user autonomy and minor safety. The regulator’s order froze Worldcoin’s operations for three months, a period allowed under Article 66 to prevent “irreparable damage” while a permanent solution was coordinated at the European level.
The Catalogue of Infringements
The AEPD’s dossier against Worldcoin listed four primary violations that necessitated the Article 66 invocation. First, the agency identified a widespread failure to provide sufficient information to users regarding the processing of their data. The complex nature of biometric hashing and blockchain integration was not adequately explained to the average consumer, many of whom were enticed solely by the prospect of receiving WLD tokens, then valued at approximately €70-€80. Second, and most damaging to Worldcoin’s defense, were the reports of minors interacting with the Orbs. Spanish law requires parental consent for data processing of individuals under 14 years of age. The AEPD found that the age verification mechanisms on the Orbs were nonexistent or easily circumvented, allowing teenagers to sell their biometric data for cryptocurrency. This violation of Article 8 of the GDPR moved the problem from a regulatory dispute to a child safety matter. Third, the agency noted the impossibility of withdrawing consent. Under GDPR, a user must be able to withdraw consent as easily as they gave it. Worldcoin’s system, which claimed to delete the raw iris image but retain a non-reversible “IrisCode” hash, created a technical barrier to true deletion. If the company could not link a user to their hash to delete it (as they claimed anonymity), then the user lost control over their biometric identifier the moment it was generated. Fourth, the AEPD classified the processing as “high risk” under Article 9, which governs special categories of data. Biometric data used for unique identification carries the highest level of protection. The agency determined that the setting of collection, occurring in shopping malls and high-traffic public areas, combined with the sensitive nature of the data, created a threat profile that outweighed the company’s commercial interests.
Judicial Validation: The Audiencia Nacional Ruling
Tools for Humanity immediately appealed the AEPD’s order to the Audiencia Nacional (National Court), seeking an interim injunction to stay the ban. The company argued that the suspension would cause “irreparable harm” to its business, citing economic losses and reputational damage. They also contended that the AEPD had overstepped its jurisdiction by ignoring the competence of the Bavarian lead authority. On March 11, 2024, the Audiencia Nacional delivered a crushing verdict against Worldcoin, upholding the AEPD’s suspension. The court’s reasoning established a clear hierarchy of rights in the digital age. The judges applied a “weighing of interests” test, contrasting the economic damage claimed by Worldcoin against the fundamental right to data protection claimed by the regulator. The court stated explicitly that the “safeguarding of the general interest,” defined as the protection of personal data, must prevail over the “particular economic interest” of the company. The ruling dismissed Worldcoin’s claim of irreparable harm, noting that economic losses are compensable through damages if the company were later found to be in the right. In contrast, the compromise of biometric data for hundreds of thousands of citizens constitutes a harm that is difficult, if not impossible, to reverse. Once biometric data is leaked or misused, it cannot be changed like a password.
The “One-Stop-Shop” Bypass
The legal significance of this case extends beyond Worldcoin. It exposes the friction within the GDPR’s enforcement architecture. The One-Stop-Shop mechanism was designed to simplify regulation, allowing companies to deal with a single authority. Yet, critics have long argued that this system creates bottlenecks, particularly when the lead authority (in this case, Bavaria) is seen as slow to act. By invoking Article 66, Spain demonstrated that national authorities retain the power to act locally when they perceive an imminent threat. This creates a fragmented regulatory environment where a company might be compliant in its home jurisdiction yet banned in another. The AEPD’s action forces the European Data Protection Board (EDPB) to accelerate the harmonization of standards regarding biometric proof-of-personhood projects. The table below summarizes the conflicting positions between Worldcoin and the Spanish authorities during the Article 66 hearing:
Core Problem
Worldcoin (Tools for Humanity) Position
AEPD / Audiencia Nacional Position
Jurisdiction
Only the Bavarian DPA (BayLDA) has the authority to regulate us under the One-Stop-Shop mechanism.
Article 66 allows for immediate provisional measures in exceptional circumstances to protect local citizens.
Minors
We have terms of service prohibiting minors and are implementing age verification.
Evidence shows minors are actively scanning irises; current controls are insufficient and violate Article 8.
Irreparable Harm
A ban destroys our market position and causes unrecoverable financial and reputational damage.
Economic loss is recoverable. The loss of control over biometric data is permanent and constitutes a higher risk.
Data Nature
We do not store images; we store a non-reversible hash (IrisCode) which protects privacy.
The hash is derived from sensitive biometric data (Article 9) and the risk of re-identification or misuse remains high.
The Aftermath of the Injunction
Following the court’s decision, Worldcoin was forced to disable its Orbs across Spain. The app remained downloadable, but the verification function—the core of its “Proof of Personhood”—was deactivated. This halted the company’s growth in one of its most active European markets. The AEPD’s aggressive stance also triggered a domino effect. The Portuguese Data Protection Commission (CNPD) followed suit shortly after, issuing a similar temporary ban citing the same concerns regarding minors and consent. This coordinated pushback suggests that the Spanish precedent served as a catalyst for other regulators who had been observing from the sidelines. Worldcoin’s response was to engage in a public relations and legal counter-offensive, accusing the AEPD of spreading “inaccurate and misleading” information. Jannick Preiwisch, Worldcoin’s Data Protection Officer, stated that the company was fully compliant with GDPR and had been in constant communication with the BayLDA. Yet the Spanish court’s refusal to lift the ban indicates that compliance on paper does not equate to safety in practice, especially when the technology involves the mass collection of immutable biological characteristics. The Article 66 procedure has a maximum duration of three months. Yet, as that deadline approached in June 2024, Worldcoin voluntarily agreed to extend the pause on its operations in Spain until the end of the year or until the BayLDA reached a final resolution. This concession solidified the AEPD’s victory. The regulator had successfully stopped the data collection without waiting for the slow gears of the European bureaucracy to turn. This episode establishes a clear warning for biometric data companies: the “move fast and break things” philosophy is incompatible with European privacy law. When the asset being collected is the human iris, regulators have shown they will use the most extreme legal tools available to halt operations and ask questions later. The burden of proof has shifted.
It is no longer up to the regulator to prove harm; it is up to the company to prove safety before a single scan takes place.
CNPD's Biometric Blockade: Evidence of Underage Data Collection in Portugal
The CNPD’s Biometric Blockade: A Regulatory Firewall
On March 26, 2024, the Portuguese National Data Protection Commission (CNPD) issued a decisive order that halted the operations of Tools for Humanity within its borders. This directive, identified formally as Deliberation/2024/137, mandated an immediate suspension of all biometric data collection by the Worldcoin Foundation for a period of 90 days. The regulator’s intervention was not a routine administrative check; it was an emergency measure invoked under Article 66 of the General Data Protection Regulation (GDPR), a provision reserved for situations requiring urgent action to protect the rights and freedoms of data subjects. The scale of Worldcoin’s penetration into the Portuguese market was substantial. By the time the CNPD intervened, the project had already scanned the irises of approximately 300,000 individuals. In a nation of roughly 10 million people, this figure represented a significant slice of the population, achieved in a remarkably short window. The rapid expansion was fueled by the promise of WLD tokens, a financial incentive that proved irresistible to many, including those who were legally unable to consent to such data processing. The CNPD’s order required Tools for Humanity to stop the collection of iris, eye, and face imagery within 24 hours, shutting down the network of “Orbs” stationed in shopping centers and transport hubs across Lisbon and other major cities. Paula Meira Lourenço, the President of the CNPD, described the suspension as an “indispensable and justified measure.” Her statement highlighted the severity of the situation, noting that the risk to citizens’ fundamental rights was high enough to warrant immediate cessation of activities before the conclusion of a full investigation. This move mirrored the actions taken by the Spanish AEPD just weeks prior, solidifying a unified front on the Iberian Peninsula against the uncontrolled harvesting of biometric identifiers.
The Portuguese regulator’s decision was not only about the volume of data but about the specific, irreversible nature of the biometric templates being generated and stored.
The “Minor” Problem: Exploiting the Vulnerable
The catalyst for this regulatory crackdown was a surge of complaints specifically regarding minors. In the month leading up to the ban, the CNPD received dozens of reports from parents and legal guardians who discovered their children had been scanned by Worldcoin Orbs without their knowledge or permission. These reports painted a disturbing picture of the project’s field operations. Children, motivated by the prospect of free cryptocurrency, were queuing at Orb locations, frequently without any adult supervision. The financial lure of the WLD token acted as a magnet for adolescents, who traded their biometric sovereignty for a digital asset they likely did not fully understand. Under GDPR, the processing of personal data of children requires strict adherence to consent rules, demanding authorization from a holder of parental responsibility for those under the age of 13 (or 16, depending on member state implementation). The CNPD’s investigation revealed that Worldcoin had no system to verify the age of the individuals standing before its scanners. The “Orb” was designed to verify “humanness”, to ensure the subject was a living human being and not a bot, but it possessed no capability to distinguish between a 12-year-old and a 25-year-old. This technical blind spot meant that the project was systematically collecting special category data from minors in direct violation of European data protection laws. The complaints filed with the CNPD also detailed the frustration of parents attempting to rectify the situation. Many reported that once the data was collected, there was no clear or functional mechanism to request its deletion. The “right to erasure,” a core tenet of the GDPR, appeared to be functionally absent or inaccessible in practice. Parents found themselves unable to revoke consent on behalf of their children or to confirm whether the biometric templates generated from their children’s irises had been permanently destroyed.
This inability to exercise control over one’s data, or the data of one’s charges, constituted a serious breach of the regulation’s transparency and accountability principles.
Technical Negligence in the Orb Design
The failure in Portugal exposed a fundamental flaw in the design and deployment of the Worldcoin Orb. The device relies on advanced multispectral sensors to capture high-resolution images of the iris, which are then converted into a unique “IrisCode.” While the technology is sophisticated in its ability to detect liveness and prevent duplicate registrations, it completely ignored the legal requirement of age verification. In the physical world, a bank or a nightclub checks a government-issued ID to verify age. The Orb, operating in a “permissionless” crypto context, bypassed this step entirely to reduce friction and maximize user onboarding speed. This omission was not a minor oversight; it was a structural decision that prioritized growth over compliance. By failing to integrate any form of age-gating, such as requiring a citizen card or driving license before the scan, Tools for Humanity created a system that was inherently unsafe for minors. The CNPD’s findings showed that the company’s reliance on self-attestation (users simply checking a box saying they are over 18) was wholly insufficient for the processing of high-risk biometric data. In a physical environment where operators are paid based on the number of sign-ups, the incentive structure further discouraged any rigorous checking of user eligibility. The absence of age verification hardware or software on the Orb meant that the device was essentially an indiscriminate data vacuum. It accepted any human eye presented to it. This technical reality contradicted the company’s public assurances of strict privacy standards. While Worldcoin claimed to preserve privacy by not collecting names or addresses, the collection of the iris itself, a permanent, immutable biological identifier, from children without parental consent represented a far greater privacy intrusion than the collection of a name.
The CNPD’s investigation highlighted that the “anonymity” claimed by the company was meaningless if the biometric template itself could be linked to a minor who had no legal capacity to agree to the scan.
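The linkability point can be made concrete with a toy template-matching sketch. This is illustrative only: real iris pipelines use Gabor-filter banks and much longer codes, but the property at issue is the same. Because the same eye reliably produces a matchable template, a retained template functions as a stable identifier even when no name is attached to it.

```python
import random

random.seed(42)

def iris_template(sample: list[float]) -> list[int]:
    """Toy stand-in for IrisCode extraction: threshold each measurement
    against the sample mean to get a fixed-length bit string."""
    mean = sum(sample) / len(sample)
    return [1 if x > mean else 0 for x in sample]

def hamming(a: list[int], b: list[int]) -> float:
    """Fraction of differing bits between two templates."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

eye = [random.gauss(0, 1) for _ in range(2048)]         # enrollment scan
recapture = [x + random.gauss(0, 0.05) for x in eye]    # same eye, later
stranger = [random.gauss(0, 1) for _ in range(2048)]    # different person

# The same eye matches across captures; a stranger's does not --
# which is why a stored template works as a permanent identifier.
assert hamming(iris_template(eye), iris_template(recapture)) < 0.1
assert hamming(iris_template(eye), iris_template(stranger)) > 0.3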
Worldcoin’s Response and the Personal Custody Pivot
In the wake of the CNPD’s order, Tools for Humanity attempted to manage the fallout. Jannick Preiwisch, the Data Protection Officer for the Worldcoin Foundation, issued a statement claiming the organization had “zero tolerance” for underage sign-ups. He asserted that the company was working to address the problem, even suggesting that the reports to the CNPD were the first the company had heard of the specific complaints. This claim of ignorance stood in sharp contrast to the regulator’s citation of dozens of formal complaints and the visible reality of queues filled with young people in Portuguese shopping centers. Simultaneously, the project announced a shift toward “Personal Custody,” a new data management model in which the biometric data (images and codes) would be stored on the user’s personal device rather than in a centralized cloud database. This move was framed as a pro-privacy evolution, giving users complete control over their information. Yet the timing suggested it was a direct reaction to the regulatory pressure mounting in Spain and Portugal. The Personal Custody model aimed to mitigate the risks associated with central storage, but it did not retroactively solve the problem of the thousands of minors whose data had already been ingested by the system, nor did it address the fundamental inability of the Orb to verify age at the point of collection. The 90-day suspension in Portugal served as a severe stress test for the project’s viability in the European Union. It forced the company to confront the reality that “proof of personhood” cannot exist in a legal vacuum. The CNPD’s blockade showed that local regulators possess the power to halt global crypto projects when they threaten the rights of vulnerable populations. The coordination between the Portuguese and Spanish authorities created a formidable barrier, signaling to other EU member states that the “move fast and break things” ethos of Silicon Valley would not be tolerated when it came to the biometric data of children.
The Portuguese intervention was not just a pause; it was a demand for a fundamental re-engineering of how the project interacts with human identity.
The 'One-Stop-Shop' Friction: BayLDA's Authority vs. Iberian Regulators
The General Data Protection Regulation (GDPR) promised a unified regulatory environment for companies operating across the European Union. This promise relied on the “One Stop Shop” (OSS) mechanism. Under this system, a company establishes a main base in one EU nation. The data protection authority of that nation becomes the Lead Supervisory Authority (LSA). This LSA handles all cross-border investigations. For Worldcoin, the strategy was clear. Tools for Humanity GmbH established its headquarters in Erlangen, Bavaria. This legal maneuver designated the Bavarian State Office for Data Protection Supervision (BayLDA) as its primary regulator. The company likely anticipated a methodical, perhaps slower, regulatory process typical of German bureaucracy. BayLDA began its investigation in November 2022. The regulator sought to analyze the “World ID” system, the Orb hardware, and the processing of biometric iris data. Michael Will, the President of BayLDA, later described the inquiry as “technologically demanding and legally highly complex.” The Bavarian authority prioritized a thorough technical audit over immediate enforcement. They examined the “Proof of Personhood” concept and the separation of data between the Worldcoin Foundation (Cayman Islands) and its German subsidiary. For nearly eighteen months, BayLDA worked quietly. They exchanged correspondence with the company. They requested technical documents. To the outside world, the investigation appeared dormant. This silence did not sit well with regulators in Southern Europe. By early 2024, the “Orb” had gone viral in Spain and Portugal. Long queues formed at shopping centers in Madrid, Barcelona, and Lisbon. Thousands of citizens, many of them teenagers, scanned their irises in exchange for WLD tokens. The Spanish Data Protection Agency (AEPD) received a flood of complaints. Citizens reported they could not withdraw consent. Parents reported that minors were registering without authorization. The AEPD faced a dilemma.
Under the OSS rules, they were supposed to forward these complaints to Bavaria and wait for BayLDA to act. The AEPD chose not to wait. In March 2024, the Spanish regulator triggered Article 66 of the GDPR. This article serves as an emergency brake. It allows a local authority to take “provisional measures” for a maximum of three months if there is an “urgent need to act” to protect the rights and freedoms of data subjects. The AEPD ordered an immediate cessation of Worldcoin’s activities in Spain. They demanded the company stop scanning new users and block the use of data already collected. This was a direct challenge to the supremacy of the BayLDA. Spain declared that the German timeline was too slow to prevent immediate harm to its citizens. Portugal’s National Data Protection Commission (CNPD) followed suit weeks later. The CNPD identified similar risks. Their investigation showed that over 300,000 Portuguese citizens had already signed up. The regulator found no age verification mechanism at the Orbs. Minors were selling their biometric data for cryptocurrency. The CNPD also used its emergency powers to impose a temporary ban. The Iberian regulators created a fractured enforcement map. Worldcoin was legal in Germany, where few Orbs existed, but illegal in its most active markets. The friction between the regulators exposed a structural flaw in the OSS design. The mechanism assumes that the Lead Supervisory Authority shares the same sense of urgency as the local authorities where the data collection actually happens. BayLDA was conducting a forensic audit of the code. AEPD and CNPD were dealing with angry parents and confused citizens. The German regulator viewed the case as a technical puzzle to be solved. The Spanish and Portuguese regulators viewed it as a public safety hazard to be contained. Worldcoin attempted to navigate this split by appealing the Spanish ban. The Spanish High Court upheld the AEPD’s decision.
The court ruled that the “right to data protection” prevailed over the company’s “economic interest.” Realizing that a legal battle in Spain would be costly and public, Tools for Humanity eventually agreed to “voluntarily” extend the pause in Spain until the BayLDA finished its investigation. This move was a strategic retreat. It prevented a permanent ruling in Spain that could have set a negative precedent for the entire continent. The Bavarian investigation concluded in December 2024. The outcome vindicated the concerns raised by Spain and Portugal. BayLDA issued a reprimand and ordered the deletion of iris codes collected without a sufficient legal basis between July 2023 and December 2024. The German regulator found that the company had violated Article 32 of the GDPR regarding security measures. They also mandated that future processing would require explicit, informed consent. Michael Will stated that the decision enforced “European fundamental rights standards.” Yet the timing of the decision meant that for nearly a year, the enforcement had come from Madrid and Lisbon, not Bavaria.
Regulatory Body
Action Taken
Legal Mechanism
Key Justification
BayLDA (Germany)
Investigation (Nov 2022–Dec 2024); Deletion Order
Article 60 (Cooperation), Article 58 (Powers)
Technical complexity; absence of legal basis for storage; Security flaws (Art 32).
AEPD (Spain)
Immediate Ban (March 2024)
Article 66 (Urgency Procedure)
Processing of minors’ data; Inability to withdraw consent; High risk to rights.
CNPD (Portugal)
Temporary Limitation (March 2024)
Article 66 (Urgency Procedure)
Protection of minors; absence of age verification; Excessive data collection.
The “Erlangen Strategy” failed to provide Worldcoin with a shield against local enforcement. The company likely assumed that by anchoring itself in Germany, it could avoid the volatility of multiple regulators. Instead, the aggressive data collection tactics used by its Orb operators triggered a defensive reaction that bypassed the German shield entirely. The Article 66 precedents set by Spain and Portugal proved that the One Stop Shop is not absolute. When the risk to citizens is palpable and immediate, local authorities retain the power to shut down operations, regardless of where the headquarters are located. This episode also highlighted the resource and temperament gap between regulators. BayLDA is a state-level authority in Germany. While competent, it lacked the aggressive posture of the national-level AEPD in Spain. The AEPD has a history of levying massive fines against tech giants. By forcing the issue, Spain pushed the entire European regulatory apparatus to move faster. The final BayLDA order in December 2024 was likely accelerated by the pressure from its southern counterparts. If Spain had not acted, Worldcoin might have continued collecting data from minors for another year while the technical audit in Bavaria dragged on. The friction between BayLDA and the Iberian regulators serves as a case study for future biometric projects. It demonstrates that “compliance” is not just about filing paperwork in a friendly jurisdiction. It requires actual operational safety on the ground. Worldcoin’s failure to police its own Orb operators—allowing them to scan children and ignore consent revocation—made the legal protections of the OSS irrelevant. The system works for administrative disputes. It breaks down when faced with a physical, viral data collection campaign that threatens populations in real time. By 2026, the legacy of this conflict is clear. The OSS remains the standard, but the “emergency brake” of Article 66 is no longer a theoretical option. It is a proven weapon.
Companies can no longer hide behind a Lead Supervisory Authority while ignoring local laws. The BayLDA ruling, while late, confirmed that the core of Worldcoin’s initial operations in Europe was illegal. The data collected from millions of Europeans had to be purged. The “Proof of Personhood” experiment collided with the reality of European privacy law, and the law won.
Coerced Consent? The Legality of Exchanging Iris Scans for WLD Tokens
The central mechanism of Worldcoin’s expansion—the exchange of high-resolution iris scans for WLD tokens—presents a fundamental conflict with the General Data Protection Regulation (GDPR). While the company frames this transaction as “proof of personhood” rewarded by a share in the network, European regulators view it through the lens of Article 4(11) of the GDPR, which demands that consent be “freely given.” The core legal question facing the Agencia Española de Protección de Datos (AEPD) and the Comissão Nacional de Proteção de Dados (CNPD) is whether a decision driven by financial inducement, particularly among economically vulnerable demographics, can ever be truly voluntary.
The Economics of “Freely Given” Consent
Under GDPR Article 4(11), consent is valid only if it is free, specific, informed, and unambiguous. The European Data Protection Board (EDPB) Guidelines 05/2020 clarify that consent is not free if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment. In the context of Worldcoin, the “detriment” is the forfeiture of WLD tokens, which held significant market value during the height of the project’s Iberian expansion. In early 2024, the WLD token surged to over $11 USD. For a teenager in a Madrid suburb or a retiree in Lisbon, the prospect of receiving approximately €70 worth of cryptocurrency for a thirty-second eye scan created a powerful economic pressure. This financial lure distorts the voluntariness of the decision. When a data controller attaches a monetary value to the surrender of biometric data, the transaction shifts from a privacy choice to an economic calculation. For low-income individuals, this calculation is heavily weighted toward immediate financial gain, stripping them of the “real choice” mandated by European law. The AEPD’s investigation highlighted that this monetization of special category data (Article 9) creates a coercive environment. By bundling the “verification” service with a speculative financial asset, Worldcoin constructed a system where refusal to scan meant a direct financial loss (opportunity cost). The EDPB has long warned against “bundling” consent with services, but Worldcoin’s model goes further by bundling consent with *payment*. The regulator’s precautionary measure in March 2024 implicitly recognized that when the price of privacy is set against the price of groceries, the consent obtained is legally hollow.
The Irrevocability Paradox
A second, perhaps more fatal, flaw in Worldcoin’s consent model is the practical impossibility of withdrawal. Article 7(3) of the GDPR states that “the data subject shall have the right to withdraw his or her consent at any time” and that “it shall be as easy to withdraw as to give consent.” Worldcoin’s technical architecture, designed to prevent “Sybil attacks” (one person creating multiple identities), relies on the permanence of the data. To ensure a user is unique, the system must retain a representation of the iris, the “IrisCode”, forever. If a user could fully delete their data and then re-register, the system would fail to prevent duplicate accounts. This technical requirement creates a legal deadlock. During the Spanish and Portuguese bans, regulators noted that users who wished to exercise their right to erasure were unable to do so. While Worldcoin argued they could delete the *image*, the *hash* (the numeric code derived from the iris) remained to block re-registration. The AEPD and CNPD rejected the argument that the IrisCode is not personal data. Since the code can single out an individual from a database of millions, it constitutes biometric data under Article 4(14). Consequently, the inability to fully withdraw this data violates Article 7(3). If a user cannot withdraw consent because the system demands the data be kept to “block” them from future entry, the initial consent was never valid. The user was locked into a contract of adhesion disguised as a privacy permission.
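The deadlock reduces to a few lines of logic. The sketch below is hypothetical (the production system matches fuzzy templates via SMPC rather than exact set membership), but it captures why erasure and Sybil resistance are mutually exclusive in this design:

```python
class UniquenessRegistry:
    """Toy proof-of-personhood registry: one enrollment per iris code."""

    def __init__(self) -> None:
        # Codes must be retained indefinitely to block duplicates.
        self._codes: set[bytes] = set()

    def enroll(self, iris_code: bytes) -> bool:
        if iris_code in self._codes:
            return False              # duplicate (Sybil) attempt rejected
        self._codes.add(iris_code)
        return True

    def erase(self, iris_code: bytes) -> None:
        """Honor an Article 17 erasure request."""
        self._codes.discard(iris_code)

reg = UniquenessRegistry()
alice = b"alice-iris-code"
assert reg.enroll(alice) is True    # first enrollment accepted
assert reg.enroll(alice) is False   # duplicate blocked while code is stored
reg.erase(alice)
assert reg.enroll(alice) is True    # after erasure, the same person can
                                    # re-enroll and claim tokens again
```

Either the code is kept (and consent cannot be meaningfully withdrawn) or it is erased (and the uniqueness guarantee collapses), which is exactly the paradox the regulators identified.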
Predatory Collection Tactics and the “Orb” Commission
The validity of consent is further eroded by the operational structure of the “Orb” operators. In both Spain and Portugal, the collection was frequently outsourced to third-party contractors paid on a commission basis, earning a fee for every set of eyes scanned. This incentive structure encourages speed over comprehension. Reports from shopping centers in Barcelona and Lisbon described Orb operators rushing users through the sign-up process, frequently skipping the required explanations of data processing. The “informed” requirement of Article 4(11) mandates that users understand the risks, the identity of the controller, and the destination of their data. When the person collecting the data is financially motivated to maximize volume, the detailed explanation of complex cryptographic concepts and biometric risks becomes an obstacle to revenue. The CNPD specifically cited the “large crowds” and the chaotic environment of collection points as evidence that informed consent was impossible. In such settings, with long queues and the promise of “free crypto,” users are not reading privacy policies on their smartphones. They are clicking “agree” to get the token. The Portuguese regulator noted that this environment, combined with the absence of age verification, meant that minors were being bribed to surrender biometric data they did not understand, bypassing the parental authorization required by Article 8 GDPR.
The BayLDA’s Retroactive Erasure Order
The culmination of these consent failures appeared in the administrative orders filtering down from the Bavarian State Office for Data Protection Supervision (BayLDA), the lead authority for Worldcoin in the EU. While the Spanish and Portuguese bans were temporary emergency measures under Article 66, the BayLDA’s investigation struck at the root of the consent’s validity. In August 2024, the BayLDA ordered the erasure of iris codes collected between July 2023 and December 2024. This drastic remedy confirms that the consent obtained during this period was null and void. The regulator found that the processing for “passive comparison” (storing the code to check against new users) lacked a valid legal basis. Had the consent been valid, erasure would not have been necessary. The order signifies that the “check-box” consent on the World App failed to meet the high threshold for processing special category data. This retroactive invalidation exposes the fragility of the “crypto-for-biometrics” model. If the inducement (WLD) and the opacity of the technology render the consent invalid, then the entire database is fruit of the poisonous tree. Worldcoin’s attempt to pivot to “Legitimate Interest” (Article 6(1)(f)) as a backup legal basis was also rejected for biometric processing, which strictly requires explicit consent under Article 9(2)(a).
The Imbalance of Power
The concept of “imbalance of power,” typically applied to employer-employee relationships or citizen-state interactions, applies here in an economic form. Worldcoin, a venture-backed entity with a multi-billion dollar valuation, holds all the technical information. The user, frequently a layperson with limited knowledge of blockchain or biometrics, holds only their biological identity. The exchange is asymmetric. The user gives up a permanent, immutable biological identifier. The company gives a volatile, speculative digital token. The CNPD’s suspension order reflected a concern that this asymmetry exploits the data subject’s lack of technical knowledge. The user cannot audit the “Zero-Knowledge Proof” claims; they must trust the company. Yet the financial incentive discourages the skepticism necessary for informed consent. In the eyes of Iberian regulators, the WLD token acts less as a reward and more as a mechanism to bypass critical thinking. By monetizing the signup, Worldcoin created a “rush” mentality that is antithetical to the reflective, deliberate decision-making process envisioned by the GDPR. The “freedom” to consent is illusory when the alternative, walking away, means leaving money on the table in a precarious economic climate.
Granularity and the “All-or-Nothing” Deal
Finally, the consent mechanism failed the “granularity” test. Recital 43 of the GDPR states that consent is presumed not to be freely given if the performance of a contract is conditional on consent to the processing of personal data that is not necessary for the performance of that contract. Worldcoin bundled the creation of the “World ID” (the identity service) with the distribution of WLD (the financial service) and the training of their AI models (the research purpose). Users in Spain and Portugal could not say, “I want to prove I am human, but I do not want my data used to train your algorithm,” or “I want the ID, but I do not want the token.” It was a monolithic package. To get the money, you had to give the iris. To get the ID, you had to take the crypto. This lack of granular choice violates the requirement that consent be “specific” for each processing purpose. The AEPD’s intervention emphasized that users were not given the option to separate these distinct data processing activities, further invalidating the “freely given” nature of their agreement. The regulatory blockade in Iberia serves as a global precedent: biometric data cannot be treated as a currency. When privacy rights are traded for speculative assets, the legal framework of consent collapses, requiring state intervention to protect citizens from their own economically coerced decisions.
Forensic Analysis of the 'Orb' Age Verification Mechanisms and Failures
The forensic deconstruction of the ‘Orb’—Worldcoin’s chrome-shelled biometric imaging device—reveals a catastrophic disconnect between its advanced optical engineering and its rudimentary compliance. For a device marketed as a hyper-sophisticated tool for distinguishing humans from AI, its inability to distinguish a 12-year-old child from a consenting adult during its Iberian expansion represents a foundational failure of “Privacy by Design.”
The Hardware Reality: Engineered for Uniqueness, Blind to Age
At a hardware level, the Orb is a marvel of multispectral imaging. Technical specifications released by Tools for Humanity describe a custom optical system featuring a telephoto lens for iris capture, a wide-angle camera for environment segmentation, and thermal sensors to defeat liveness attacks. The device uses infrared illumination to penetrate the iris texture, converting the chaotic patterns of the eye into a hash—the “IrisCode.” However, during the critical 2023-2024 adoption surge in Spain and Portugal, this sensor array possessed a fatal blind spot: it lacked any intrinsic mechanism to estimate or verify age. The device was engineered solely to solve the “Sybil problem”—preventing one person from creating multiple accounts. It was not designed to determine *who* that person was, or more importantly, *how old* they were. The “age verification” mechanism employed during this period was not biometric; it was bureaucratic and dangerously porous. It consisted of a simple checkbox within the World App where the user self-attested to being over 18. There was no cross-reference with government ID, no facial age estimation analysis running on the Orb’s local processor, and no cryptographic proof of age required before the iris scan commenced.
The “Tick-Box” Failure and Operator Incentives
This reliance on self-attestation created a widespread vulnerability that was immediately exploited by minors.
In the bustling plazas of Barcelona and Lisbon, where Orbs were deployed, the incentive structure for “Orb Operators”—third-party contractors paid in stablecoins for each successful signup—compounded the technical failure. Forensic review of the enrollment process reveals that operators were driven by volume, not compliance. With no technical barrier preventing a minor from scanning their iris once the “I am over 18” box was ticked on a smartphone, the physical task of verification fell to these commissioned agents. Reports from the Spanish Data Protection Agency (AEPD) indicate that this human firewall was non-existent. Minors were not only bypassing the app’s age gate but were frequently encouraged, or simply ignored, by operators focused on maximizing their daily signup quotas.
AEPD and CNPD Findings: The Evidence of Negligence
The regulatory backlash was not based on theoretical risks but on documented, forensic evidence of children’s data being harvested. In Spain, the AEPD’s emergency order under Article 66 of the GDPR was precipitated by a wave of complaints involving minors. The agency’s investigation found that Worldcoin’s system allowed the collection of biometric data from children without the possibility of valid consent. The “impossibility of withdrawing consent”—a core GDPR violation—was exacerbated for minors who, having scanned their irises for free crypto tokens, found themselves unable to delete their biometric templates. The AEPD’s ruling highlighted that the processing of this special category data (biometrics) from vulnerable subjects (children) lacked the necessary safeguards. Portugal’s CNPD followed with even more specific findings. Their investigation documented “dozens of complaints” in a single month regarding the unauthorized collection of data from minors. The CNPD’s forensic assessment noted that the Orb’s data flow had no “stop” command for underage users.
Once the iris was captured, the hash was generated and the account created; there was no retroactive filter to identify or purge the data of a 15-year-old who had lied on the checkbox. The CNPD explicitly stated that Worldcoin had no mechanism to verify the age of users, rendering their data collection practices unlawful under Article 8 of the GDPR, which sets strict conditions for the processing of children’s data.
The “Personal Custody” Pivot: An Admission of Defeat
The severity of these forensic failures is underscored by Worldcoin’s technical pivot *after* the bans were enforced. In response to the regulatory blockade, the company introduced “Personal Custody” and, crucially, mandatory in-person age verification checks at Orb locations. This shift serves as a tacit admission that the original “Orb-only” verification model was legally unsustainable. The new protocol requires operators to physically inspect government-issued IDs before allowing a scan—a retrograde manual check that the high-tech Orb was supposed to make obsolete. Moreover, the introduction of features to “unverify” and delete iris codes acknowledges the previous system’s violation of the “Right to Erasure” (Article 17 GDPR). The forensic timeline is damning: for months, the Orb operated in European cities as a biometric vacuum, sucking in the data of adults and children alike with indiscriminate efficiency. The device’s advanced sensors could detect a fake eye, but its logic could not detect a child. This disconnect between optical precision and legal compliance remains the central artifact of the Orb’s forensic history in the Iberian Peninsula.
The Right to Erasure: Verifying the Deletion of Historical Iris Codes
The “Right to Erasure” (Article 17 GDPR) is not a suggestion; it is a mandate. Yet, for Worldcoin, the deletion of historical iris codes has become a forensic battleground where technical definitions of “anonymity” clash with the absolute requirements of European law. The AEPD in Spain and the CNPD in Portugal did not ask for a pause; they demanded the neutralization of a biometric database they deemed illegally assembled.
The “Bloqueo” Paradox: Why “Delete” Does Not Mean “Destroy” in Spain
In the Spanish jurisdiction, the concept of erasure is legally distinct from physical destruction. The AEPD enforces a strict interpretation of *bloqueo* (blocking). When a data subject requests erasure, or when processing is declared unlawful, the data cannot be immediately incinerated. Instead, it must be “blocked”—segregated from active processing systems, encrypted, and made accessible *only* to judges, courts, or the Public Prosecutor for the duration of statutory liability periods (typically 3 to 5 years). This creates a legal minefield for Worldcoin. While the company’s marketing machine broadcasts a narrative of “user control” and “permanent deletion,” the forensic reality in Spain is that the biometric data—or the liability evidence associated with it—must sit in a digital purgatory. The AEPD’s emergency order under Article 66 GDPR froze the database. If Worldcoin were to physically wipe the Spanish iris codes tomorrow, it could potentially face *further* sanctions for destroying evidence required for the ongoing sanctioning proceedings. Thus, the “right to erasure” for Spanish users is currently a state of suspended animation: their data is legally frozen, unusable for Worldcoin’s AI training, yet still technically existent on a server, awaiting the final gavel.
The Six-Month “Cool-Off” Friction
Worldcoin’s global response to these regulatory “blocks” was the introduction of a “cool-off” period.
Under their revised protocol, a user who requests the deletion of their World ID does not see their data erased instantly. Instead, the system initiates a six-month countdown. Worldcoin justifies this retention as a fraud prevention measure, ensuring that an individual cannot delete their ID and immediately re-verify to claim a fresh grant of WLD tokens. From a GDPR perspective, this mechanism is legally perilous. Article 17 requires erasure “without undue delay.” A six-month retention period for a user who has withdrawn consent—the primary legal basis Worldcoin claimed to rely upon—is a difficult argument to sustain. If the legal basis (consent) is revoked, the processing must stop. Retaining the biometric hash to prevent “double-dipping” implies that the company is prioritizing its tokenomics model over the fundamental rights of the data subject. The BayLDA in Bavaria has specifically targeted this friction, ordering a deletion protocol that is actually compliant, signaling that a half-year wait time may not satisfy the “undue delay” standard when the data involved is a sensitive biometric template.
The “Nullifier” Hash: The Data That Cannot Be Deleted
A forensic examination of the “Orb” verification system reveals a technical contradiction in the promise of total erasure. To function as a “Proof of Personhood” system, Worldcoin must ensure uniqueness. If a user deletes their account and returns to an Orb, the machine must know they have *already* been verified to deny them a second ID. This necessitates the retention of a “nullifier” hash—a cryptographic marker derived from the iris code. Even if the raw iris image and the primary iris code are deleted, this nullifier must remain in the database to flag repeat attempts.
* **The Legal Question:** Is this nullifier “personal data”?
* **The Technical Reality:** If the hash can single out a natural person (i.e., “Person X is the one who generated Hash Y”), it remains pseudonymous personal data under GDPR, not anonymous data.
* **The Consequence:** True, 100% erasure is technically incompatible with Worldcoin’s core mission of preventing Sybil attacks (fake identities). If they delete *everything*, a user can simply re-scan and claim more money. Therefore, a biometric derivative *must* remain. This “zombie data” creates a permanent digital shadow that the user cannot fully scrub, directly challenging the absolute nature of the Right to Erasure.
Verification of Deletion: The Missing Audit
To date, no independent forensic audit has been released to the public confirming that the millions of iris codes collected in Spain and Portugal prior to the bans have been purged or blocked according to local law. The BayLDA’s investigation concluded that Worldcoin’s initial measures were insufficient, ordering a “GDPR-compliant data deletion protocol.” This phrasing is damning; it implies the previous protocol was *not* compliant. We are left with a “trust but verify” scenario where the verification is absent. Users are asked to trust that a “Zero-Knowledge Proof” system has severed the link between their identity and their biometric hash. However, without a transparent, third-party technical audit of the specific database shards hosting Iberian data, there is no guarantee that the “deleted” iris codes have not simply been moved to a “cold storage” training set for the neural networks that power the Orb’s recognition algorithms. The AEPD’s “blocking” order suggests they are not taking Worldcoin’s word for it, requiring a legal hold that prevents the company from using the data while simultaneously preventing them from destroying the evidence of their own alleged infringement.
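The pseudonymity point above is easy to demonstrate: any deterministic derivation of the iris code still singles out one person, however the raw data is handled. The sketch below uses a salted hash as a stand-in for Worldcoin's actual zero-knowledge nullifier scheme; the salt, names, and blocklist are invented for illustration.

```python
import hashlib

SYSTEM_SALT = b"network-wide-constant"   # hypothetical fixed salt

def nullifier(iris_code: bytes) -> str:
    """Deterministic marker: the same iris always yields the same value."""
    return hashlib.sha256(SYSTEM_SALT + iris_code).hexdigest()

blocklist = set()

def enroll(iris_code: bytes) -> bool:
    n = nullifier(iris_code)
    if n in blocklist:
        return False        # repeat attempt flagged
    blocklist.add(n)
    return True

code = b"iris-code-of-user-x"
assert enroll(code) is True
# "Full" erasure can delete the iris code itself, but the nullifier stays
# to block re-registration -- and it still singles out this one person:
assert enroll(code) is False
```

Because the mapping is deterministic and unique per iris, the retained value functions exactly like the GDPR's "singling out" criterion: it identifies one natural person even with the raw template gone.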
Regulatory Deletion Orders vs. Technical Execution
| Jurisdiction | Regulatory Order | Technical Implication | Status of Data |
| --- | --- | --- | --- |
| Spain (AEPD) | “Bloqueo” (blocking) of data. | Data must be segregated, encrypted, and immutable. No processing allowed. | Frozen. Retained for liability, inaccessible for AI training. |
| Portugal (CNPD) | Temporary limitation/ban on collection. | Stop active collection; safeguard existing data. | Limbo. Collection halted, deletion under review. |
| Germany (BayLDA) | Implement GDPR-compliant deletion. | Must allow “unrestricted” erasure without “undue delay.” | Active remediation. New “Personal Custody” model attempts to shift data to user devices. |
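The Spanish *bloqueo* regime described above reduces to a record lifecycle: active, then blocked (courts-only access), then purged once the statutory window lapses. This is an illustrative model only; the class names, the court-access list, and the five-year default are assumptions drawn from the text, and the year arithmetic is approximate.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from enum import Enum
from typing import Optional

class State(Enum):
    ACTIVE = "active"
    BLOCKED = "blocked"    # segregated, encrypted, courts-only access
    PURGED = "purged"

@dataclass
class BiometricRecord:
    state: State = State.ACTIVE
    blocked_on: Optional[date] = None

    def block(self, today: date) -> None:
        self.state = State.BLOCKED
        self.blocked_on = today

    def accessible_to(self, requester: str) -> bool:
        if self.state is State.BLOCKED:
            return requester in {"court", "public_prosecutor"}
        return self.state is State.ACTIVE

    def purge_if_expired(self, today: date, years: int = 5) -> None:
        # Approximate the statutory liability window as 365-day years.
        if self.state is State.BLOCKED and today >= self.blocked_on + timedelta(days=365 * years):
            self.state = State.PURGED

rec = BiometricRecord()
rec.block(date(2024, 3, 6))                  # AEPD freeze date
assert not rec.accessible_to("controller")   # no processing allowed
assert rec.accessible_to("court")            # liability access only
rec.purge_if_expired(date(2029, 3, 6))
assert rec.state is State.PURGED
```

The point of the model is the middle state: blocked data is neither usable by the controller nor destroyable, matching the "digital purgatory" the AEPD order creates.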
The “Personal Custody” model, introduced in response to these bans, attempts to bypass the problem by storing the data on the user’s phone rather than a central server. However, this applies only to *new* or *migrated* users. It does not retroactively solve the problem of the millions of iris codes already sitting in Worldcoin’s centralized servers—data collected from Spanish and Portuguese citizens before the pivot. That historical data remains the radioactive core of the regulatory dispute, a digital toxic waste that cannot simply be buried under a software update.
From Centralized Databases to Personal Custody: A Regulatory Compliance Pivot
The architectural history of Worldcoin bifurcates sharply in March 2024. For the first three years of its operation, the project relied on a centralized model where biometric templates, specifically the iris codes derived from the Orb scans, were encrypted and stored on cloud servers managed by Tools for Humanity (TFH). This centralized repository was the primary target of the AEPD’s emergency order in Spain and the CNPD’s suspension in Portugal. Regulators viewed a privately held database of millions of high-resolution biometric templates as a catastrophic security risk and a violation of GDPR’s data minimization principles. In a direct tactical response to these existential regulatory threats, Worldcoin executed a rapid infrastructure pivot known as “Personal Custody.”
The Mechanics of Personal Custody
On March 22, 2024, sixteen days after the Spanish ban, Worldcoin announced it would no longer store iris codes by default. Under the new Personal Custody model, the biometric data flow was inverted. When an Orb scans a user, the device generates the iris code and immediately transmits it to the user’s smartphone. The data is encapsulated in a signed package that the user’s device encrypts using a public-private key pair generated locally. The Worldcoin backend servers receive only the encrypted package, which they cannot decrypt. The user retains the private key, becoming the sole custodian of their biometric template.
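The inverted data flow described above can be sketched in a few lines. This is a simplified model, not Worldcoin's implementation: a device-held random key and XOR stand in for the locally generated public-private key pair, and all class and field names are illustrative.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """One-time-pad style combine; stands in for real encryption here."""
    return bytes(a ^ b for a, b in zip(data, key))

class Phone:
    """User device: generates and keeps the key, takes custody of the code."""
    def __init__(self):
        self._private_key = None   # never leaves the device

    def take_custody(self, iris_code: bytes) -> bytes:
        self._private_key = secrets.token_bytes(len(iris_code))
        return xor(iris_code, self._private_key)   # ciphertext for the backend

class Backend:
    """Server side: stores an opaque blob it cannot decrypt."""
    def __init__(self):
        self.stored = None

    def receive(self, package: bytes) -> None:
        self.stored = package      # no key, no plaintext

iris_code = b"iris-code-template"
ph, be = Phone(), Backend()
be.receive(ph.take_custody(iris_code))

assert be.stored != iris_code                            # server never sees plaintext
assert xor(be.stored, ph._private_key) == iris_code      # only the user can recover it
```

The design point is the asymmetry of knowledge: the backend holds bytes, the phone holds the only key, so the custody (and arguably the controllership) of the template moves to the user.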
This shift was not merely a software update; it was a legal maneuver designed to strip Worldcoin of the “Data Controller” status regarding the storage of biometric templates. By pushing the data to the user’s device (the “edge”), Worldcoin argued it no longer possessed a database of biometrics to be regulated. The company asserted that the Orb functioned strictly as a verification instrument rather than a collection endpoint for a central library.
Secure Multi-Party Computation (SMPC)
To maintain the integrity of the “Proof of Personhood” network without a central database to check against, Worldcoin implemented a cryptographic system known as Secure Multi-Party Computation (SMPC). Announced as open-source on May 15, 2024, and developed in collaboration with TACEO, this system allows the network to verify uniqueness without reconstructing the original iris code.
The SMPC architecture splits the iris code into multiple “secret shares” or fragments. These shares are distributed across different servers held by independent parties. In the initial rollout, these parties included the Worldcoin Foundation and external academic or technical partners. To verify a user, the network queries these servers. Each server computes a partial result based on its fragment and sends it back. The system combines these partial results to determine if the iris code matches an existing user, all without ever reassembling the full biometric template in one location. Worldcoin claims this renders the data mathematically useless to any single entity, including themselves.
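The share-splitting idea can be demonstrated with XOR secret sharing, a didactic stand-in for the TACEO protocol: no single server's fragment reveals the code, and a match is decided by combining only per-party partial results, never the code itself. Real iris matching compares codes under a Hamming-distance threshold; this sketch simplifies to exact equality, and every name is illustrative.

```python
import secrets

def split(code: bytes, parties: int = 3) -> list:
    """Split code into XOR shares: the XOR of all shares equals the code."""
    shares = [secrets.token_bytes(len(code)) for _ in range(parties - 1)]
    last = code
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def match(stored_shares, query_shares) -> bool:
    """Equality check without reconstruction: pool only per-party partials."""
    partials = [bytes(a ^ b for a, b in zip(s, q))
                for s, q in zip(stored_shares, query_shares)]
    combined = partials[0]
    for p in partials[1:]:
        combined = bytes(a ^ b for a, b in zip(combined, p))
    return all(b == 0 for b in combined)   # all-zero iff the codes were identical

code = b"iris-code"
assert match(split(code), split(code)) is True        # same eye: uniqueness hit
assert match(split(code), split(b"other-eye")) is False  # different eye: no hit
```

Because each partial is (share XOR share), the combined result equals (stored XOR query): zero exactly when the codes match, so the network learns the verdict without any node holding the full template.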
The computational cost of this privacy preservation is high. The new SMPC system requires approximately 1,152 cores and 3.6 terabytes of memory to operate, consuming significantly more than the previous simple database lookup. Yet, this was the price of survival in the European regulatory theater.
The “Unverify” Protocol and Right to Erasure
Alongside the storage pivot, Worldcoin introduced the ability to “unverify” a World ID, directly addressing the AEPD’s citation regarding the inability to withdraw consent. Announced on April 9, 2024, this feature allows users to request the permanent deletion of their iris code. Previously, even if users deleted the app, their iris code remained on the server to prevent them from signing up again, a practice regulators flagged as a violation of the Right to Erasure (Article 17 GDPR).
The new mechanism introduces a friction point: a six-month “cool-off” period. When a user requests deletion, their World ID becomes invalid immediately, but the iris code is retained in a “suppressed” state for six months to ensure the user does not immediately re-verify to claim more WLD tokens. Only after this period is the data permanently purged. While this satisfies the fraud prevention requirements of the financial network, it remains a point of contention for privacy purists who argue that “revocation of consent” should trigger immediate data destruction.
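The "unverify" lifecycle described above reduces to a small state machine: immediate invalidation, a suppressed retention window, then purge. This is a sketch under stated assumptions; the dates and field names are illustrative, with 183 days approximating the six-month period.

```python
from datetime import date, timedelta

COOL_OFF = timedelta(days=183)   # roughly six months

class WorldID:
    """Hypothetical model of the unverify flow, not Worldcoin's code."""
    def __init__(self):
        self.valid = True
        self.iris_code = b"template"
        self.deletion_requested = None

    def unverify(self, today: date) -> None:
        self.valid = False                    # ID unusable at once
        self.deletion_requested = today       # purge clock starts

    def purge_if_due(self, today: date) -> None:
        if self.deletion_requested and today >= self.deletion_requested + COOL_OFF:
            self.iris_code = None             # permanent erasure

wid = WorldID()
wid.unverify(date(2024, 4, 9))
wid.purge_if_due(date(2024, 5, 1))
assert wid.valid is False and wid.iris_code is not None   # suppressed, not yet purged
wid.purge_if_due(date(2024, 10, 10))
assert wid.iris_code is None                              # erased after the window
```

The middle state is the legally contested one: for 183 days the ID is dead to the user but the biometric hash persists, which is exactly the "undue delay" question regulators raised.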
Regulatory Skepticism and the Extended Ban
The introduction of Personal Custody and SMPC did not result in the immediate lifting of the Iberian bans. The AEPD maintained its blockade, and the CNPD in Portugal proceeded with its 90-day suspension issued on March 26, 2024. The regulators’ hesitation stemmed from the “black box” nature of the Orb itself. Even if the database is decentralized *after* processing, the initial collection point, the Orb, still captures a high-resolution image of the user’s eye. The AEPD needed to verify that the Orb truly deletes the raw images immediately and that the “Personal Custody” transfer is secure against interception.
Recognizing that a technical pivot would not instantly resolve the legal standoff, Tools for Humanity voluntarily agreed in June 2024 to extend the pause on operations in Spain until the end of the year or until the Bavarian State Office for Data Protection Supervision (BayLDA) completed its audit. This “voluntary” extension was a strategic retreat, preventing the AEPD from issuing a permanent ruling while Worldcoin pinned its hopes on a favorable audit from the German lead authority.
Comparison of Architectures
| Feature | Legacy Architecture (2021-2024) | Personal Custody & SMPC (2024-Present) |
| --- | --- | --- |
| Biometric Template Storage | Encrypted and stored in a centralized cloud database managed by TFH. | Stored on the user’s smartphone. Server holds only encrypted shards. |
| Uniqueness Check | Direct comparison of incoming iris code against the central database. | Secure Multi-Party Computation (SMPC) across distributed nodes. |
| Data Controller | Worldcoin Foundation / TFH. | User (for the template); TFH (for the SMPC shards). |
| Deletion Capability | Difficult or impossible; codes retained to prevent fraud. | “Unverify” option available with a 6-month fraud-prevention delay. |
| Encryption Keys | Managed by Worldcoin. | Private key generated and stored on the user’s device. |
The transition to Personal Custody represents a fundamental restructuring of Worldcoin’s liability profile. By physically moving the data to the user’s pocket, the company attempts to bypass the strict requirements of maintaining a central biometric registry. Yet, the effectiveness of this pivot relies entirely on the integrity of the Orb’s hardware and the transparency of the SMPC protocol, factors that European regulators continue to scrutinize with extreme caution.
Judicial Review: The Spanish National Court's Ruling on Privacy vs. Commercial Interests
On March 11, 2024, the Contentious-Administrative Chamber of the Spanish National Court (*Audiencia Nacional*) delivered a decisive blow to Tools for Humanity’s operations in Spain. In a ruling that reverberated across the European privacy sector, the High Court denied the company’s request for a “very urgent precautionary measure” (*medida cautelarísima*) to suspend the AEPD’s blockade. This judicial intervention did not merely validate an administrative order; it established a hierarchy of rights where the protection of biometric data categorically supersedes corporate economic liberty during periods of investigative uncertainty.
The “Irreparable Harm” Argument
Tools for Humanity (TFH) anchored its legal defense on the concept of “irreparable harm.” The company’s legal team argued that the AEPD’s sudden cessation order would inflict devastating financial losses and reputational damage that could not be reversed, even if the courts later found the ban unlawful. They contended that the disruption to their user growth, already exceeding 400,000 verifications in Spain, would permanently cripple their market position in a region critical to their global strategy. The National Court dismantled this argument with clinical precision. In its *auto* (judicial decree), the magistrates reasoned that economic losses, no matter how severe, are by definition reparable through financial compensation. If TFH were to eventually prevail in the main lawsuit regarding the legality of the AEPD’s order, the Spanish state could indemnify the company for lost revenue. Therefore, the *periculum in mora* (danger in delay) required to grant an injunction was absent. The court explicitly stated that the “particular interest of the recurring company, of a fundamentally economic content,” could not override the “safeguarding of the general interest,” defined here as the fundamental right to the protection of personal data.
Jurisdictional Challenge: Bypassing Bavaria
A central pillar of TFH’s appeal was the jurisdictional challenge. The company asserted that under the GDPR’s “One-Stop-Shop” mechanism, only the Bavarian State Office for Data Protection Supervision (BayLDA), where TFH has its European establishment, held the authority to sanction its data processing activities. They argued the AEPD had acted *ultra vires* (beyond its powers) by circumventing the lead supervisory authority. The judges rejected this interpretation for the purpose of the interim measures. By upholding the AEPD’s use of Article 66.1 of the GDPR, the court affirmed that national regulators retain the power to intervene immediately in “exceptional circumstances” where there is an urgent need to protect the rights and freedoms of data subjects. The ruling clarified that the urgency of the risks, specifically the processing of minors’ biometric data and the inability to withdraw consent, justified a temporary suspension of the “One-Stop-Shop” principle. The court noted that this decision was without prejudice to the final resolution on the merits of the case; for the immediate moment, the Spanish regulator’s competence to act locally was valid.
Weighing Fundamental Rights
The ruling provides a rare judicial insight into the balancing act between technological innovation and privacy rights. The magistrates identified two specific high-risk factors cited by the AEPD that tipped the balance against Worldcoin:
1. **Processing of Minors’ Data:** The court found the evidence regarding the collection of data from minors sufficiently worrying to warrant immediate cessation. The risk of permanent biometric exposure for children constituted a harm far greater than any temporary loss of profit for the corporation.
2. **Consent Withdrawal:** The court acknowledged the AEPD’s finding that users faced significant difficulties in withdrawing their consent or deleting their data, a direct violation of GDPR Article 7.

By refusing to lift the ban, the National Court sent a clear message: business models built on the processing of special category data (biometrics) must guarantee absolute compliance *before* deployment. The “move fast and break things” ethos of Silicon Valley was ruled incompatible with the precautionary principles of European data protection law.
Aftermath and Voluntary Extension
Following the court’s refusal to grant the injunction, TFH was forced to comply with the order to cease all “Orb” operations in Spain. The legal defeat placed the company in a precarious position. Rather than continuing a hostile confrontation that could lead to further sanctions, TFH eventually entered into a legally binding commitment with the AEPD. In June 2024, the company agreed not to resume operations in Spain until at least the end of the year or until the BayLDA concluded its audit. This voluntary extension of the pause was a direct consequence of the National Court’s refusal to provide legal cover for their continued operation, forcing a pivot from litigation to compliance negotiation.
Summary of Audiencia Nacional Ruling (March 11, 2024)
| Legal Component | Court’s Determination | Implication for Biometric Data |
| --- | --- | --- |
| Injunction Request | Denied (*Rechazada*) | Regulators can stop biometric collection immediately without waiting for a full trial. |
| Economic Harm | Deemed “Reparable” | Financial loss is not a valid excuse to continue high-risk data processing during an investigation. |
| Public Interest | Prevails over Commercial Interest | The right to data protection is superior to the right to conduct business when risks are high. |
| Jurisdiction (GDPR Art. 66) | Affirmed for Interim Measures | Local authorities can bypass the “One-Stop-Shop” in urgent cases involving minors or high risks. |
The February 2025 BayLDA Resolution: Mandating Deletion of Spanish Biometric Data
In February 2025, the regulatory siege on Worldcoin’s operations in the Iberian Peninsula reached its terminal velocity. The Bavarian State Office for Data Protection Supervision (BayLDA), acting as the Lead Supervisory Authority under the European Union’s General Data Protection Regulation (GDPR), issued a definitive resolution that validated the earlier emergency measures taken by Spain’s AEPD. This ruling was not a fine or a temporary suspension; it was a “hard delete” order. The authority mandated the complete erasure of all biometric iris data collected from users in Spain, zeroing out Worldcoin’s database in one of its most active European markets. This decision marked the collapse of Worldcoin’s initial data collection model in the region and established a rigid precedent for biometric processing across the continent. The resolution concluded a nearly two-year investigation that began when Worldcoin’s “Orbs” appeared on Spanish streets. While the AEPD had utilized Article 66 of the GDPR to impose a temporary blockade in March 2024—citing “imminent risks” to citizens’ rights—the BayLDA’s February 2025 decree provided the permanent legal substantiation for those fears. The German regulator found that Tools for Humanity (TFH), the developer behind Worldcoin, had failed to establish a valid legal basis for processing special category data under Article 9 of the GDPR. Specifically, the resolution dismantled TFH’s reliance on “legitimate interest” for biometric data collection, asserting that for such high-risk processing, only “explicit, informed, and freely given consent” suffices. The scope of the deletion mandate was massive. At the time of the order, Worldcoin had scanned the irises of over 300,000 individuals in Spain, with concentrations heavily skewed toward younger demographics in major cities like Barcelona and Madrid.
The BayLDA’s order required TFH to identify and purge these specific records from its centralized databases. This directive presented a technical paradox for Worldcoin: the company had long argued that its “IrisCodes” were anonymized and irretrievable. Yet the regulator’s demand implied that if the data could be used to verify uniqueness (i.e., prevent a user from signing up twice), it remained identifiable enough to be subject to a deletion request. The resolution forced TFH to admit that its “anonymization” was, in regulatory terms, “pseudonymization,” leaving the data within the scope of GDPR enforcement. A critical component of the February ruling was its rejection of Worldcoin’s retroactive “consent” fixes. Throughout 2024, TFH attempted to pivot to a “Personal Custody” model, where biometric data would be stored on the user’s device rather than in the cloud. While BayLDA acknowledged this shift as a positive step for *future* users, it ruled that the architectural change did not sanitize the illegality of the *historical* dataset. The data collected prior to the implementation of these new safeguards was fruit of the poisonous tree. The regulator’s logic was unyielding: a controller cannot retroactively apply a new consent model to data that was harvested under a defective legal framework. Consequently, the 300,000 Spanish IrisCodes were condemned to digital incineration. The friction between the AEPD and BayLDA also resolved into a unified front with this decision. For months, critics had accused the “One-Stop-Shop” mechanism of slowing down enforcement, as the Bavarian authority methodically plodded through its investigation while the Spanish agency took unilateral emergency action. The February 2025 resolution vindicated the AEPD’s aggressive stance. It confirmed that the Spanish regulator’s initial assessment—that Worldcoin’s operations posed a severe threat to privacy—was correct.
The alignment of the two regulators sent a signal to other EU member states: the “move fast and break things” approach of Silicon Valley would not survive contact with European biometric laws. Worldcoin’s response to the mandate was a mixture of compliance and litigation. Publicly, the company announced it would honor the deletion order for the Spanish dataset to demonstrate its commitment to regulatory cooperation. Yet legal filings from TFH indicated an intent to appeal specific interpretations of the ruling, particularly the definitions of “anonymization” regarding their Secure Multi-Party Computation (SMPC) technology. TFH argued that their updated cryptographic methods made the data mathematically impossible to link back to a human, thus placing it outside GDPR jurisdiction. The BayLDA dismissed this, maintaining that as long as the system could single out a unique individual to deny them a second account, the data functioned as a biometric identifier. The operational impact was immediate. Worldcoin’s app in Spain was forced to undergo a “hard reset” for verified users. Individuals who had previously scanned their irises found their “World ID” verification revoked, requiring them to undergo the process again under the new, strictly compliant “Personal Custody” regime if they wished to remain on the network. This created a significant churn event, as many users, now more aware of the privacy controversies, chose not to re-verify. The “network effects” that Worldcoin had aggressively purchased with WLD tokens were severed, forcing the company to rebuild its Spanish user base from scratch, this time with the heavy friction of explicit consent forms and age verification checks slowing every interaction. This resolution also highlighted the specific failure of Worldcoin’s age verification mechanism during the initial rollout. The BayLDA’s report detailed instances where the “Orbs” had failed to filter out minors, a violation that carried heavy weight in the decision to mandate total deletion.
The regulator noted that because the original dataset was polluted with an unknown percentage of underage users—and because the “anonymized” nature of the system made it impossible to selectively identify and remove only the minors—the entire dataset had to be treated as compromised. This “all-or-nothing” approach served as a warning to other biometric firms: if you cannot distinguish a child from an adult in your database, you forfeit the right to keep the database at all. By late February 2025, the deletion process was underway, monitored by third-party technical auditors appointed by the BayLDA. The “Spanish Purge,” as it became known in privacy circles, demonstrated the power of GDPR: the ability to force a company to destroy its most valuable asset—its user data—when that asset is built on a foundation of non-compliance. For Worldcoin, the loss of the Spanish dataset was a tactical defeat; for the privacy rights of European citizens, it was a strategic victory that established the iris as a sovereign territory, protected from unauthorized annexation by commercial entities.
SMPC Implementation: Assessing the Efficacy of Secure Multi-Party Computation
The Cryptographic Pivot: From Centralized Storage to Secret Shares
In May 2024, facing existential regulatory threats from the AEPD in Spain and the CNPD in Portugal, the World Foundation executed a radical architectural overhaul of its biometric infrastructure. The organization transitioned from a centralized database of iris hashes, a structure that regulators viewed as a “honey pot” of sensitive biometric data, to a Secure Multi-Party Computation (SMPC) system. This move was not merely a technical upgrade; it was a legal defense strategy designed to render the “mass surveillance” argument obsolete by mathematically eliminating the ability of any single entity to view or reconstruct a user’s iris code.
The legacy system relied on storing a 12,800-bit iris code (a hash generated by the Orb) in a database. To check for uniqueness, the system simply compared the incoming hash against stored hashes. While straightforward, this method created a permanent, centralized registry of biometric templates, a direct violation of the data minimization principles championed by the AEPD. The SMPC implementation, developed in collaboration with cryptographers from TACEO, fundamentally alters this workflow. Instead of storing the iris code, the system splits the code into multiple “secret shares” using an additive secret sharing scheme. These shares are distributed across different servers held by distinct parties. No single server holds the complete iris code, and the data on any individual server is statistically indistinguishable from random noise.
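The additive splitting described above can be sketched in a few lines of Python. This is an illustrative toy, not World’s production scheme: the field modulus, the three-party split, and the bit-by-bit sharing are all assumptions made for clarity. The key property it demonstrates is that shares sum to the original value modulo a prime, while any proper subset of shares is uniformly random noise.

```python
import secrets

# Illustrative 61-bit Mersenne prime; the field used by the real
# World/TACEO protocol is not public knowledge here.
PRIME = 2**61 - 1

def share(value, n_parties=3):
    """Split an integer into n additive shares modulo PRIME.
    Any n-1 shares are uniformly random and reveal nothing alone."""
    parts = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    parts.append((value - sum(parts)) % PRIME)
    return parts

def reconstruct(parts):
    """Only the sum of *all* shares recovers the original value."""
    return sum(parts) % PRIME

# Each bit of a (toy-sized) iris code is shared independently.
iris_bits = [1, 0, 1, 1, 0]
shared_bits = [share(b) for b in iris_bits]

# All parties together can reconstruct; any single share cannot.
assert [reconstruct(s) for s in shared_bits] == iris_bits
```

Because each share is drawn uniformly at random and only the last is derived, an attacker who compromises one server sees values statistically identical to noise, which is the basis of World’s “no single entity can reconstruct” claim.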
The Mathematics of Uniqueness: Hamming Distance in the Dark
The efficacy of this system relies on its ability to perform the “Uniqueness Check” without ever reassembling the iris code. When a user scans their iris at an Orb, the device generates the template and immediately splits it into shares. These shares are sent to the respective SMPC nodes. The nodes then calculate the Hamming distance, the measure of difference between two binary strings, between the new shares and the stored shares. Through the algebraic properties of the secret sharing scheme, the nodes can compute the partial results of this comparison. These partial results are then combined to reveal only the final distance score (i.e., whether the user is a match or not) without ever revealing the underlying biometric data to the nodes or the coordinator.
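The algebra behind this can be sketched with a simplified toy. In the deployed protocol both the fresh scan and the stored code are secret-shared and compared via multiplication subprotocols; the sketch below simplifies by giving the nodes the fresh scan in clear, which is enough to show the core fact: because HD(x, y) = Σ xᵢ(1 − 2yᵢ) + Σ yᵢ is linear in the secret-shared vector x, each node can compute a partial sum locally, and only the combined total reveals the distance. All parameters here (modulus, party count) are illustrative assumptions.

```python
import secrets

PRIME = 2**61 - 1  # illustrative field modulus, not the production choice

def share_bits(bits, n_parties=3):
    """Additively share each bit; return one share-vector per party."""
    per_bit = []
    for b in bits:
        parts = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
        parts.append((b - sum(parts)) % PRIME)
        per_bit.append(parts)
    # Transpose so party j holds the j-th share of every bit.
    return [list(col) for col in zip(*per_bit)]

def hamming_via_partials(stored_party_shares, fresh_bits):
    """Each party locally computes sum_i share_i * (1 - 2*y_i); the
    combined total plus sum(y) equals the Hamming distance, without
    any party ever seeing the stored bits."""
    partials = [
        sum(s * (1 - 2 * y) for s, y in zip(party, fresh_bits)) % PRIME
        for party in stored_party_shares
    ]
    return (sum(partials) + sum(fresh_bits)) % PRIME

stored = [1, 0, 1, 1, 0]   # enrolled iris bits (secret-shared below)
fresh  = [1, 0, 0, 1, 0]   # new scan, differing in one position

shares = share_bits(stored)
assert hamming_via_partials(shares, fresh) == 1
```

In the real system the fresh scan would never be sent in clear either; removing that simplification is exactly what requires the heavier multi-party machinery described next.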
This process is computationally exorbitant. According to technical specifications released by the World Foundation, the SMPC system requires approximately 1,152 CPU cores and 3.6 terabytes of memory to operate, consuming significantly more bandwidth (5 Gbps) than the previous method. This resource intensity demonstrates the high cost of retrofitting privacy into a system that was not designed for it. Yet the expenditure serves a critical purpose: it allows World to claim that they no longer “process” biometric data in the conventional sense, as the mathematical shares cannot be reversed into an iris image or template by any single actor.
The TACEO Factor and the “Deletion” Event
A central pillar of World’s defense before the BayLDA (Bavarian State Office for Data Protection Supervision) was the assertion that the migration to SMPC allowed for the permanent deletion of the legacy database. On May 15, 2024, the Foundation announced that all previously collected iris codes had been migrated to the SMPC format and the old centralized database had been securely erased. This “deletion event” was intended to satisfy the “Right to Erasure” and “Data Minimization” requirements of GDPR. By destroying the original templates, World argued that the risk of a data breach was nullified; even if a hacker compromised one of the SMPC nodes, they would retrieve only useless numerical fragments.
The integrity of this system depends entirely on the independence of the parties holding the shares. The system was architected with TACEO, a team of MPC engineers. If the World Foundation and TACEO (or other node operators) are legally or operationally intertwined, the “Multi-Party” aspect of SMPC becomes a distinction without a difference. If the entities collude, the keys can be combined, and the iris codes reconstructed. While the code is open-source and audited, the physical control of the servers remains a matter of trust, a commodity that European regulators are hesitant to grant.
Regulatory Friction: Pseudonymization vs. Anonymization
Even with the technical sophistication of SMPC, the BayLDA’s December 2024 conclusion indicated that the measure did not fully absolve World of its GDPR obligations. The core dispute lies in the legal definition of “anonymization.” World argues that because the shares cannot be linked to a person without collusion, the data is anonymized and thus outside the scope of GDPR. The regulators, however, classify the SMPC shares as “pseudonymized” data. Under GDPR, pseudonymized data is still personal data because it can be re-identified with the use of additional information (the other shares).
This legal distinction is fatal to World’s attempt to bypass consent requirements. If the data is pseudonymized, World must still obtain valid, explicit consent (Article 9 GDPR) for the processing of special category data. The BayLDA’s order for World to implement a compliant deletion mechanism confirms that the regulator still views the SMPC shares as “data” that belongs to the user, not system metadata. The “mathematical guillotine” did not sever the legal link between the user and their biometric derivative.
Audit Verification: Least Authority and Trail of Bits
To substantiate its claims, World engaged security firms to audit the new infrastructure. Least Authority performed a security audit of the SMPC protocol in April 2024, examining the cryptographic logic and the implementation of the uniqueness check. Their report confirmed the soundness of the protocol but did not verify the operational independence of the nodes. Separately, Trail of Bits audited the Orb’s software (version 3.0.10 and subsequent updates) to verify that the device does not retain biometric data after the split. These audits provide a technical baseline, confirming that the code can work as described, but they cannot prove that the deployed system is immune to insider threats or compelled access by law enforcement.
Comparative Analysis: Legacy vs. SMPC Architecture
The following table contrasts the data protection posture of the original system against the SMPC implementation, highlighting the specific GDPR articles addressed.
| Feature | Legacy Centralized Database (2021-2024) | SMPC Architecture (2024-Present) | GDPR Implication |
|---|---|---|---|
| Data Storage | Full iris code (hash) stored in a single database. | Encrypted shares distributed across multiple nodes. | Art. 32 (Security of Processing): SMPC mitigates single-point-of-failure risk. |
| Reversibility | Irreversible hash (theoretically), but linkable. | Mathematically impossible to reconstruct without all shares. | Art. 25 (Privacy by Design): SMPC represents a higher standard of data protection. |
| Uniqueness Check | Direct comparison of hashes. | Computation of Hamming distance on encrypted shares. | Art. 5 (Data Minimization): processing occurs without exposing raw data. |
| Legal Status | Personal data (biometric). | Disputed: World claims “anonymized”; regulators say “pseudonymized”. | Art. 4(1) (Definition of Personal Data): determines whether GDPR applies. |
| Deletion Capability | Simple database record deletion. | Deletion of shares across nodes; mathematical erasure. | Art. 17 (Right to Erasure): SMPC makes verification of deletion more complex. |
The Unresolved “Linkability” Paradox
A critical flaw remains in the logic of “uniqueness” via SMPC. To prevent a user from claiming multiple WLD airdrops, the system must link the current scan to the stored shares. If the system can determine “This person is already in the database,” it has, by definition, identified the person within the closed loop of the World Network. This creates a paradox: World claims the data is anonymous (unidentifiable), yet the entire premise of the network relies on its ability to identify the same person twice. Regulators in Spain have seized on this contradiction, noting that “singling out” an individual, even by a unique alphanumeric string, constitutes identification under European law.
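The “singling out” argument can be made concrete with a toy sketch. The example below is hypothetical and deliberately simplified (real iris codes are fuzzy templates matched by Hamming distance, not exact hashes): even a system that stores only hashes can still answer “is this specific person already enrolled?”, which is precisely the capability regulators treat as identification.

```python
import hashlib

registered = set()  # the "anonymous" uniqueness registry

def enroll(iris_template: bytes) -> bool:
    """Return True for a new person, False if already enrolled.
    Although only a hash is stored, the system can single out the
    individual: it answers 'is this person in the set?'"""
    code = hashlib.sha256(iris_template).hexdigest()
    if code in registered:
        return False
    registered.add(code)
    return True

assert enroll(b"alice-iris") is True   # first enrollment succeeds
assert enroll(b"alice-iris") is False  # re-scan singles Alice out
```

The second call is the paradox in miniature: the registry cannot name Alice, yet it recognizes her, and under European law that recognition is identification.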
The SMPC implementation neutralizes the risk of a mass data leak where millions of iris codes are stolen and used elsewhere. Yet it does not solve the fundamental legality of the collection itself. The Orb still captures a high-resolution image of the eye, processes it, and generates the shares. That millisecond of processing involves raw biometric data. If the consent obtained for that moment is invalid (due to coercion, absence of understanding, or involvement of minors), the subsequent encryption into SMPC shares is fruit of the poisonous tree. The technology secures the storage; it does not cure the defects in the acquisition.
High-Risk Processing: The Data Protection Impact Assessment (DPIA) Deficiencies
The General Data Protection Regulation (GDPR) mandates that any data processing likely to result in a high risk to the rights and freedoms of natural persons must be preceded by a Data Protection Impact Assessment (DPIA). This is not a bureaucratic formality; it is a process designed to identify risks and implement specific mitigations *before* a single byte of data is collected. For Worldcoin, the processing of biometric iris data on a global scale constitutes a textbook definition of “high-risk” processing under Article 35. Yet the regulatory interventions by the Spanish AEPD and the Portuguese CNPD expose a widespread failure in Worldcoin’s DPIA process. The assessments—if they were conducted with any rigor—failed to identify, let alone mitigate, the most immediate and damaging risks to the population: the enrollment of minors and the impossibility of data erasure.
The Mandatory Nature of Article 35
Under GDPR Article 35(3), a DPIA is explicitly required for the “systematic monitoring of a publicly accessible area on a large scale” and the “processing on a large scale of special categories of data,” specifically biometric data for the purpose of uniquely identifying a natural person (Article 9). Worldcoin’s operations checked every red flag in the regulation. They deployed proprietary hardware (“Orbs”) in public spaces like shopping malls and transport hubs, specifically to harvest biometric identifiers from millions of individuals. The deficiency in Worldcoin’s compliance strategy was not a clerical error but a fundamental disconnect between their theoretical risk assessment and the operational reality. A compliant DPIA must contain at least four essential components: a systematic description of the processing, an assessment of the necessity and proportionality, an assessment of the risks to the rights and freedoms of data subjects, and the measures envisaged to address the risks.
The evidence from Spain and Portugal confirms that Worldcoin’s DPIA failed catastrophically on the latter two counts.
The Age Verification Blind Spot
The most damning indictment of Worldcoin’s DPIA was its inability to prevent the enrollment of minors. In both Spain and Portugal, data protection authorities received numerous complaints regarding children scanning their irises in exchange for WLD tokens. A robust DPIA would have identified “processing of children’s biometric data” as a critical risk vector requiring fail-safe mitigation. In Portugal, the CNPD’s investigation revealed that there was “no method for verifying the age of the adherents.” The Orb, a sophisticated piece of hardware capable of distinguishing a human iris from a high-resolution photograph, lacked the basic functionality to verify the age of the human standing in front of it. This omission suggests that the DPIA either ignored the risk of underage enrollment or relied on the ineffective measure of “self-attestation,” where a user simply ticks a box claiming to be over 18. For a project processing high-risk biometric data, relying on self-attestation is legally insufficient. The AEPD and CNPD interventions highlighted that the financial incentive—the “free” crypto tokens—created a coercive environment that specifically targeted vulnerable demographics, including minors who may not understand the long-term consequences of handing over their biometric identity. The DPIA failed to account for this behavioral economic factor. If the risk assessment had been sound, it would have mandated the integration of age-gating technology (such as checking a physical ID card) *before* the iris scan could be initiated. The absence of this control proves the DPIA was defective.
The “Write-Only” Database Architecture
A second critical deficiency identified in the regulatory actions was the inability of users to exercise their Right to Erasure (Article 17).
A fundamental requirement of any DPIA is to demonstrate how data subject rights will be upheld. If a system is designed in a way that makes deletion impossible or technically unfeasible, the processing cannot legally proceed. Initial investigations by the AEPD and CNPD found that users who wished to withdraw their consent and have their data deleted faced a labyrinthine process or were told it was impossible. Worldcoin’s initial architecture appeared to be designed as a “write-only” system—once the iris code was generated and hashed onto the blockchain (or centralized database), removing it became a technical contradiction to their “proof of personhood” model. This represents a fatal flaw in the DPIA. The assessment should have flagged the conflict between the “permanent unique identity” goal and the “right to be forgotten” requirement. By launching the Orbs without a verified, functional, and accessible deletion mechanism, Worldcoin proceeded with high-risk processing without the mandatory mitigations in place. The Spanish National Court’s support of the AEPD’s temporary ban was largely predicated on this failure; the commercial interest of the company could not override the fundamental right of a citizen to control their own biometric data.
The Failure of Consultation (Article 36)
GDPR Article 36 states that if a DPIA indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk, the controller *must* consult the supervisory authority prior to processing. The sequence of events in Iberia suggests that Worldcoin either did not identify the risks as “high” (a legal absurdity given the biometric nature of the data) or believed their mitigations were sufficient despite the obvious gaps.
If they had consulted the AEPD or CNPD specifically regarding the deployment of Orbs in Spanish and Portuguese malls, it is highly improbable that the regulators would have signed off on a system with no age verification and no deletion protocol. Instead, Worldcoin appears to have relied on the “One-Stop-Shop” mechanism, channeling their compliance efforts through the Bavarian State Office for Data Protection Supervision (BayLDA). While this is a valid legal pathway for cross-border processing, it does not absolve the company of the responsibility to account for local risks and the actual operational impact in other jurisdictions. The “emergency” nature of the AEPD and CNPD bans (under Article 66) serves as a rebuke to the DPIA submitted to Bavaria; the local regulators declared that the assessment accepted in Germany was insufficient to protect citizens in Spain and Portugal.
The “Black Box” of Proprietary Technology
The DPIA process also demands transparency regarding the technology used. Worldcoin’s Orb is a proprietary device, and the exact nature of the data processing—what happens inside the metal sphere—was unclear to the public and, initially, to regulators. A valid DPIA for such technology must detail the technical specifications: resolution of the scan, nature of the hashing algorithm, storage of the raw image versus the iris code, and transmission security. The regulatory confusion and the need for “forensic analysis” of the Orbs indicate that the DPIA did not provide the necessary level of technical granularity to reassure authorities. The CNPD noted deficiencies in the information provided to data subjects. This is a downstream failure of the DPIA. If the risk assessment had correctly identified the complexity of the technology as a barrier to transparency, the mitigation would have been a robust, plain-language education campaign and clear privacy notices at the point of collection.
Instead, users were presented with complex legalese and a shiny metal orb, creating an information asymmetry that violated the transparency principle (Article 5(1)(a)).
Conclusion: A Retroactive Compliance Failure
The actions taken by Spain and Portugal demonstrate that Worldcoin’s approach to the DPIA was likely retroactive or performative rather than substantive. A DPIA is meant to be a roadmap for safe processing; in Worldcoin’s case, it appears to have been treated as a compliance checkbox while the engineering team built a system that prioritized growth and immutability over privacy and control. The “high risk” to rights and freedoms was not theoretical. It manifested in the form of thousands of minors selling their biometric future for a handful of tokens and thousands of adults unable to reclaim their identity from a centralized database. The deficiencies in the DPIA were the root cause of these violations. By failing to anticipate the obvious risks of deploying biometric scanners in public spaces without age checks or deletion tools, Worldcoin forced regulators to use the nuclear option of a temporary ban, setting a precedent that innovation cannot come at the cost of fundamental rights.
Transparency Failures: Inadequate Disclosures to Data Subjects in Iberian Operations
The foundation of the General Data Protection Regulation (GDPR) rests on the principle of transparency, codified primarily in Articles 12, 13, and 14. These articles mandate that data controllers provide clear, concise, and accessible information to subjects regarding who is collecting their data, why it is being collected, and how long it will be retained. In the Iberian Peninsula, Worldcoin’s operations systematically dismantled these requirements, replacing legal clarity with a spectacle of technological novelty and financial inducement. Both the Spanish AEPD and the Portuguese CNPD identified an information deficit that rendered the “consent” obtained from hundreds of thousands of users legally void. The transparency failures were not administrative oversights; they were structural features of a deployment strategy designed to maximize user acquisition speed at the expense of user understanding.
The “Orb” Encounter: Speed Over Comprehension
The primary point of failure occurred at the physical point of collection: the “Orb.” In major cities like Madrid, Barcelona, and Lisbon, Worldcoin deployed third-party operators to manage these biometric scanning devices. Investigations revealed that these operators were frequently compensated based on the volume of sign-ups, creating a direct financial disincentive to pause and explain complex privacy policies. Instead of a measured legal disclosure, users encountered a high-pressure sales environment focused entirely on the immediate reward of WLD tokens.
Witness reports and regulatory findings indicate that operators rarely provided the privacy notices required by European law. A compliant process would necessitate a clear, concise summary of risks before any biometric data capture occurred. In practice, users were frequently directed to scan first and read later, with the legal terms buried in a mobile app that had not yet been fully installed or configured at the moment of the scan. This inversion of the consent process meant that the biometric capture, an irreversible act, frequently preceded the delivery of the very information necessary to authorize it.
The “Anonymity” Misrepresentation
A central pillar of Worldcoin’s marketing pitch to Iberian users was the claim of “anonymity.” Promotional materials and verbal scripts used by Orb operators frequently asserted that the system did not collect personal data, only a “unique code.” This characterization represents a serious transparency violation under GDPR Article 5(1)(a), which requires data to be processed lawfully, fairly, and transparently.
Technically, the “Iris Code” generated by the Orb is pseudonymous, not anonymous. Under the GDPR, pseudonymous data remains personal data because it can be linked back to a natural person if the specific technical keys or side-channel data (such as the phone number or email used for the wallet) are combined. By telling users their data was “anonymous,” Worldcoin stripped them of an accurate perception of the risk. A user who believes they are anonymous behaves differently than one who understands they are permanently linkable to a biometric hash. The AEPD noted that this obfuscation prevented users from understanding the true nature of the exchange, violating the requirement for “accurate” information.
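The linkage risk can be illustrated with a toy join. All table and field names below are hypothetical; the point is only that a registry storing a pseudonymous code becomes re-identifiable the moment it can be joined against signup records carrying a phone number or email.

```python
# Hypothetical tables: the "anonymous" uniqueness registry, and the
# wallet-signup records that also carry a phone number.
iris_registry = {"code_7f3a": {"verified": True}}
wallet_records = [
    {"iris_code": "code_7f3a", "phone": "+34 600 000 000"},
]

def reidentify(code):
    """Joining the pseudonymous code against side-channel signup data
    recovers a direct identifier — which is why GDPR treats the code
    as pseudonymous personal data, not anonymous data."""
    for rec in wallet_records:
        if rec["iris_code"] == code:
            return rec["phone"]
    return None

assert reidentify("code_7f3a") == "+34 600 000 000"
```

The “anonymity” claim thus depends entirely on the join never being possible, an organizational guarantee rather than a mathematical one.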
AEPD Findings: The Identity Shell Game
The Spanish Data Protection Agency (AEPD) identified a serious failure in identifying the data controller. Article 13(1)(a) requires the controller to state their identity and contact details. In the complex corporate web of Worldcoin, users were frequently unsure whether they were contracting with Tools for Humanity Corporation (a US entity), Tools for Humanity GmbH (a German subsidiary), or the Worldcoin Foundation (a Cayman Islands entity).
This ambiguity was not academic; it had practical consequences for the exercise of rights. When Spanish users attempted to exercise their rights to access or erasure, they frequently found themselves in a bureaucratic loop, unsure of which entity held the legal liability for their biometric template. The AEPD’s precautionary order in March 2024 highlighted that this lack of clarity impeded the exercise of rights, as users cannot sue or petition an entity they cannot definitively identify.
CNPD Findings: The “Hotel California” Problem
In Portugal, the CNPD’s investigation unearthed a more disturbing transparency failure: the omission of information regarding the right to withdraw consent. Article 13(2)(c) explicitly mandates that controllers inform subjects of the existence of the right to withdraw consent at any time. Yet the CNPD found that for a significant period, Worldcoin provided “insufficient information on certain options, preventing them from deleting their data.”
Portuguese regulators discovered that the mechanism for data deletion was either non-existent or so convoluted that it was effectively inaccessible. Users were onboarded with the promise of easy entry but were not informed that exit was technically impossible at that stage of the project’s development. This “Hotel California” architecture, where users can check in but never leave, constitutes a severe breach of the transparency principle. If a user does not know that their consent is irrevocable (or that revocation is technically unsupported), their consent is not “informed.” The CNPD cited this specific failure as a primary justification for its 90-day ban, noting that the inability to stop processing poses a high risk to fundamental rights.
The Language Barrier and Accessibility
The GDPR requires information to be provided in “clear and plain language.” In the Iberian context, this means accessible Spanish and Portuguese, free from excessive legal jargon. Reports surfaced that early iterations of the privacy notices and terms of service presented to Iberian users were either in English or contained poor machine translations that obscured the legal nuance of biometric processing.
Moreover, the complexity of the “Zero-Knowledge Proof” (ZKP) technology used to justify the privacy claims was never adequately explained to the average data subject. While ZKP is a valid privacy-enhancing technology, using it as a buzzword to bypass the explanation of data retention periods and transfer mechanisms violates the transparency obligation. Users must understand what happens to their data, not just the mathematical theory the company hopes will protect it.
Transparency Gap: Promise vs. Reality
| Regulatory Requirement (GDPR Art. 13) | Worldcoin Disclosure (Iberian Operations) | Investigative Reality |
|---|---|---|
| Identity of Controller | Vague | |
The 'Proof of Humanity' Paradox: Biometric Uniqueness vs. GDPR Data Minimization
The central conflict defining Worldcoin’s European operations lies in a fundamental incompatibility between its technical architecture and the core tenets of the General Data Protection Regulation (GDPR). This friction creates what legal scholars and privacy advocates term the ‘Proof of Humanity’ paradox. To achieve its stated goal—distinguishing unique humans from artificial intelligence bots—Worldcoin asserts it must capture the most immutable, high-fidelity biometric data available: the human iris. Yet, Article 5(1)(c) of the GDPR mandates “data minimization,” requiring that personal data be “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.” Spanish and Portuguese regulators have dismantled Worldcoin’s defense by attacking this precise contradiction. The AEPD (Agencia Española de Protección de Datos) and CNPD (Comissão Nacional de Proteção de Dados) determined that the processing of special category biometric data on a mass scale was disproportionate to the commercial objective of distributing cryptocurrency or creating a digital identity network. The paradox is sharp: Worldcoin claims it must process intimate biological data to preserve human agency in an AI-dominated future, while European regulators contend that such processing strips individuals of the very privacy rights the project claims to protect.
The Necessity and Proportionality Test
Under GDPR, processing biometric data (Article 9) requires not just consent but a demonstration of necessity and proportionality. The AEPD’s precautionary order in March 2024 challenged the premise that iris scanning is the only viable method for “Proof of Humanity.” The regulator’s analysis suggested that less intrusive methods, such as government-issued identification, existing trusted intermediaries, or zero-knowledge proofs based on less sensitive metrics, could achieve similar verification results without the permanent risk associated with iris templates. Worldcoin’s counter-argument relies on the concept of “Sybil resistance.” In distributed systems, a Sybil attack occurs when one entity creates multiple fake identities to manipulate the network. Worldcoin maintains that in the age of generative AI, facial recognition and government IDs are easily spoofed, leaving the entropy of the iris as the sole reliable anchor for uniqueness. Yet the Spanish National Court (Audiencia Nacional) rejected this technical justification as a shield against privacy law. In upholding the AEPD’s ban, the court established that the “prevailing general interest” of protecting citizen data overrides the “particular business interest” of the company. The court ruled that the efficiency of a private anti-bot mechanism does not justify the mass collection of biometric data from the general population, particularly when that population includes minors.
The ‘Iris Code’ and the Myth of Anonymity
A critical component of this legal battle involves the technical nature of the “Iris Code.” Worldcoin has repeatedly stated that the Orb does not store raw images of the eye after the initial processing (unless the user opts in for data custody). Instead, the device generates a numerical hash, the Iris Code, which the company claims is a non-reversible, privacy-preserving identifier. Regulators in Spain and Portugal, aligned with the Bavarian lead authority (BayLDA), have scrutinized this claim under the definition of “pseudonymization” versus “anonymization.” Under GDPR, data is only anonymous if the data subject is no longer identifiable. If the data can be used to single out an individual, even just to say “this person has already registered,” it remains personal data. The Iris Code functions specifically as a unique identifier. Its entire purpose is to match a new scan against a database of millions of existing codes to prevent duplicate sign-ups. This functionality places the Iris Code squarely within the scope of biometric data processing. The Article 29 Working Party (predecessor to the EDPB) and current European case law establish that hashed biometric data is still biometric data if it allows for the unique identification or authentication of a natural person. The “one-way” nature of the hash does not absolve the controller of GDPR obligations because the code itself is a sensitive link to the individual’s biological identity. Moreover, security experts have raised concerns that if the hashing algorithm were compromised or if “master keys” existed, the codes could potentially be linked back to other data points, shattering the illusion of anonymity.
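The “singling out” logic can be made concrete with a short sketch. This is a toy illustration in Python, not Worldcoin’s algorithm: real iris codes are matched by Hamming distance on bit vectors rather than exact hash equality, and every name below is hypothetical. The point is that a deterministic, non-reversible digest still answers “has this person already registered?”, which is exactly what keeps it inside the GDPR’s definition of personal data.

```python
import hashlib

def iris_code(template: bytes) -> str:
    """Toy stand-in for an iris hash: a deterministic, non-reversible digest."""
    return hashlib.sha256(template).hexdigest()

# Hypothetical enrolled population: raw templates are discarded, digests kept.
enrolled = {iris_code(b"alice-template"), iris_code(b"bob-template")}

def already_registered(new_scan: bytes) -> bool:
    """The deduplication check. Even though the digest cannot be reversed,
    it singles out one natural person among all enrollees, so under GDPR
    it is pseudonymous (still personal) data, not anonymous data."""
    return iris_code(new_scan) in enrolled

print(already_registered(b"alice-template"))  # True: the same eye always matches
print(already_registered(b"carol-template"))  # False: a new eye does not
```

The digest never reveals the template, but its stability across scans is precisely what makes it an identifier.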
The Immutability Risk Factor
The AEPD and CNPD emphasized the “high risk” nature of this processing due to the immutability of the iris. Unlike a password or a credit card number, an iris pattern cannot be changed. If a database of Iris Codes were leaked, or if future technology allowed for the reverse-engineering of these hashes into recognizable templates, the affected individuals would face a lifetime of vulnerability. This “irreparable harm” standard was the primary legal trigger for the urgent Article 66 interventions in Iberia. The CNPD in Portugal specifically highlighted the scale of data collection, over 300,000 citizens, as an aggravating factor. The regulator argued that the accumulation of such a vast database of high-risk biometric data created a “honey pot” for cybercriminals. The principle of data minimization dictates that companies should not hold data they do not strictly need. By maintaining a centralized (or even distributed) ledger of biometric hashes for the sole purpose of preventing double-dipping in a crypto airdrop, Worldcoin was seen as violating the proportionality requirement. The risk to the data subject was deemed excessive compared to the benefit of receiving a small amount of WLD tokens.
The Failure of Less Intrusive Alternatives
The regulatory crackdown also exposed Worldcoin’s failure to exhaust less intrusive alternatives before resorting to biometric mass surveillance. The GDPR requires data controllers to consider whether the purpose can be achieved by other means. Critics and regulators pointed out that “Proof of Humanity” can be established through webs of trust, social verification, or existing eIDAS-compliant digital identities (such as the Chave Móvel Digital in Portugal or Cl@ve in Spain). Worldcoin’s dismissal of these alternatives rests on its desire for a “permissionless” and “global” system that does not rely on nation-states. Yet, operating within the EU requires adherence to EU law, which prioritizes the rights of the data subject over the architectural preferences of the controller.
Post-Ban Compliance Roadmap: Regulatory Conditions for Resuming Iberian Operations
The route to resuming operations in the Iberian Peninsula is not a return to the status quo of 2023; it is a reconstruction of the project’s entire data architecture under the strict supervision of the Bavarian State Office for Data Protection Supervision (BayLDA), with the Spanish AEPD and Portuguese CNPD acting as vigilant “concerned supervisory authorities” under Article 60 of the GDPR. Following the February 2025 resolution, World (formerly Worldcoin) faces a rigid compliance roadmap that converts the temporary bans into a permanent conditional framework. The era of “move fast and break things” has ended; the new operational mandate requires forensic age assurance, decentralized custody, and retroactive data purging as non-negotiable prerequisites for re-entry.
The Age Verification Mandate: From Checkboxes to Zero-Knowledge Proofs
The primary catalyst for the CNPD’s 90-day suspension in March 2024 was the discovery of widespread underage data collection. The regulator received dozens of complaints regarding minors scanning their irises without parental consent, a direct violation of GDPR Article 8. The compliance roadmap demands a “zero-tolerance” technical barrier against minor registration. The previous method, a simple self-attestation checkbox, is permanently disqualified. To resume Iberian operations, World must implement a forensic age verification system that precedes the biometric scan. This system, likely integrating third-party identity verification tools, must authenticate a user’s government-issued ID (DNI in Spain, Cartão de Cidadão in Portugal) without storing the ID data itself. The roadmap specifies the use of Zero-Knowledge Age Verification (ZKAV), where the Orb receives only a cryptographic “yes/no” signal regarding the user’s majority status. This method must be auditable by local regulators to prove that no personal data from the ID card is retained or linked to the subsequent iris hash. The CNPD has made it clear: any recurrence of underage scanning will trigger immediate Article 66 emergency procedures, bypassing the lead authority mechanism.
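The “yes/no signal” requirement is, at bottom, an interface constraint, and it can be sketched independently of the underlying cryptography. In the Python sketch below, all names are hypothetical and the verification logic is a plain comparison standing in for an actual zero-knowledge proof; what it shows is the data-minimizing shape regulators are asking for. The attestation object that reaches the Orb cannot leak a birth date or document number, because those fields simply do not exist on it.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AgeAttestation:
    """Everything the Orb is allowed to see: a majority-status bit plus an
    opaque audit reference. No birth date, no document number, no name."""
    is_adult: bool
    audit_ref: str  # hypothetical token letting regulators audit the session

def verify_age(document_birth_year: int, current_year: int = 2025) -> AgeAttestation:
    """Toy stand-in for a third-party ZK verifier: it inspects the ID card
    locally and emits only the yes/no signal, discarding the source data."""
    return AgeAttestation(
        is_adult=(current_year - document_birth_year) >= 18,
        audit_ref="zkav-session-0001",  # hypothetical identifier
    )

minor = verify_age(2010)
adult = verify_age(1990)
print(minor.is_adult)  # False: the biometric scan must be refused
print(adult.is_adult)  # True: the scan may proceed, with no birth data attached
```

In a real deployment the boolean would be backed by a cryptographic proof; here the design point is only that the downstream system consumes one bit, nothing more.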
Mandatory Personal Custody: The End of Centralized Storage
The most significant structural change required for re-entry is the total elimination of the “Data Custody” option, which previously allowed World to store encrypted biometric data on its servers. The AEPD’s investigation highlighted that this centralized repository posed an excessive risk to the rights and freedoms of Spanish citizens. The roadmap mandates the exclusive use of “Personal Custody,” a model where the Orb functions solely as a processor, not a storage unit. Under this protocol, the Orb generates the iris code and immediately transmits it to the user’s smartphone, encrypted with a public-private key pair generated on the device. The Orb then permanently deletes the local cache. The backend server receives only the encrypted packet, which it cannot decrypt. This architecture shifts data sovereignty entirely to the user, satisfying the GDPR’s data minimization and storage limitation principles. For Spanish and Portuguese regulators, this is a binary condition: if the Orb retains any biometric fragments post-transmission, or if the central server holds decryption keys, operations remain illegal.
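The described data flow can be sketched end to end. The cipher below is a toy XOR one-time pad standing in for real public-key encryption (which Python’s standard library does not provide), so this illustrates only the custody structure, not production cryptography; all identifiers are hypothetical, not drawn from World’s actual systems.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR cipher standing in for asymmetric encryption; XOR is its own
    inverse, so the same call encrypts and decrypts."""
    return bytes(d ^ k for d, k in zip(data, key))

# 1. The user's phone generates the key; it never leaves the device.
phone_key = secrets.token_bytes(32)

# 2. The Orb derives the iris code, encrypts it for the phone, and then
#    purges its local cache, acting as a processor rather than a store.
iris_code = b"iris-code-bits-demo-payload-xx00"  # hypothetical 32-byte code
packet = xor_cipher(iris_code, phone_key)
iris_code = None  # the Orb's copy is deleted post-transmission

# 3. The backend holds only the opaque packet: no key, no plaintext.
server_store = {"user-42": packet}

# 4. Only the phone, holding the key, can ever recover the code.
recovered = xor_cipher(server_store["user-42"], phone_key)
print(recovered == b"iris-code-bits-demo-payload-xx00")  # True
```

The binary condition in the text maps directly onto steps 2 and 3: if the Orb kept `iris_code` or the server held `phone_key`, the custody guarantee would collapse.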
Retroactive Erasure and the ‘Ghost Data’ Problem
A critical obstacle to resumption is the status of the estimated 400,000 users in Spain and 300,000 in Portugal who registered prior to the bans. The roadmap dictates a strict “re-verify or purge” policy. The AEPD has stipulated that consent obtained under the old model, which lacked adequate disclosures regarding biometric processing and data transfer, is invalid for the new Personal Custody architecture. World must execute a two-phase remediation plan:

1. **Notification and Migration:** Existing users must be notified via the World App to migrate their old iris codes to the Personal Custody model. This requires active, explicit consent.
2. **The Great Purge:** For any user who does not migrate within a specified window, or for whom age cannot be retroactively verified, the associated iris code and all derived metadata must be permanently deleted from the central database.

The use of Secure Multi-Party Computation (SMPC) to “shred” these codes into unusable shares is permitted only if the reconstruction of the original biometric template is mathematically impossible without the user’s private key. The CNPD requires independent certification of this deletion process before new scans can commence.
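The “shredding” requirement maps onto classic secret sharing. A minimal sketch, assuming simple XOR-based n-of-n sharing rather than TACEO’s actual SMPC protocol: each share on its own is uniformly random noise, and only the complete set reconstructs the code, which is the mathematical property the regulators are demanding certification of.

```python
import secrets

def split(secret: bytes, n: int = 3) -> list[bytes]:
    """n-of-n XOR secret sharing: n-1 shares are pure randomness, and the
    last is chosen so that XORing all n shares yields the secret. Any
    proper subset is statistically independent of the secret."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def reconstruct(shares: list[bytes]) -> bytes:
    """XOR every share together to recover the secret."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

code = b"demo-iris-code"
shares = split(code)
# Holding fewer than all shares (e.g. shares[:2]) reveals nothing about `code`.
print(reconstruct(shares) == code)  # True: only the full set recovers it
```

Real SMPC deployments keep the shares on independent parties and compute uniqueness checks without ever reassembling them; this sketch only demonstrates why the shares themselves are cryptographically useless in isolation.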
Decoupling Incentives from Consent
While the BayLDA resolution focused heavily on technical data security, the Spanish High Court’s ruling raised a fundamental problem regarding the validity of consent when conditioned on financial reward. The roadmap addresses this by requiring a clear separation between the “Proof of Human” verification and the receipt of WLD tokens. To satisfy the requirement that consent be “freely given” (GDPR Article 4(11)), World must demonstrate that users can withdraw their consent and delete their biometric data without losing access to previously accrued tokens, provided those tokens were not obtained fraudulently. Further, the “onboarding” process must present the privacy policy and consent forms in local languages (Spanish, Portuguese, Catalan, Basque, Galician) with granular consent options, ensuring users understand they are trading biometric data for a digital asset. The AEPD monitors this specifically to prevent the “coercive” nature of the transaction from exploiting vulnerable populations, a concern that originally contributed to the ban.
Surveillance and Local Audit Rights
The final condition of the roadmap establishes a permanent monitoring mechanism. While BayLDA retains the role of Lead Supervisory Authority, the AEPD and CNPD have secured the right to conduct unannounced local inspections of Orb sites. These inspections verify:

* **Physical Security:** That Orbs are not left unattended or accessible to minors.
* **Operator Training:** That “Orb Operators” are strictly prohibited from encouraging minors to enroll or obfuscating the risks of biometric collection.
* **Signage:** That physical locations display clear, regulator-approved warnings about biometric data processing.

The roadmap is not a guarantee of future stability; it is a probation agreement. World’s operations in Spain and Portugal are subject to a “kill switch.” Any deviation from the Personal Custody model or a single confirmed report of underage data collection authorizes the local regulators to invoke Article 66 again, potentially leading to a permanent, irreversible ban. The burden of proof has shifted entirely to World to demonstrate, on a daily basis, that its vision of a global identity network does not trample on the fundamental rights of Iberian citizens.
Timeline Tracker
March 6, 2024
The March 6 Directive: A Regulatory Guillotine — On March 6, 2024, the Spanish Data Protection Agency (AEPD) executed a rare and decisive legal maneuver against Tools for Humanity Corporation, the German-registered entity behind.
March 11, 2024
Judicial Validation: The Audiencia Nacional Ruling — Tools for Humanity immediately appealed the AEPD's order to the Audiencia Nacional (National Court), seeking an interim injunction to stay the ban. The company argued that.
June 2024
The Aftermath of the Injunction — Following the court's decision, Worldcoin was forced to disable its Orbs across Spain. The app remained downloadable, the verification function—the core of its "Proof of Personhood".
March 26, 2024
The CNPD's Biometric Blockade: A Regulatory Firewall — On March 26, 2024, the Portuguese National Data Protection Commission (CNPD) issued a decisive order that halted the operations of Tools for Humanity within its borders.
March 2024
The 'One-Stop-Shop' Friction: BayLDA's Authority vs. Iberian Regulators — BayLDA (Germany) Investigation (Nov 2022, Dec 2024); Deletion Order Article 60 (Cooperation), Article 58 (Powers) Technical complexity; absence of legal basis for storage; Security flaws (Art.
March 2024
The Economics of "Freely Given" Consent — Under GDPR Article 4(11), consent is valid only if it is free, specific, informed, and unambiguous. The European Data Protection Board (EDPB) Guidelines 05/2020 clarify that.
August 2024
The BayLDA's Retroactive Erasure Order — The culmination of these consent failures appeared in the administrative orders filtering down from the Bavarian State Office for Data Protection Supervision (BayLDA), the lead authority.
2023-2024
Forensic Analysis of the 'Orb' Age Verification Mechanisms and Failures — The forensic deconstruction of the 'Orb'—Worldcoin's chrome-shelled biometric imaging device—reveals a catastrophic disconnect between its advanced optical engineering and its rudimentary compliance. For a device marketed.
March 2024
From Centralized Databases to Personal Custody: A Regulatory Compliance Pivot — The architectural history of Worldcoin bifurcates sharply in March 2024. For the three years of its operation, the project relied on a centralized model where biometric.
March 22, 2024
The Mechanics of Personal Custody — On March 22, 2024, sixteen days after the Spanish ban, Worldcoin announced it would no longer store iris codes by default. Under the new Personal Custody.
May 15, 2024
Secure Multi-Party Computation (SMPC) — To maintain the integrity of the "Proof of Personhood" network without a central database to check against, Worldcoin implemented a cryptographic system known as Secure Multi-Party.
April 9, 2024
The "Unverify" Protocol and Right to Erasure — Alongside the storage pivot, Worldcoin introduced the ability to "unverify" a World ID, directly addressing the AEPD's citation regarding the inability to withdraw consent. Announced on.
March 26, 2024
Regulatory Skepticism and the Extended Ban — The introduction of Personal Custody and SMPC did not result in the immediate lifting of the Iberian bans. The AEPD maintained its blockade, and the CNPD.
2021-2024
Comparison of Architectures — The transition to Personal Custody represents a fundamental restructuring of Worldcoin's liability profile. By physically moving the data to the user's pocket, the company attempts to.
March 11, 2024
Judicial Review: The Spanish National Court's Ruling on Privacy vs. Commercial Interests — On March 11, 2024, the Contentious-Administrative Chamber of the Spanish National Court (*Audiencia Nacional*) delivered a decisive blow to Tools for Humanity's operations in Spain. In.
June 2024
Aftermath and Voluntary Extension — Following the court's refusal to grant the injunction, TFH was forced to comply with the order to cease all "Orb" operations in Spain. The legal defeat.
February 2025
The February 2025 BayLDA Resolution: Mandating Deletion of Spanish Biometric Data — In February 2025, the regulatory siege on Worldcoin's operations in the Iberian Peninsula reached its.
May 2024
The Cryptographic Pivot: From Centralized Storage to Secret Shares — In May 2024, facing existential regulatory threats from the AEPD in Spain and the CNPD in Portugal, the World Foundation executed a radical architectural overhaul of.
May 15, 2024
The TACEO Factor and the "Deletion" Event — A central pillar of World's defense before the BayLDA (Bavarian State Office for Data Protection Supervision) was the assertion that the migration to SMPC allowed for.
December 2024
Regulatory Friction: Pseudonymization vs. Anonymization — Even with the technical sophistication of SMPC, the BayLDA's December 2024 conclusion indicated that the measure did not fully absolve World of its GDPR obligations. The.
April 2024
Audit Verification: Least Authority and Trail of Bits — To substantiate its claims, World engaged security firms to audit the new infrastructure. Least Authority performed a security audit of the SMPC protocol in April 2024, examining.
2021-2024
Comparative Analysis: Legacy vs. SMPC Architecture — The following table contrasts the data protection posture of the original system against the SMPC implementation, highlighting the specific GDPR articles addressed. Data Storage Full Iris.
March 2024
AEPD Findings: The Identity Shell Game — The Spanish Data Protection Agency (AEPD) identified a critical failure in identifying the data controller. Article 13(1)(a) requires the controller to state their identity and contact.
March 2024
The Necessity and Proportionality Test — Under GDPR, processing biometric data (Article 9) requires not just consent but also a demonstration of necessity and proportionality. The AEPD's precautionary order in March 2024 challenged the.
February 2025
Post-Ban Compliance Roadmap: Regulatory Conditions for Resuming Iberian Operations — The route to resuming operations in the Iberian Peninsula is not a return to the status quo of 2023; it is a reconstruction of the project's entire data.
March 2024
The Age Verification Mandate: From Checkboxes to Zero-Knowledge Proofs — The primary catalyst for the CNPD's 90-day suspension in March 2024 was the discovery of widespread underage data collection. The regulator received dozens of complaints regarding.
Tell me about the March 6 directive: a regulatory guillotine of Worldcoin.
On March 6, 2024, the Spanish Data Protection Agency (AEPD) executed a rare and decisive legal maneuver against Tools for Humanity Corporation, the German-registered entity behind Worldcoin. The regulator invoked Article 66 of the General Data Protection Regulation (GDPR), a provision reserved for "exceptional circumstances" requiring immediate intervention to protect the rights and freedoms of individuals. This order mandated the immediate cessation of all biometric data collection within Spanish territory.
Tell me about the catalogue of infringements of Worldcoin.
The AEPD's dossier against Worldcoin listed four primary violations that necessitated the Article 66 invocation. First, the agency identified a widespread failure to provide sufficient information to users regarding the processing of their data. The complex nature of biometric hashing and blockchain integration was not adequately explained to the average consumer, many of whom were enticed solely by the prospect of receiving WLD tokens, then valued at approximately €70-€80. Second, and most.
Tell me about the judicial validation: the Audiencia Nacional ruling of Worldcoin.
Tools for Humanity immediately appealed the AEPD's order to the Audiencia Nacional (National Court), seeking an interim injunction to stay the ban. The company argued that the suspension would cause "irreparable harm" to its business, citing economic losses and reputational damage. They also contended that the AEPD had overstepped its jurisdiction by ignoring the competence of the Bavarian lead authority. On March 11, 2024, the Audiencia Nacional delivered a crushing.
Tell me about the "one-stop-shop" bypass of Worldcoin.
The legal significance of this case extends beyond Worldcoin. It exposes the friction within the GDPR's enforcement architecture. The One-Stop-Shop mechanism was designed to simplify regulation, allowing companies to deal with a single authority. Yet, critics have long argued that this system creates bottlenecks, particularly when the lead authority (in this case, Bavaria) is seen as slow to act. By invoking Article 66, Spain demonstrated that national authorities retain the.
Tell me about the aftermath of the injunction of Worldcoin.
Following the court's decision, Worldcoin was forced to disable its Orbs across Spain. The app remained downloadable, but the verification function, the core of its "Proof of Personhood," was deactivated. This halted the company's growth in one of its most active European markets. The AEPD's aggressive stance also triggered a domino effect. The Portuguese Data Protection Commission (CNPD) followed suit shortly after, issuing a similar temporary ban citing the same concerns regarding.
Tell me about the CNPD's biometric blockade: a regulatory firewall of Worldcoin.
On March 26, 2024, the Portuguese National Data Protection Commission (CNPD) issued a decisive order that halted the operations of Tools for Humanity within its borders. This directive, identified formally as Deliberation/2024/137, mandated an immediate suspension of all biometric data collection by the Worldcoin Foundation for a period of 90 days. The regulator's intervention was not a routine administrative check; it was an emergency measure invoked under Article 66 of.
Tell me about the "minor" problem: exploiting the at-risk of Worldcoin.
The catalyst for this regulatory crackdown was a surge of complaints specifically regarding minors. In the month leading up to the ban, the CNPD received dozens of reports from parents and legal guardians who discovered their children had been scanned by Worldcoin Orbs without their knowledge or permission. These reports painted a disturbing picture of the project's field operations. Children, motivated by the prospect of free cryptocurrency, were queuing at.
Tell me about the technical negligence in the Orb design of Worldcoin.
The failure in Portugal exposed a fundamental flaw in the design and deployment of the Worldcoin Orb. The device relies on advanced multispectral sensors to capture high-resolution images of the iris, which are then converted into a unique "IrisCode." While the technology is sophisticated in its ability to detect liveness and prevent duplicate registrations, it completely ignored the legal necessity of age verification. In the physical world, a bank or.
Tell me about Worldcoin's response and the Personal Custody pivot.
In the wake of the CNPD's order, Tools for Humanity attempted to manage the fallout. Jannick Preiwisch, the Data Protection Officer for the Worldcoin Foundation, issued a statement claiming the organization had "zero tolerance" for underage sign-ups. He asserted that the company was working to address the problem, even suggesting that the reports to the CNPD were the first they had heard of the specific complaints. This claim of ignorance stood in.
Tell me about the 'one-stop-shop' friction: BayLDA's authority vs. Iberian regulators of Worldcoin.
| Authority | Action | Legal Basis Invoked | Grounds |
| --- | --- | --- | --- |
| BayLDA (Germany) | Investigation (Nov 2022, Dec 2024); deletion order | Article 60 (Cooperation), Article 58 (Powers) | Technical complexity; absence of legal basis for storage; security flaws (Art. 32) |
| AEPD (Spain) | Immediate ban (March 2024) | Article 66 (Urgency Procedure) | Processing of minors' data; inability to withdraw consent; high risk to rights |
| CNPD (Portugal) | Temporary limitation (March 2024) | Article 66 (Urgency Procedure) | Protection of minors; absence of age verification; excessive data collection |
Tell me about coerced consent: the legality of exchanging iris scans for WLD tokens of Worldcoin.
The central mechanism of Worldcoin's expansion—the exchange of high-resolution iris scans for WLD tokens—presents a fundamental conflict with the General Data Protection Regulation (GDPR). While the company frames this transaction as "proof of personhood" rewarded by a share in the network, European regulators view it through the lens of Article 4(11) of the GDPR, which demands that consent be "freely given." The core legal question facing the Agencia Española de.
Tell me about the economics of "freely given" consent of Worldcoin.
Under GDPR Article 4(11), consent is valid only if it is free, specific, informed, and unambiguous. The European Data Protection Board (EDPB) Guidelines 05/2020 clarify that consent is not free if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment. In the context of Worldcoin, the "detriment" is the forfeiture of WLD tokens, which held significant market value during the.