Ethereum co-founder Vitalik Buterin recently co-authored a research paper whose primary focus was integrating privacy features into blockchain transactions while ensuring compliance with a range of regulatory requirements.
Experts from various backgrounds collaborated on this research project, including early Tornado Cash contributor Ameen Soleimani, Chainalysis chief scientist Jacob Illum, and researchers from the University of Basel.
The diverse team reflects the interdisciplinary nature of the research, drawing insights from cryptocurrency, blockchain security and academic scholarship.
The paper proposes a protocol known as “Privacy Pools,” a regulation-compliant tool aimed at improving the confidentiality of user transactions.
How do Privacy Pools work?
Privacy Pools, as Buterin and his co-authors explain in the research paper, aim to protect the privacy of transactions while separating lawful funds from those tied to criminal activity. Deposits are organized into isolated sets, or categories, allowing users to prove to regulators that their funds are not mixed with illicit funds.
This is accomplished through the use of techniques like zero-knowledge proofs to demonstrate the legitimacy of the transactions and the absence of involvement with criminal activities.
Zero-knowledge proofs are cryptographic techniques that allow one party (the prover) to demonstrate knowledge of a specific piece of information to another party (the verifier) without revealing any details about the information itself.
When users want to take their money out of the Privacy Pool, they can choose to create a zero-knowledge proof. This proof does two things: First, it confirms that the user’s transaction is legitimate and doesn’t involve a blockchain address associated with criminal activity. Second — and more importantly for users — it keeps their identities private.
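To make that idea more concrete, here is a minimal, non-cryptographic Python sketch of the statement such a withdrawal proof attests to. The function and variable names are illustrative rather than taken from the paper, and no real zero-knowledge machinery is used; in the actual protocol, a zero-knowledge proof convinces a verifier that this relation holds without revealing the secret or which deposit it corresponds to.

```python
import hashlib


def commitment(secret: str) -> str:
    """A deposit's public commitment (modeled here as a simple hash)."""
    return hashlib.sha256(secret.encode()).hexdigest()


def withdrawal_statement_holds(association_set: set, secret: str) -> bool:
    """The relation the prover demonstrates: 'I know a secret whose
    commitment is one of the deposits in this association set.'
    A real zero-knowledge proof would reveal neither the secret nor
    which commitment it matches."""
    return commitment(secret) in association_set


# Four honest deposits form an association set of commitments.
secrets = {"alice": "s1", "bob": "s2", "carl": "s3", "david": "s4"}
honest_set = {commitment(s) for s in secrets.values()}

print(withdrawal_statement_holds(honest_set, secrets["alice"]))  # True
print(withdrawal_statement_holds(honest_set, "unknown-secret"))  # False
```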
Association sets
Another crucial part of how Privacy Pools work is the idea of “association sets,” subsets of wallet addresses within a cryptocurrency pool. When making withdrawals from the pool, users specify which association set to use. These sets are designed to include only noncritical or “good” depositors’ wallet addresses while excluding those considered “bad” depositors.
The purpose of association sets is to maintain anonymity, as withdrawn funds can’t be precisely traced to their source. However, it can still be proven that the funds come from a noncritical source.
Association set providers (ASPs) create these sets and are trusted third parties responsible for analyzing and evaluating the pool’s contributing wallets. They rely on blockchain analytics tools and technologies used in Anti-Money Laundering and transaction analysis.
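As one illustration of how an ASP might publish such a set compactly on-chain, the sketch below folds a list of approved deposit commitments into a single Merkle root. Representing the association set as a Merkle root is an assumption made here for illustration, and the helper names are hypothetical, not drawn from the paper.

```python
import hashlib


def h(data: bytes) -> bytes:
    """SHA-256 hash helper."""
    return hashlib.sha256(data).digest()


def merkle_root(leaves: list) -> bytes:
    """Fold a list of deposit commitments into one root hash."""
    if not leaves:
        return h(b"")
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]


# The ASP screens deposits with its analytics tooling, then publishes
# one compact commitment to the whole approved set.
approved_deposits = [b"deposit-alice", b"deposit-bob", b"deposit-carl", b"deposit-david"]
association_set_root = merkle_root(approved_deposits)
print(association_set_root.hex())
```

A withdrawal can then reference this single root rather than enumerating every member of the set.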
Association sets are formed through two distinct processes: inclusion (membership) proofs and exclusion proofs.
Inclusion, also known as membership, is the process of curating a selection based on positive criteria, much like creating a “good” list. When considering deposits, for instance, you examine various options and identify those with clear evidence of being secure and low-risk.
Exclusion involves forming a selection by focusing on negative criteria, much like compiling a “bad” list. In the context of deposits, ASPs evaluate different options and pinpoint those that are evidently risky or unsafe. Subsequently, they generate a list that comprises all deposits except for the ones categorized as risky, thereby excluding them from the list.
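In code, the difference between the two curation approaches comes down to a pair of set operations, sketched below in Python. The deposit IDs and screening lists are placeholders for whatever an ASP’s analytics would actually produce.

```python
# All deposits currently sitting in the pool (placeholder IDs).
all_deposits = {"d1", "d2", "d3", "d4", "d5"}

# Inclusion (membership): keep only deposits with positive evidence of being safe.
known_good = {"d1", "d2", "d3", "d4"}
inclusion_set = all_deposits & known_good

# Exclusion: start from everything and strike out deposits flagged as risky.
known_bad = {"d5"}
exclusion_set = all_deposits - known_bad

print(inclusion_set)  # {'d1', 'd2', 'd3', 'd4'}
print(exclusion_set)  # {'d1', 'd2', 'd3', 'd4'}
```

The two sets coincide in this toy case, but they treat uncertainty differently: an inclusion set leaves out deposits that are merely unvetted, while an exclusion set keeps them in until they are flagged.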
The paper uses the example of a group of five people: Alice, Bob, Carl, David and Eve. Four are honest, law-abiding individuals who want to keep their financial activities private.
However, Eve is a thief or hacker, and this is well known. People may not know who Eve really is, but they have enough proof to know that the coins sent to the address labeled “Eve” come from a “bad” source.
When these individuals use the Privacy Pool to withdraw money, ASPs group them with other users into association sets based on their deposit history.
Alice, Bob, Carl and David want to keep their transactions private while minimizing the chance that those transactions look suspicious. Their deposits have not been linked to any potentially malicious activity, so the ASP associates them only with one another. A set is created containing just their deposits: Alice, Bob, Carl and David.
Eve, on the other hand, also wants to protect her privacy, but her own deposit, which comes from a bad source, cannot be left out. So, she’s added to a separate association set that includes her deposit and the others, forming a group with all five users’ deposits: Alice, Bob, Carl, David and Eve.
Essentially, Eve is excluded from the original group of trusted deposits (Alice, Bob, Carl and David) and is instead added to a separate group that includes her transactions alongside the others. However, this doesn’t mean that Eve can use the Privacy Pool to mix her funds.
Now, here’s the interesting part: Even though Eve doesn’t provide any direct information about herself, it becomes clear by the process of elimination that the fifth withdrawal must be hers, as her withdrawal is the only one associated with all five deposits (since she was added to the separate group that included every deposit).
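The same process of elimination can be spelled out in a few lines of Python. The set construction follows the paper’s example; the withdrawal labels are illustrative.

```python
# The ASP's curated set and the broader set Eve is forced to use.
good_set = {"Alice", "Bob", "Carl", "David"}
eves_set = good_set | {"Eve"}

# Each withdrawal publicly references an association set.
withdrawals = {
    "withdrawal_1": good_set,
    "withdrawal_2": good_set,
    "withdrawal_3": good_set,
    "withdrawal_4": good_set,
    "withdrawal_5": eves_set,
}

# Any withdrawal that can only reference the broader five-deposit set
# must come from the one deposit excluded from the curated set.
for wid, assoc in withdrawals.items():
    if assoc == eves_set:
        print(f"{wid} must originate from: {(assoc - good_set).pop()}")
        # -> withdrawal_5 must originate from: Eve
```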
Association sets help Privacy Pools by separating trustworthy users from questionable ones.
This way, transactions from reliable sources stay private, while any shady or suspicious ones become more visible and easier to spot.
As a result, malicious actors can be tracked, which helps satisfy regulatory requirements, since bad users won’t be able to use the pools to hide their activities.
What are others saying about the proposals?
Buterin’s paper has sparked discussions and garnered attention from the blockchain community and industry experts. Ankur Banerjee, co-founder and chief technology officer of Cheqd — a privacy-preserving payment network — believes Privacy Pools can make it easier for noncentralized entities to identify bad actors.
Banerjee told Cointelegraph, “The approach outlined could make this kind of money laundering analysis more democratized, and available to DeFi protocols as well. In fact, in the case of crypto hacks, it’s very hard to prevent hackers from trying to launder what they’ve stolen via DeFi protocols — it’s only centralized exchanges where they can be more easily caught/stopped.”
Seth Simmons (aka Seth For Privacy), host of the privacy-focused podcast Opt Out, told Cointelegraph, “While the concept is technically interesting in that it does minimize the data given over to regulated entities, it asks and answers the wrong question. It asks the question ‘What privacy are we allowed to have?’ instead of ‘What privacy do we need to have?’”
Simmons continued, saying, “For years now, there has been no balance between user anonymity and regulatory compliance, with the current ruling powers having an almost total visibility into the actions we take and the ways we use our money.”
“Privacy Pools must seek to right this imbalance by providing the maximum privacy for users possible today instead of attempting to lessen that privacy to please regulators.”
Banerjee expressed concerns about the built-in delays for adding deposits to association sets, stating, “Tokens can’t immediately get included in a ‘good’ or ‘bad’ set since it takes some time to figure out whether they are ‘good’ or ‘bad.’ The paper suggests a delay similar to seven days before inclusion (this could be higher or lower).”
Banerjee continued, “But what’s the right amount of time to wait? Sometimes, like in the case of crypto hacks, it’s very obvious soon after the hack that the coins might be bad. But in the case of complex money laundering cases, it might be weeks, months or even years before tokens are figured out to be bad.”
Despite these concerns, the paper says deposits won’t be included in association sets if they are linked to known bad behavior, such as thefts and hacks. So, as long as malicious behavior is detected in time, this should not be an issue.
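A rough sketch of that screening-plus-delay logic is shown below, with the seven-day figure taken from the example above and everything else a placeholder assumption rather than part of the paper’s specification.

```python
from datetime import datetime, timedelta, timezone

# Buffer period before a deposit can enter a "good" association set
# (the seven-day figure mirrors the example discussed above).
INCLUSION_DELAY = timedelta(days=7)


def eligible_for_inclusion(deposited_at: datetime, flagged_as_bad: bool, now: datetime) -> bool:
    """A deposit qualifies only if the waiting period has passed and it
    has not been linked to known bad behavior in the meantime."""
    return (not flagged_as_bad) and (now - deposited_at >= INCLUSION_DELAY)


now = datetime.now(timezone.utc)
print(eligible_for_inclusion(now - timedelta(days=10), False, now))  # True
print(eligible_for_inclusion(now - timedelta(days=2), False, now))   # False: still inside the buffer
print(eligible_for_inclusion(now - timedelta(days=10), True, now))   # False: linked to a theft or hack
```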
Additionally, people with “good” deposits can prove they belong to a trusted group and gain rewards. Those with “bad” funds can’t prove their trustworthiness, so even if they deposit them in a shared pool, they won’t gain any benefits. People can easily spot that these bad funds came from questionable sources when they’re withdrawn from a privacy-enhancing system.
Recent regulatory actions
Recent actions within the blockchain space have underscored the critical need for privacy and compliance solutions. One notable incident involved the United States government imposing sanctions on Tornado Cash, a cryptocurrency mixing service.
This move was prompted by allegations that Tornado Cash had facilitated transactions for the North Korea-linked hacking group Lazarus. These sanctions effectively signaled the U.S. government’s heightened scrutiny of privacy-focused cryptocurrency services and their potential misuse for illicit purposes.
Chris Blec, host of the Chris Blec Conversations podcast, told Cointelegraph, “It’s the easy way out to just look at recent news and decide that you need to start building to government specifications, but sadly, that’s how many devs will react. They’re not here for the principle but for the profit. My advice to those who care: Build unstoppable tech and separate it from your real-world identity as much as possible.”
As the adoption of cryptocurrencies and decentralized applications continues to grow, governments and regulatory bodies worldwide are grappling with how to balance enabling innovation with safeguarding against illegal activity.
Simmons believes it is better to have tools governments cannot shut down: “Regulators will continue to push the imbalance of privacy and surveillance further in their direction unless we actively seek to build tools that give power back to the individual.”
He continued, “Tornado Cash is a perfect example of this, as they even went above and beyond and complied with regulators as much as was technically possible, and yet that wasn’t enough for ‘them.’ Even after supposedly becoming compliant, they remained a target of the U.S. government because governments do not want a balance between compliance and privacy — they want total surveillance, which leads to total power.”
“What we need to build in the space are tools (like Tornado Cash) that are resistant to state-level attacks and impossible to shut down or censor, as this is the only way to ensure we have tools at our disposal to defend our freedoms and keep governments in check. Privacy or bust.”