Research Pulse #82 09/12/2022

  1. Zapper: Smart Contracts with Data and Identity Privacy
    Authors: Samuel Steffen, Benjamin Bichsel, and Martin Vechev

Privacy concerns prevent the adoption of smart contracts in sensitive domains incompatible with the public nature of shared ledgers.
We present Zapper, a privacy-focused smart contract system allowing developers to express contracts in an intuitive frontend. Zapper hides not only the identity of its users but also the objects they access—the latter is critical to prevent deanonymization attacks. Specifically, Zapper compiles contracts to an assembly language executed by a non-interactive zero-knowledge processor and hides accessed objects by an oblivious Merkle tree construction.
We implemented Zapper on an idealized ledger and evaluated it on realistic applications, showing that it allows generating new transactions within 22 s and verifying them within 0.03 s (excluding the time for consensus). This performance is in line with the smart contract system ZEXE (Bowe et al., 2020), which offers analogous data and identity privacy guarantees but suffers from multiple shortcomings affecting security and usability.

Link to Paper

  • Privacy is at the forefront of industry discussions given the recent arrest of the developer of Tornado Cash.
  • Arguably, achieving privacy on blockchains is the only way that many critical use cases for this technology, especially those involving personal data, can be realized.
  • This paper presents a fascinating scheme called Zapper, a proof-of-concept private smart contract system that hides both the identity of its users and the applications they interact with.
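Zapper's oblivious Merkle tree builds on standard Merkle membership proofs. As a point of reference, here is a minimal sketch of plain (non-oblivious) Merkle proof construction and verification, assuming SHA-256 and a power-of-two leaf count; this illustrates the underlying primitive only, not the paper's oblivious construction.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    # Hash the leaves, then pairwise-combine each level until one root remains.
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    # Collect the sibling hash at each level for the given leaf index.
    level, proof = [h(leaf) for leaf in leaves], []
    while len(level) > 1:
        proof.append(level[index ^ 1])  # sibling node at this level
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(root, leaf, index, proof):
    # Recompute the path from leaf to root using the supplied siblings.
    node = h(leaf)
    for sibling in proof:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root
```

In a zero-knowledge setting such as Zapper's, the verification loop above is what gets expressed inside the proof system, so that membership can be shown without revealing which leaf (object) was accessed.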
  2. Measuring node decentralisation in blockchain peer to peer networks
    Authors: Andrew Howell, Takfarinas Saber, and Malika Bendechache

New blockchain platforms are launching at a high cadence, each fighting for attention, adoption, and infrastructure resources. Several studies have measured the peer-to-peer network decentralisation of Bitcoin and Ethereum (i.e., two of the largest used platforms). However, with the increasing demand for blockchain infrastructure, it is important to study node decentralisation across multiple blockchain networks, especially those containing a small number of nodes. In this paper, we propose NodeMaps, a data processing framework to capture, analyse, and visualise data from several popular P2P blockchain platforms such as Cosmos, Stellar, Bitcoin, and Lightning Network. We compare and contrast the geographic distribution, the hosting provider diversity, and the software client variance in each of these platforms. Through our comparative analysis of node data, we found that Bitcoin and its Lightning Network layer 2 protocol are widely decentralised P2P blockchain platforms, with the largest geographical reach and a high proportion of nodes operating on The Onion Router (TOR) privacy-focused network. The Cosmos and Stellar blockchains have reduced node participation, with nodes predominantly operating in large cloud providers or well-known data centres.

Link to Paper

  • One of the biggest determinants of a network’s decentralization is the number of independent nodes, or participants, it retains over time.
  • The very same principle is true for cryptoasset networks, which often comprise thousands of nodes hosted across many services and geographies.
  • This paper evaluates how cryptoasset nodes can be monitored and sheds light on the IP distribution and hosting providers of different cryptoassets.
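One simple way to turn node-crawl data like this into a decentralization figure is a concentration index over hosting providers. A minimal sketch using the Herfindahl-Hirschman index (this metric and the provider names are illustrative assumptions, not taken from the paper):

```python
from collections import Counter

def hhi(labels):
    # Herfindahl-Hirschman index of a categorical distribution:
    # the sum of squared shares, ranging from 1/n (perfectly even
    # across n categories) up to 1.0 (everything in one category).
    counts = Counter(labels)
    total = sum(counts.values())
    return sum((c / total) ** 2 for c in counts.values())

# Hypothetical crawl result: the hosting provider of each observed node.
nodes = ["AWS", "AWS", "Hetzner", "OVH", "AWS", "Hetzner", "Home-ISP"]
concentration = hhi(nodes)  # higher value = more centralized hosting
```

The same index can be applied per-dimension (country, ASN, client version) to compare networks along the axes the paper studies.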
  3. Authentication, Authorization, and Selective Disclosure for IoT data sharing using Verifiable Credentials and Zero-Knowledge Proofs
    Authors: Nikos Fotiou, Iakovos Pittaras, Spiros Chadoulos, Vasilios A. Siris, George C. Polyzos, Nikolaos Ipiotis, and Stratos Keranidis

As IoT becomes omnipresent, vast amounts of data are generated that can be used for building innovative applications. However, interoperability issues and security concerns prevent harvesting the full potential of these data. In this paper we consider the use case of data generated by smart buildings. Buildings are becoming ever “smarter” by integrating IoT devices that improve comfort through sensing and automation. However, these devices and their data are usually siloed in specific applications or manufacturers, even though they can be valuable for various interested stakeholders who provide different types of “over the top” services, e.g., energy management. Most data sharing techniques follow an “all or nothing” approach, creating significant security and privacy threats, even though partially revealed, privacy-preserving data subsets can fuel innovative applications. With these in mind, we develop a platform that enables controlled, privacy-preserving sharing of data items. Our system innovates in two directions: firstly, it provides a framework for allowing discovery and selective disclosure of IoT data without violating their integrity. Secondly, it provides user-friendly, intuitive mechanisms allowing efficient, fine-grained access control over the shared data. Our solution leverages recent advances in the areas of Self-Sovereign Identities, Verifiable Credentials, and Zero-Knowledge Proofs, and it integrates them in a platform that combines the industry-standard authorization framework OAuth 2.0 and the Web of Things specifications.

Link to Paper

  • The intersection of blockchains and IoT continues to garner substantial interest from academia and industry participants.
  • Given the highly sensitive nature of IoT data, many critics have shown concern for this trend due to the low privacy guarantees that blockchains currently offer.
  • This paper explores the use of Zero Knowledge Proofs (ZKPs) to obfuscate aggregated IoT data and limit the potential for privacy attacks.
  • While it does not apply this schema to blockchains, the paper does shed light on the primitives that would be required in order for IoT applications to operate on blockchains in a private setting.
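A simple primitive behind selective disclosure is a salted hash commitment per attribute: the data owner commits to every field, then later reveals only some values with their salts while the rest stay hidden but verifiable. The sketch below illustrates that idea in isolation; it is a toy stand-in, not the paper's actual Verifiable Credential or ZKP construction.

```python
import hashlib
import secrets

def commit(attrs):
    # Commit to each attribute separately with a fresh random salt,
    # so any subset can later be revealed without exposing the rest.
    salts = {k: secrets.token_hex(16) for k in attrs}
    digests = {k: hashlib.sha256(f"{k}={v}|{salts[k]}".encode()).hexdigest()
               for k, v in attrs.items()}
    return digests, salts

def verify_disclosure(digests, key, value, salt):
    # A verifier holding only the digests checks one revealed attribute.
    expected = hashlib.sha256(f"{key}={value}|{salt}".encode()).hexdigest()
    return digests.get(key) == expected
```

Zero-knowledge proofs go one step further than this sketch, proving statements about a hidden value (e.g., "the reading is below a threshold") without revealing the value or salt at all.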
  4. A comparison of the limit order book and automated market maker
    Author: Caren de Vries

All investment firms in Europe must track the market quality of different exchanges and choose the best for their clients. Previous research has shown that a limit order book with additional market makers performs best in terms of market quality, followed by a pure limit order book and market maker. With the rise of cryptocurrencies, a new exchange mechanism called the automated market maker has appeared in the crypto markets. It is unclear how it compares to the limit order book regarding market quality. This thesis compares the automated market maker with the limit order book. We built a simulation and tested seven different scenarios, varying in liquidity, information, price variability, and fees. Six out of seven scenarios indicated that the automated market maker outperforms the limit order book.

Link to Paper

  • The trading of assets is one of the key use-cases in DeFi, with platforms like Uniswap frequently settling billions of dollars worth of trades.
  • DeFi exchanges are often implemented using the Automated Market Maker (AMM) model, which is substantially different from traditional orderbook exchanges.
  • This paper provides a comprehensive comparison of these two market types and covers the mathematical underpinnings of each.
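The AMM design most common in DeFi (e.g., Uniswap v2) prices trades with the constant-product invariant x·y = k: a swap must leave the product of the two reserves unchanged, after the fee is taken from the input. A minimal sketch, assuming a Uniswap-v2-style 0.3% pool fee and ignoring integer rounding:

```python
def swap_out(x_reserve, y_reserve, dx, fee=0.003):
    # Constant-product AMM: (x + dx_eff) * (y - dy) = x * y,
    # where dx_eff is the input amount after the pool fee.
    # Solving for dy gives the output the trader receives.
    dx_eff = dx * (1 - fee)
    return y_reserve * dx_eff / (x_reserve + dx_eff)
```

For example, swapping 100 units into a 1000/1000 pool returns fewer than 100 units out; the gap grows with trade size relative to the reserves, which is the price-impact behavior the thesis compares against the limit order book.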
  5. WTEYE: On-chain wash trade detection and quantification for ERC20 cryptocurrencies
    Authors: Wei Cui and Cunnian Gao

Wash trading is a common form of volume manipulation used to attract investors into the market and mislead them into making poor investment judgments. Wash trade transactions are especially prominent in ERC20 cryptocurrencies. In this paper, we propose two kinds of algorithms to preserve direct evidence of wash trading based on the on-chain transaction data of ERC20 cryptocurrencies. After labeling wash trades, we extract their features and quantify the volume of wash trading. Our experiments show that for most ERC20 cryptocurrencies the rate of wash trading exceeds 15%. Notably, over 30% of UNI token transactions were labeled as wash trades. This demonstrates that the reported activity of most ERC20 cryptocurrencies is not genuine and that restoring real data is necessary for market regulation.

Link to Paper

  • Also on the topic of trading, one of the biggest concerns with permissionless AMMs is their susceptibility to nefarious market attacks, such as wash trading.
  • For context, wash trading is a form of market manipulation where an entity inflates an asset’s trading volume, and potentially its price, by simultaneously buying and selling it.
  • In traditional markets, exchange operators monitor the identity of buyers and sellers to prevent this from happening. However, that is not possible (or perhaps even desired) in the context of AMMs.
  • This paper does a great job proposing a detection model for wash trading in AMMs and quantifies how much it has taken place.
  • Astonishingly, their experiments find that, for most ERC20s, the rate of wash trading is at least 15% of the token’s volume.
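A crude flavor of on-chain wash-trade detection is a round-trip heuristic: flag transfers where the reverse edge (receiver back to sender) appears within a short time window. This toy sketch is not the paper's algorithms, just an illustration of the kind of transfer-graph analysis involved:

```python
def flag_wash_trades(transfers, window=600):
    # transfers: list of (timestamp, sender, receiver, amount) tuples.
    # Flag a pair of transfers if the reverse edge (receiver -> sender)
    # occurs within `window` seconds -- a crude round-trip heuristic.
    flagged = set()
    for i, (t1, s1, r1, _) in enumerate(transfers):
        for j, (t2, s2, r2, _) in enumerate(transfers):
            if i != j and s2 == r1 and r2 == s1 and abs(t2 - t1) <= window:
                flagged.update((i, j))
    return sorted(flagged)
```

Real detectors must additionally handle longer cycles through intermediary addresses and amount matching, which is where the paper's labeling algorithms come in.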
  6. ReSuMo: Regression Mutation Testing for Solidity Smart Contracts
    Authors: Morena Barboni, Francesco Casoni, Andrea Morichetta, and Andrea Polini

Mutation testing is a powerful test adequacy assessment technique that can guarantee the deployment of more reliable Smart Contract code. Developers add new features, fix bugs, and refactor modern distributed applications at a quick pace, thus they must perform continuous re-testing to ensure that the project evolution does not break existing functionalities. However, regularly re-running the entire test suite can be time-intensive, especially when mutation testing is involved. This paper presents ReSuMo, the first regression mutation testing approach and tool for Solidity Smart Contracts. ReSuMo uses a static, file-level technique to select a subset of Smart Contracts to mutate and a subset of test files to re-run during a regression mutation testing campaign. ReSuMo incrementally updates the mutation testing results considering the outcomes of the old program version; in this way, it can speed up mutation testing on evolving projects without compromising the mutation score.

Link to Paper

  • Mutation testing is a technique for evaluating the quality of a software test suite and for guiding the development of better tests.
  • The basic idea is to introduce small changes, or mutations, into a program and check whether the test suite detects the resulting faulty behavior.
  • This paper introduces a tool called ReSuMo, which applies regression mutation testing to Solidity Smart Contracts.