
gpact's Introduction

General Purpose Atomic Crosschain Transactions Protocol

This repo contains the General Purpose Atomic Crosschain Transactions (GPACT) protocol implementation and associated protocols. It contains Solidity contracts, Java library code, test code, and example code. It contains Docker files to run multiple blockchains using Hyperledger Besu so that the entire system can be run on a laptop computer. The GPACT Protocol is described in this paper: https://arxiv.org/abs/2011.12783.

Crosschain Protocol Layers

GPACT forms part of an overall crosschain protocol stack as shown in the diagram below. The links in the table below will take you to implementations of those parts of the protocol stack.

Crosschain Protocol Layer: GPACT (Atomic Updates) vs SFC (Not Atomic Updates)

Crosschain Application Layer
  GPACT (Atomic Updates)
    Examples: Conditional Execution, Hotel Train problem (3 blockchains), Read across chains, ERC 20 Bridge, Trade-Finance (5 blockchains), Write across chains
    Applications: ERC 20 Bridge
    Helper contracts: Lockable storage
  SFC (Not Atomic Updates)
    Examples: ERC 20 Bridge, ERC 721 Bridge, Write across chains
    Applications: ERC 20 Bridge, ERC 721 Bridge

Crosschain Function Call Layer
  Interfaces: Solidity Contracts, Java SDK
  General Purpose Atomic Crosschain Transaction (GPACT): Solidity Contracts, Java SDK
  Simple Function Call (SFC): Solidity Contracts, Java SDK

Crosschain Messaging Layer
  Interfaces: Solidity Contracts, Java SDK
  Messaging implementations:
    Event Attestation: Solidity Contracts, Java SDK
    Transaction Receipt Root Transfer: Solidity Contracts, Java SDK
    Event Relay: Solidity Contracts
  Services: Relayer / Attestor

Applications use the Crosschain Function Call layer to execute function calls across blockchains. The Crosschain Function Call code uses Crosschain Message Verification to ensure information from one blockchain is trusted on another blockchain. The layers of the protocol stack are separated by interfaces. Using common interfaces allows applications to work with a variety of crosschain function call implementations, and allows crosschain function call implementations to work with a variety of crosschain messaging implementations. Importantly, this allows different crosschain messaging systems to be used for different blockchains and rollups. It also allows applications to choose lighter-weight, non-atomic function call approaches (which may be less costly and have lower latency) for low-value, less important transactions, and fully atomic protocols such as GPACT for business-critical transactions.
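As a concrete illustration of the layering, the sketch below shows how an application could target a common function-call abstraction so that either GPACT or SFC can be bound behind it. The interface name, function signature, and contract names here are assumptions for illustration, not the repo's actual API.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Hypothetical minimal interface illustrating the layering: the application
// depends only on this abstraction, while GPACT and SFC are alternative
// implementations bound at deployment time.
interface CrosschainFunctionCall {
    // Request that _functionCallData be executed on contract _contract on
    // blockchain _destBcId. Whether this happens atomically is up to the
    // implementation behind the interface.
    function crossBlockchainCall(
        uint256 _destBcId,
        address _contract,
        bytes calldata _functionCallData
    ) external;
}

contract ExampleApp {
    CrosschainFunctionCall public immutable functionCall;

    constructor(address _functionCallImpl) {
        // Could be the GPACT Crosschain Control contract or the SFC contract.
        functionCall = CrosschainFunctionCall(_functionCallImpl);
    }

    function pingRemote(uint256 _destBcId, address _remoteApp) external {
        functionCall.crossBlockchainCall(
            _destBcId,
            _remoteApp,
            abi.encodeWithSignature("ping()")
        );
    }
}
```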

Applications written for atomic crosschain function call protocols will differ from those written for non-atomic protocols, because non-atomic implementations need to handle failures where execution occurs on a source blockchain but not on a destination blockchain.

It is expected that more Crosschain Messaging and Crosschain Function Call implementations will be written. Additionally, more example application code will be written. Please get in contact if you are interested in writing an implementation or an example.

GPACT

The General Purpose Atomic Crosschain Transaction protocol is a blockchain technology that allows function calls across blockchains that either update state on all blockchains or discard the state updates on all blockchains. The function calls can update state on each blockchain and return values across blockchains. The protocol enables applications to access information and utilise functionality that resides on one blockchain from other blockchains. Unlike previous atomic crosschain protocols that only offer atomic asset swaps, this protocol allows for general purpose application logic.

Trade Finance using GPACT protocol

The figure above shows a logical representation of a crosschain call graph using the protocol. A trade finance application creates a crosschain function call that goes across five contracts on five blockchains to execute a trade for a shipment of goods. The Root Transaction executes the entry point function, the executeTrade function in the Trade Wallet contract on the Wallet blockchain. The Trade Wallet contract could be a multi-signatory wallet to which parties to a shipment have to submit a transaction, indicating that they agree a shipment for a certain quantity of goods has been made and should be paid for. The executeTrade function calls the shipment function in the Logic contract on the Terms blockchain to determine the price that should be paid and to effect the transfer of stock and payment. The shipment function calls the getPrice function on the Oracle contract on the Price Oracle blockchain to determine the price that should be paid for the goods, then calls the transfer function on the Balances contract on the Finance blockchain to effect the payment, and finally calls the delivery function on the Stock contract on the Logistics blockchain to register the changed ownership of the goods.
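The nesting of this call graph can be illustrated with a hedged sketch of the entry point: the executeTrade function on the Wallet blockchain issues a crosschain call into the shipment function of the Logic contract on the Terms blockchain. The contract layout, the crossBlockchainCall helper, and the constructor parameters are assumptions, not the repo's actual API.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Illustrative only: the crossBlockchainCall helper and constructor parameters
// are assumptions about how the crosschain function call layer is used.
abstract contract TradeWallet {
    uint256 public termsBlockchainId; // id of the Terms blockchain
    address public logicContract;     // Logic contract on the Terms blockchain

    constructor(uint256 _termsBlockchainId, address _logicContract) {
        termsBlockchainId = _termsBlockchainId;
        logicContract = _logicContract;
    }

    // Entry point executed by the Root Transaction on the Wallet blockchain.
    function executeTrade(uint256 _shipmentId, uint256 _quantity) external {
        // Crosschain call into the shipment function on the Terms blockchain;
        // that function in turn calls the Price Oracle, Finance and Logistics
        // blockchains as described above.
        crossBlockchainCall(
            termsBlockchainId,
            logicContract,
            abi.encodeWithSignature("shipment(uint256,uint256)", _shipmentId, _quantity)
        );
    }

    // Provided by the crosschain function call layer in a real deployment.
    function crossBlockchainCall(
        uint256 bcId,
        address target,
        bytes memory data
    ) internal virtual;
}
```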

It could be argued that some of these contracts could exist on one blockchain, thus reducing the need for crosschain transactions. However, the Finance blockchain and the Logistics blockchain in particular could be consortium blockchains involving different participants. The Price Oracle blockchain could be operated by a consortium that charges for access to the information. Government regulators could require the logic on the Terms blockchain to be visible to them, while the participants in the trade wallet on the Wallet blockchain may wish to remain anonymous. A crosschain transaction capability is needed to meet these requirements.

More information

How to build

Reproducing Performance Results

Design documentation

gpact's People

Contributors

drinkcoffee, ermyas, hbriese, lucassaldanha, wcgcyx


gpact's Issues

Why do we need to simulate cross-tx?

According to the GPACT paper, the purpose of simulation is to predict the parameter values of every function call in the call tree, so that the expected values and the actual values can be verified to be equal.
Question: why is this expectation needed? Without the expectation and simulation steps, what would go wrong?

Docker images no longer on docker hub for ./test-blockchains docker-compose.yaml

The docker image consensys/gpact/services/relayer either no longer exists or is not authorised for public access. Error messages:

ERROR: The image for the service you're trying to recreate has been removed. If you continue, volume data could be lost.

ERROR: pull access denied for consensys/gpact/services/relayer, repository does not exist or may require 'docker login': denied: requested access to the resource is denied

If this is by design, understood; I just thought to check, as this repo is very intriguing.

Multi-signature messaging layer encoding update to be compatible with EEA draft specification

The EEA draft messaging layer specification is here:
https://entethalliance.github.io/crosschain-interoperability/draft_crosschain_techspec_messaging.html

The encoding of multiple signatures has been updated to add a type field and a meta field. This codebase should be updated to be compatible with the draft specification.

pragma solidity >=0.8;

struct Signature {
    uint256 by;
    uint256 sigR;
    uint256 sigS;
    uint8 sigV;
    bytes meta;
}

struct Signatures {
    uint16 typ;
    uint16 numberOfSignatures;
    Signature[] signatures;
}

Gradle build keeps failing

I installed all the requirements listed and added Web3J to the gpact folder, but I keep getting the following error when I run the scripts/create_chain.js 32 1 command:

~/gpact$ scripts/create_chain.js 32 1
Creating a blockchain with chainID 32 and 1 validators in the genesis file...
child_process.js:661
    throw err;
    ^

<ref *1> Error: spawnSync ../besu/build/install/besu/bin/besu ENOENT
    at Object.spawnSync (internal/child_process.js:1074:20)
    at spawnSync (child_process.js:626:24)
    at execFileSync (child_process.js:653:15)
    at Object.<anonymous> (/home/user/gpact/scripts/create_chain.js:79:17)
    at Module._compile (internal/modules/cjs/loader.js:1068:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1097:10)
    at Module.load (internal/modules/cjs/loader.js:933:32)
    at Function.Module._load (internal/modules/cjs/loader.js:774:14)
    at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:72:12)
    at internal/main/run_main_module.js:17:47 {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawnSync ../besu/build/install/besu/bin/besu',
  path: '../besu/build/install/besu/bin/besu',
  spawnargs: [
    '--data-path=/home/user/cgpact_data/chain32/node0',
    'public-key',
    'export-address',
    '--to=/home/user/cgpact_data/chain32/node0/node-acct'
  ],
  error: [Circular *1],
  status: null,
  signal: null,
  output: null,
  pid: 43928,
  stdout: null,
  stderr: null
}

What could be the reason for this? I am sorry if this is an obvious question; I'm new to this.

GPACT v1 and v2 common code should be in a common solidity file

GPACT v1 and v2, though different, share some common code. For example, the Segment event is defined identically in both.

Duplicating identical code will lead to one copy being updated and not the other. It also makes it harder for people who are new to the codebase to understand how GPACT v1 and v2 differ.

Pausable

Crosschain function calls should be pausable. This would allow crosschain bridges built on top of the crosschain function call approach to be paused, as sketched below.
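A minimal sketch of what a pause switch could look like, assuming an OpenZeppelin-style pattern; the contract, variable, and function names are illustrative only, not the repo's actual code.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Sketch of a pause switch applied to crosschain calls. Names are assumptions.
abstract contract PausableCrosschainControl {
    address public pauser;
    bool public paused;

    constructor() {
        pauser = msg.sender;
    }

    modifier whenNotPaused() {
        require(!paused, "Crosschain calls are paused");
        _;
    }

    function setPaused(bool _paused) external {
        require(msg.sender == pauser, "Not authorised to pause");
        paused = _paused;
    }

    // Crosschain entry points would be guarded with whenNotPaused, so bridges
    // built on top inherit the ability to be halted in an emergency.
    function crossBlockchainCall(
        uint256 _destBcId,
        address _contract,
        bytes calldata _functionCallData
    ) external whenNotPaused {
        // ... implementation-specific processing ...
    }
}
```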

SFC: ERC 20 bridge: all different ERC 20 configurations on the same bridge

The ERC 20 bridge Solidity code (https://github.com/ConsenSys/gpact/tree/main/application/nonatomic-appcontracts/erc20bridge/src/main/solidity) at present allows multiple ERC 20 token types to be transferred across a single bridge. However, a bridge contract is deployed either as a Mass Conservation bridge or as a Minting / Burning bridge, whereas each token is likely to need a different configuration.

For example (Mass C. = Mass Conservation):

            Blockchain A   Blockchain B   Blockchain C
  Token A   Mass C.        Mass C.        Mass C.
  Token B   Mass C.        Mint Burn
  Token C   Mint Burn      Mass C.        Mass C.

To support this variety of ERC 20 token configurations, the bridge needs a configuration item, set when a token contract is added, indicating what type the token contract is. That is, the configuration should specify how the bridge should use the token contract: transferFrom or mint. A sketch of such a configuration is shown below.
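A minimal sketch of per-token configuration, assuming a simple enum of bridge modes; the contract and function names are illustrative, not the bridge's actual API.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Sketch of per-token configuration for the ERC 20 bridge. Names are assumptions.
contract Erc20BridgeConfigSketch {
    enum TokenMode { NotConfigured, MassConservation, MintBurn }

    // How this bridge instance should treat each token contract on this chain.
    mapping(address => TokenMode) public tokenMode;
    address public admin;

    constructor() {
        admin = msg.sender;
    }

    function addToken(address _token, TokenMode _mode) external {
        require(msg.sender == admin, "Not admin");
        require(_mode != TokenMode.NotConfigured, "Invalid mode");
        require(tokenMode[_token] == TokenMode.NotConfigured, "Already configured");
        tokenMode[_token] = _mode;
    }

    function isMintBurn(address _token) public view returns (bool) {
        require(tokenMode[_token] != TokenMode.NotConfigured, "Unknown token");
        // MassConservation tokens use transferFrom / transfer;
        // MintBurn tokens use mint / burn on the wrapped token contract.
        return tokenMode[_token] == TokenMode.MintBurn;
    }
}
```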

Change from ByteUtils to abi.decode

abi.decode can be used to decode ABI-encoded data. The repo currently uses a ByteUtils contract to do this. However, abi.decode is likely to be more gas efficient, especially when entire data structures are being decoded.
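A hedged sketch of the difference, assuming the encoded layout is (uint256 blockchain id, address contract, bytes call data); the layout and names are chosen for illustration only.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Sketch contrasting manual byte slicing with abi.decode.
contract AbiDecodeSketch {
    function decodeCall(bytes memory _encoded)
        public
        pure
        returns (uint256 bcId, address target, bytes memory callData)
    {
        // One statement replaces the offset arithmetic done with a
        // ByteUtils-style helper, and handles dynamic fields such as `bytes`
        // automatically.
        (bcId, target, callData) = abi.decode(_encoded, (uint256, address, bytes));
    }
}
```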

GPACTv2 function call salts

Salt needs to be added to function calls to increase the security of GPACTv2.

GPACT v2 relies on salted message digests as part of its security. That is, in the call execution tree there are message digests called FunctionCallHashes:

FunctionCallData = transaction call data (that is, the function selector and function data)
FunctionHash = Keccak256(abi.encode(FunctionCallData, Salt))
FunctionCallHash = Keccak256(abi.encodePacked(blockchain id, contract address, FunctionHash))

The code as currently written has FunctionHash = Keccak256(FunctionCallData).

The best way to have the Salt carried around with the FunctionCallData is to concatenate it when generating the FunctionCallData, in the SDK. As the Salt is at the end of the call data, it will be ignored by function call processing in the EVM. Having the Salt as part of the FunctionCallData means that an extra variable does not need to be passed around. However, when comparing expected and actual function call data, the code will need to ignore the Salt.

The Salt needs to be a standard size; 128 bits will provide adequate security. A sketch of the salted digests is shown below.
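A hedged sketch of the salted digests described above; the field ordering follows the formulas in this issue, while the library and function names are assumptions.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Sketch of the salted digests. The 128-bit salt size follows the issue text.
library FunctionCallHashSketch {
    function functionHash(bytes memory functionCallData, bytes16 salt)
        internal
        pure
        returns (bytes32)
    {
        // FunctionHash = Keccak256(abi.encode(FunctionCallData, Salt))
        return keccak256(abi.encode(functionCallData, salt));
    }

    function functionCallHash(
        uint256 blockchainId,
        address contractAddress,
        bytes32 funcHash
    ) internal pure returns (bytes32) {
        // FunctionCallHash = Keccak256(abi.encodePacked(blockchain id,
        //                              contract address, FunctionHash))
        return keccak256(abi.encodePacked(blockchainId, contractAddress, funcHash));
    }
}
```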

Voting algorithm weight to allow stake based voting

The voting algorithms are currently based on a fixed weighting between participants. The voting interface and registration system should allow variable weighting to be configured, to allow stake-based voting.

Proposed additions (sketched as a Solidity interface below):

  • getVotingAlgorithmName()
  • getVotingWeight(address)
  • getListOfVoters() returns address[]
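A hedged sketch of the proposed interface additions; the return types and exact signatures are assumptions.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Sketch of the proposed additions to the voting interface.
interface VotingAlgorithmWithWeights {
    function getVotingAlgorithmName() external view returns (string memory);

    // Weight of a single voter, allowing stake-based rather than fixed weighting.
    function getVotingWeight(address voter) external view returns (uint256);

    function getListOfVoters() external view returns (address[] memory);
}
```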

GPACT v1: Segment Events need to contain root blockchain id

Segment events are shown to be related to a Start Event and a root blockchain by the Crosschain Transaction Id and the Call Tree Hash.
Root events only contain the Crosschain Transaction Id (and success / fail). The root blockchain id can be inferred from which chain emitted the Root event.

However, in the signalling function, there is no way to check that the root blockchain is the correct one.

The Segment event should emit its understanding of the root blockchain id, which can then be checked in the signalling function.

Destination contract and source / from contract allow listing

Function call approaches should have the ability to switch on an "allow" or "white" listing of contracts that can call the system or be called by the system. A possible approach would be:

  • Have a bool destContractAllowListingEnabled
  • Have a mapping(address => bool) destContractAllowed

By default, when the contract is constructed, destContractAllowListingEnabled is false. When one or more contracts are added to the allow list, destContractAllowListingEnabled would be set to true.

The ability to remove allowed contracts is required too. A sketch is shown below.
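A minimal sketch of the proposed allow listing, using the variable names from this issue; the admin handling and function names are assumptions. The same pattern would apply to source / from contracts.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Sketch of destination-contract allow listing as described above.
contract DestContractAllowListSketch {
    address public admin;
    bool public destContractAllowListingEnabled;
    mapping(address => bool) public destContractAllowed;

    constructor() {
        admin = msg.sender;
    }

    function addAllowedDestContract(address _contract) external {
        require(msg.sender == admin, "Not admin");
        destContractAllowed[_contract] = true;
        // Enabling the list on first addition, per the proposed behaviour.
        destContractAllowListingEnabled = true;
    }

    function removeAllowedDestContract(address _contract) external {
        require(msg.sender == admin, "Not admin");
        destContractAllowed[_contract] = false;
    }

    function checkDestAllowed(address _contract) public view returns (bool) {
        // When listing is disabled (the default), every contract is allowed.
        return !destContractAllowListingEnabled || destContractAllowed[_contract];
    }
}
```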

GPACT ERC 20 Bridge Rework

The GPACT ERC 20 bridge is currently implemented as Crosschain ERC 20 and Traditional IERC20 Adaptor, wrapping the Lockable ERC 20 contract. This means that each ERC 20 contract is its own bridge, rather than having one bridge that manages multiple ERC 20s.

It would be better to allow a separate bridge contract to manage multiple ERC 20s, for the following reasons:

  • There are fewer bridges to deal with. Application bridges are a possible source of issues. It would be better to have as few as possible.
  • The bridge could be upgraded independently of the ERC 20 contracts.
  • The ERC 20 contracts could be managed by both atomic (GPACT) and non-atomic (SFC) bridges.
  • There is a large variety of ERC 20 contracts. Combining each possible ERC 20 with lockable and crosschain variants multiplies the number of combinations. Having a separate bridge removes "crosschain" as one of the dimensions of combination.

msg.sender functionality

Ethereum contracts allow the caller to be determined by looking at msg.sender. This can be used to check that a contract is being called by a specific other contract. This functionality could be emulated in the GPACT system by having a function that returns the blockchain id and address of the contract that called the current contract.

The Cross-Blockchain Control contract could provide functions to return these values, determining them from the call graph and from the entry point contract for this blockchain. A sketch is shown below.
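A hedged sketch of how the Cross-Blockchain Control contract might expose the caller's blockchain id and address; the function names and storage layout are assumptions.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Sketch of msg.sender emulation on the Cross-Blockchain Control contract.
contract CrosschainCallerSketch {
    // Set while the control contract is dispatching a crosschain call.
    uint256 private callerBlockchainId;
    address private callerContract;

    // Called internally by the control contract just before it dispatches
    // the target function call.
    function setCaller(uint256 _bcId, address _contract) internal {
        callerBlockchainId = _bcId;
        callerContract = _contract;
    }

    // Application contracts call this to learn which (blockchain, contract)
    // invoked them, analogous to reading msg.sender for a local call.
    function whoCalledMe() external view returns (uint256, address) {
        return (callerBlockchainId, callerContract);
    }
}
```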

Support for GPACT v2 should be added to the Relayer

Support for GPACT v2 should be added to the Relayer.

GPACT v2 currently works with the "fake" messaging layer. The Relayer does not understand GPACTv2 events. The Relayer needs the following added:

  • GPACTv2 as a type of protocol.
  • The event signatures added as events to be observed for the GPACTv2 protocol.

TODOs in CrosschainControl.sol

There are a number of TODOs in GPACT CrosschainControl.sol. These need to be addressed to ensure events are securely processed.

Call Execution Tree encoding

The Call Execution Tree is encoded as RLP. It must be decoded in Java to process the call execution tree in the execution engine, and it must also be decoded in the Solidity code. Decoding RLP has been shown to use more gas than using an abi.encodePacked encoding.

Issue: RLP encoding of the Call Execution Tree is inefficient in Solidity and makes the Java code harder to understand.

Proposed solution: use an ABI-encoded / binary call execution tree, and have a class structure that represents it in Java. A sketch is shown below.
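A hedged sketch of an ABI-encoded call execution tree node; the node layout (blockchain id, target contract, call data, children) is an assumption chosen for illustration, and a Java class with the same four fields could mirror it.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Sketch of an ABI-encoded call execution tree node replacing RLP.
contract CallTreeEncodingSketch {
    function encodeLeaf(
        uint256 blockchainId,
        address target,
        bytes memory callData
    ) public pure returns (bytes memory) {
        // A leaf has no children, so the children array is empty.
        return abi.encode(blockchainId, target, callData, new bytes[](0));
    }

    function decodeNode(bytes memory node)
        public
        pure
        returns (
            uint256 blockchainId,
            address target,
            bytes memory callData,
            bytes[] memory children
        )
    {
        // abi.decode replaces hand-written RLP parsing.
        (blockchainId, target, callData, children) = abi.decode(
            node,
            (uint256, address, bytes, bytes[])
        );
    }
}
```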

Parallel Execution Engine can cause "Replacement transaction underpriced"

When the parallel execution engine is used, it sends transactions in parallel. If multiple transactions are executed by Ethereum from the same account with the same nonce at the same gas price, then an error "Replacement transaction underpriced" is returned.

Web3J determines the nonce to use by contacting the Ethereum node and finding out what the current nonce value is for the account, given the current transactions. However, if multiple parallel transactions are sent simultaneously, they will be submitted with the same nonce.

There are two ways of solving this issue:

  • Send transactions that carry information from separate blockchains using different accounts. Each of these accounts would then need to be funded on each blockchain.
  • Keep track of the nonce value to use, and create a Web3J raw transaction manager. An example of how to do this is here: https://ethereum.stackexchange.com/questions/66277/how-to-change-the-nonce-between-transactions-in-web3j

Crosschain Application Authentication

Currently, the crosschain function call code stores the from blockchain, from contract, and from root blockchain (for GPACT), and then makes these available via isSingleBlockchainCall() and whoCalledMe(). Invariably, these functions are called at the entry to an application function. A more gas-efficient approach would be to:

  • abi.encode(functionParam, rootBc, fromBc, fromContract)
  • Have application functions take the extra parameters
  • Have the application functions check that msg.sender is the Crosschain Control contract. This is needed to ensure the additional parameters are correct.

The advantage of this is not having to write two or three storage locations and then read them back later. A sketch is shown below.
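A hedged sketch of the calldata-based authentication described above; the parameter ordering and the trusted control contract variable are assumptions.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Sketch of calldata-based crosschain application authentication.
contract AuthenticatedAppSketch {
    address public trustedCrosschainControl;

    constructor(address _crosschainControl) {
        trustedCrosschainControl = _crosschainControl;
    }

    // The crosschain control contract appends (rootBcId, fromBcId, fromContract)
    // to the application parameters instead of storing them.
    function doSomething(
        uint256 functionParam,
        uint256 rootBcId,
        uint256 fromBcId,
        address fromContract
    ) external {
        // Only the crosschain control contract is trusted to supply these
        // values, so its identity must be checked.
        require(msg.sender == trustedCrosschainControl, "Untrusted caller");
        // ... application logic using functionParam, rootBcId, fromBcId,
        //     and fromContract ...
    }
}
```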

Locking simplification

The locking implementation as currently written allows multiple calls to the same contract if the calls are within the same crosschain call. In some rare situations, when a read or write is executed out of its natural execution order relative to a write, an attacker could create a malicious invalid update.

The protocol has been revised to prevent this attack by simplifying the locking mechanism: any update prevents any further updates to the contract. The lockable storage contract needs to be updated to reflect this change in the protocol. A sketch of the simplified rule is shown below.
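A hedged sketch of the simplified locking rule; the function names and the per-contract lock flag are assumptions about one way to realise it.

```solidity
// SPDX-License-Identifier: Apache-2.0
pragma solidity >=0.8;

// Sketch of the simplified rule: once a provisional update is recorded for a
// crosschain transaction, no further updates to the contract are allowed
// until that transaction commits or aborts.
contract SimplifiedLockSketch {
    bool public locked;
    uint256 public lockedByCrosschainTxId;

    function provisionalUpdate(uint256 crosschainTxId) external {
        require(!locked, "Contract already locked by an update");
        locked = true;
        lockedByCrosschainTxId = crosschainTxId;
        // ... record the provisional state ...
    }

    function finalise(uint256 crosschainTxId, bool commit) external {
        require(locked && crosschainTxId == lockedByCrosschainTxId, "Wrong transaction");
        if (commit) {
            // ... apply the provisional state ...
        } else {
            // ... discard the provisional state ...
        }
        locked = false;
    }
}
```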

Transaction Receipt Root messaging protocol uses faked relayer

The Transaction Receipt Root messaging protocol offers a Crosschain Messaging Layer protocol where transaction receipt roots are transferred from source to target chains, and then events can be proven to belong to transactions that are a part of a block.

The Relayer does not support Transaction Receipt Root transfer at this point. The transfer is done using a "faked" signer.

The relayer should implement Transaction Receipt Root transfer.
