sygma-explorer-indexer's Issues

Refactor indexing logic

Update the indexing logic to check for the latest processed block for each domain in the database. Based on this information, continue indexing chain events. This change ensures that the system can recover robustly in case of interruptions or restarts, as it will always resume from the last processed block for each domain.

Implementation details

On indexer startup, the logic should be:

  • iterate through the domains from the loaded shared configuration
    • determine the starting block for the domain (use the last indexed block saved in the DB; if there isn't one, use the starting block defined in the shared config)
    • start the indexing process for this domain

Indexing logic should process blocks in small batches of 5, similar to the relayer. Check for all events in the batch of blocks, process them, and save the information to the database. At the end, it should also record the last processed block to the database. This process repeats until the indexer catches up with the blockchain's latest block. At that point, the domain indexer waits a few seconds and checks whether there are enough new blocks to process (the batch size defines this). This is an ongoing process.
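
A minimal sketch of this loop (assuming ethers v5 for EVM domains; the getLastIndexedBlock, saveLastIndexedBlock, and processEvents helpers are hypothetical, not the repo's API):

```typescript
import { ethers } from "ethers";

interface Domain {
  id: number;
  startBlock: number; // starting block from shared configuration
}

// Hypothetical persistence helpers; names are illustrative only.
declare function getLastIndexedBlock(domainId: number): Promise<number | null>;
declare function saveLastIndexedBlock(domainId: number, block: number): Promise<void>;
declare function processEvents(domain: Domain, from: number, to: number): Promise<void>;

const BATCH_SIZE = 5;

async function indexDomain(domain: Domain, provider: ethers.providers.Provider): Promise<void> {
  // Resume from the last indexed block in the DB, falling back to shared config.
  let fromBlock = (await getLastIndexedBlock(domain.id)) ?? domain.startBlock;

  for (;;) {
    const latestBlock = await provider.getBlockNumber();
    if (fromBlock + BATCH_SIZE - 1 > latestBlock) {
      // Caught up with the chain head: wait a few seconds for new blocks.
      await new Promise((resolve) => setTimeout(resolve, 5_000));
      continue;
    }
    const toBlock = fromBlock + BATCH_SIZE - 1;
    await processEvents(domain, fromBlock, toBlock); // fetch and persist events
    await saveLastIndexedBlock(domain.id, toBlock);  // record progress for recovery
    fromBlock = toBlock + 1;
  }
}
```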

Testing details

  1. Test the updated listening logic to ensure it correctly retrieves the latest processed block for each domain from the database.
  2. Verify that the indexing process resumes from the correct block based on the retrieved information.
  3. Cover basic cases with unit tests

Acceptance Criteria

  1. The listening logic is updated to save information on the last processed block for each domain to the database.
  2. The indexing process resumes from the correct block based on the retrieved information from the database.
  3. All test cases pass, confirming that the updated implementation functions as expected.

Investigate Prisma migrations

Investigate how to use Prisma migrations in our indexer so we can upgrade models without re-running the entire indexing process from the beginning.

Acceptance Criteria

  • Create a guide on using migrations
  • Test it in the testnet environment

Update e2e test environment

Add failed transactions to the evm1 or evm2 and substrate-pallet images.
Update the substrate-pallet image to the newest substrate-pallet version so we can test the FeeCollected event handling.

Update seeding logic

We are going to revert the current schema changes back to the original implementation. This means not using id as Mongo's object.id, and passing resource and domain values.

Implementation details

  • update the seeding logic to accommodate the schema changes

Testing details

  • seed script adds data to the database

Acceptance Criteria

  • database has records

Refactor EVM event processing logic

Refactor EVM listening logic to support the latest Solidity contracts and align with the new data model

Implementation details

Rewrite functions for persisting different chain events (e.g., saveDeposits) to align with the latest smart contracts. The new implementation should fetch and save events from a range of blocks provided as an argument. Update the EVM listening logic accordingly.

Additionally, apply the new data model when implementing the refactored EVM listening logic. A new data model and more information can be found in the research document.
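
A hedged sketch of such a range-based function, assuming ethers v5 and a Deposit event on the Bridge contract; persistDeposit is a hypothetical helper for mapping to the new data model:

```typescript
import { ethers } from "ethers";

// Hypothetical persistence helper; maps a raw log to the new data model.
declare function persistDeposit(event: ethers.Event): Promise<void>;

async function saveDeposits(
  bridge: ethers.Contract, // Bridge contract instantiated with the latest ABI
  fromBlock: number,
  toBlock: number
): Promise<void> {
  // queryFilter fetches all matching logs in the given block range.
  const deposits = await bridge.queryFilter(bridge.filters.Deposit(), fromBlock, toBlock);
  for (const deposit of deposits) {
    await persistDeposit(deposit);
  }
}
```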

Testing details

  1. Test the refactored EVM listening logic with the latest Solidity contracts. (this can be done with devnet/testnet contracts)
  2. Test the logic to ensure it fetches and saves events within the specified block range. (this can be done with devnet/testnet contracts)
  3. Add minimal unit tests for processing logic

Acceptance Criteria

  • The EVM listening logic is refactored to support the latest Solidity contracts.
  • The implementation fetches and saves events within the specified block range.
  • Passing unit tests

Decode substrate multilocation

We don't currently decode multilocation data, so the destination for Substrate transfers is invalid (the hex string of the multilocation).
We should parse the Substrate multilocation data and extract the recipient's general key to get the destination recipient.
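
A hedged sketch with @polkadot/api; the "MultiLocation" type name and the X1/GeneralKey structure are assumptions that must be verified against the actual chain metadata:

```typescript
import { ApiPromise } from "@polkadot/api";

function decodeRecipient(api: ApiPromise, destinationHex: string): string | null {
  // Decode the raw hex into a MultiLocation; the cast sidesteps
  // registry-specific typings for the purposes of this sketch.
  const location = api.createType("MultiLocation", destinationHex) as any;
  const interior = location.interior;
  // Assumed shape: an X1 junction holding a GeneralKey with the recipient.
  if (interior.isX1 && interior.asX1.isGeneralKey) {
    return interior.asX1.asGeneralKey.toHex();
  }
  return null;
}
```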

Fix pagination issue

Implementation details

  • Fix pagination issues
  • Remove state from transfer service

Testing details

  • current transfer service and transfer controller tests must pass
  • pagination should work as expected

Test substrate indexing logic

Since we have implemented the substrate indexing logic, we should have tests that cover corner cases and check that data is being stored properly.

Implementation details

  • test the substrate indexer implementation
  • test utility and helper functions

Acceptance Criteria

  • tests must pass

Migrate to fastify

Migrate the API side of the indexer from Express to Fastify.

Implementation details

  • remove Express as a dependency
  • install and use Fastify
  • refactor code to accommodate Fastify

Testing details

  • the server should work with Fastify
  • the server should build under the build command

Acceptance Criteria

  • API works with fastify

Error when query params missing

The API returns 500 if the page or limit query params are missing, as we don't have sane defaults.
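
One way to fix this, sketched with Fastify (route path and bounds are illustrative): declare defaults in the querystring schema so Ajv fills in missing params before the handler runs.

```typescript
import fastify from "fastify";

const server = fastify();

// Fastify's default Ajv config has useDefaults and coerceTypes enabled,
// so page and limit are always defined numbers in the handler.
const querystring = {
  type: "object",
  properties: {
    page: { type: "integer", minimum: 1, default: 1 },
    limit: { type: "integer", minimum: 1, maximum: 100, default: 10 },
  },
};

server.get("/api/transfers", { schema: { querystring } }, async (request) => {
  const { page, limit } = request.query as { page: number; limit: number };
  return { page, limit }; // placeholder: the real handler would query the DB
});
```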

Add boolean property to OFAC addresses

To comply with OFAC checks, we need to flag OFAC-listed addresses by adding a boolean property to the model.

Implementation details

  • add a boolean flag to the model
  • update the indexer to reflect those changes

Testing details

  • update unit tests

Acceptance Criteria

  • transfers have this property

Fix log format

Datadog currently doesn't parse log levels and newlines correctly.
Fix the log format so it is compatible with Datadog.

Make CoinMarketCap API requests more robust

Implementation details

  • The current implementation of querying the CoinMarketCap API occasionally errors out.
  • We need to enhance the robustness of these API requests to handle errors and retry failed requests.
  • Consider implementing error-handling mechanisms, such as retries with exponential backoff, to ensure successful data retrieval (see the sketch after this list).
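
A minimal sketch of such a retry helper (the retry count, delays, and fetchTokenPrice are illustrative):

```typescript
async function withRetry<T>(
  fn: () => Promise<T>,
  retries = 3,
  baseDelayMs = 500
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err;
      // Exponential backoff: 500ms, 1s, 2s, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
}

// Usage (fetchTokenPrice is hypothetical):
// const price = await withRetry(() => fetchTokenPrice("ETH"));
```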

Testing details

  • Test the enhanced API request mechanism under various scenarios, including simulated API failures, to ensure it can recover and provide the expected data.

Acceptance Criteria

  • The CoinMarketCap API requests should handle errors gracefully and retry failed requests.
  • All test cases should pass without any errors.

Upgrade logs

Currently, indexer logs mostly use the info level, so it is hard to find important information.

Details

Review the current logs and move per-block logs to debug level.

Acceptance Criteria

  • Per-block processing logs are moved to debug level.

Add endpoint to get all transfers

We need to create an endpoint to get all transfers ordered by time.

Implementation details

  • implement the following endpoint (see the sketch after this list): GET /api/transfers?status={status}&page={page}&limit={limit}
  • status (optional): Transaction status (pending/executed/failed)
  • page (required): Page number for pagination
  • limit (required): Number of items per page
  • create tests for the endpoint
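
A hedged sketch of the route with Fastify and Prisma; the transfer model and its status/timestamp fields are assumptions about the schema:

```typescript
import fastify from "fastify";
import { PrismaClient } from "@prisma/client";

const server = fastify();
const prisma = new PrismaClient();

server.get("/api/transfers", async (request) => {
  const q = request.query as { status?: string; page?: string; limit?: string };
  const page = Number(q.page ?? 1);
  const limit = Number(q.limit ?? 10);

  return prisma.transfer.findMany({
    where: q.status ? { status: q.status } : undefined,
    orderBy: { timestamp: "desc" }, // newest transfers first
    skip: (page - 1) * limit,       // offset-based pagination
    take: limit,
  });
});
```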

Testing details

  • the indexer fetches and returns all transfers ordered by time
  • the indexer fetches and returns all transfers ordered by time and filtered by status

Acceptance Criteria

  • the endpoint behaves as expected
  • tests are passing

Update endpoint to search by txHash adding chainId as optional parameter

Since there is a chance (a small one) that a txHash may not be unique, we should fetch transfers by passing txHash + chainId.

Implementation details

  • update the current endpoint to accept chainId as an optional parameter

Testing details

  • update tests to reflect the change

Acceptance Criteria

  • tests are passing
  • we are able to fetch transfers by txHash + chainId

USD Value on indexer

Description

We would like to implement the USD value for the Explorer UI via a single API (the CoinGecko API) to avoid any calculation on our side.

Considerations:

  • Use the CoinMarketCap API that the fee oracle is potentially already using (confirm with Freddy)
  • see design

Review optional fields on schema

Currently we have a bunch of optional fields on the schema, and for some of them this optionality doesn't make much sense. Revisit this in conjunction with the indexing logic.

Implementation details

  • sender and timestamp shouldn't be nullable
  • check the other fields

Create or modify endpoints according to the newly specified API design

There are new endpoints that we need to create or modify based on the current ones.

Implementation details

Get all transfers for a specific resource

GET /api/resources/{resourceID}/transfers?status={status}&page={page}&limit={limit}

  • status (optional): Transaction status (pending/executed/failed)
  • page (required): Page number for pagination
  • limit (required): Number of items per page

Get all transfers with a specific domain as source or destination

GET /api/domains/{domainID}/transfers?status={status}&page={page}&limit={limit}&domain={source/destination}

  • status (optional): Transaction status (pending/executed/failed)
  • page (required): Page number for pagination
  • limit (required): Number of items per page
  • domain (optional): Filter transfers by source or destination domain (source/destination)

Get all transfers from a source domain to a destination domain

GET /api/domains/source/{sourceDomainID}/destination/{destinationDomainID}/transfers?page={page}&limit={limit}

  • page (required): Page number for pagination
  • limit (required): Number of items per page

Get all transfers for a specific resource between source and destination domains

GET /api/resources/{resourceID}/domains/source/{sourceDomainID}/destination/{destinationDomainID}/transfers?page={page}&limit={limit}

  • page (required): Page number for pagination

  • limit (required): Number of items per page

  • provide tests for all the endpoints (a sketch of the domain route follows below)
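
As an illustration, a hedged sketch of the domain route; the fromDomainId/toDomainId field names are assumptions about the transfer model, and server/prisma stand for the instances from the earlier sketch:

```typescript
import { FastifyInstance } from "fastify";
import { PrismaClient } from "@prisma/client";

declare const server: FastifyInstance;
declare const prisma: PrismaClient;

server.get("/api/domains/:domainID/transfers", async (request) => {
  const id = Number((request.params as { domainID: string }).domainID);
  const { domain } = request.query as { domain?: "source" | "destination" };

  // Filter by source, destination, or either side when no filter is given.
  const where =
    domain === "source"
      ? { fromDomainId: id }
      : domain === "destination"
        ? { toDomainId: id }
        : { OR: [{ fromDomainId: id }, { toDomainId: id }] };

  return prisma.transfer.findMany({ where });
});
```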

Testing details

  • endpoints retrieve information as expected
  • tests are passing

Acceptance Criteria

  • endpoints behave as expected according to the spec definition
  • tests are passing

Add usd conversion to fee

Implementation details

  • add USD value conversion to the fee

Testing details

Acceptance Criteria

  • the fee attribute should have the amount converted to USD

Load RPC urls from ENV

Load each domain's RPC URLs from an environment variable so they are not exposed inside the public repository.

Implementation details
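
A minimal sketch, assuming a single RPC_URLS variable holding a JSON map of domain ID to URL (the variable name and shape are assumptions):

```typescript
// e.g. RPC_URLS='{"1":"https://eth-rpc.example","2":"wss://substrate-rpc.example"}'
function loadRpcUrls(): Record<string, string> {
  const raw = process.env.RPC_URLS;
  if (!raw) {
    throw new Error("RPC_URLS environment variable is not set");
  }
  return JSON.parse(raw) as Record<string, string>;
}
```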

Testing details

  • Add unit tests

Acceptance Criteria

  • RPC URLs are loaded from ENV variable
  • Passing unit tests

Add Substrate event processing logic

Add substrate listening logic aligned with our pallet and the new data model.

Implementation details

As @tcar121293 mentioned, most of the processing logic from the relayers can be translated to TypeScript; a sketch follows below.
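
A hedged sketch of range-based Substrate event fetching with @polkadot/api; the sygmaBridge pallet and Deposit event names are assumptions to be checked against the actual pallet:

```typescript
import { ApiPromise } from "@polkadot/api";

async function processSubstrateBlocks(api: ApiPromise, from: number, to: number): Promise<void> {
  for (let n = from; n <= to; n++) {
    const blockHash = await api.rpc.chain.getBlockHash(n);
    const at = await api.at(blockHash);          // API decorated at this block
    const events = await at.query.system.events();
    for (const { event } of events) {
      // Pallet and event names are assumptions for illustration.
      if (event.section === "sygmaBridge" && event.method === "Deposit") {
        // decode event.data and persist it, aligned with the new data model
      }
    }
  }
}
```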

Testing details

  1. Test the logic to ensure it fetches and saves events within the specified block range. (this can be done with devnet/testnet contracts)
  2. Add minimal unit tests for processing logic

Acceptance Criteria

  • The implementation fetches and saves events within the specified block range.
  • Passing unit tests

Add prettier and linter to keep consistency across the code

Implementation details

  • implement prettier and eslint
  • for reference, use the config that's present in the SDK
  • update GitHub Actions to run tests and prettier

Testing details

  • code should be formatted properly
  • the linter should show errors following the configured rules

Invalid extrinsic IDs

Extrinsic IDs for Khala and Phala don't match the extrinsic ID of the transaction with the deposit (probably also execution).

Substrate destination missing

When indexing transfers from Khala and Phala, the destination is not indexed.

Enable executing actions on existing database

As the indexing process is very slow (especially on substrate chains), we need to figure out ways to make changes and upgrades to the database without reindexing entire chains. For changing the actual schema of the database in some environment, we will use Prisma's migrations. What we need in parallel to this is the ability to run some action on the entire database or some subset of entries in the database.

The first such use case is to re-run $ price calculations for amounts bridged on our mainnet environment. During re-indexing this calculation was failing, so all transfers have 0 as the $ value.

Implementation details

  • Enable CLI-like functionality to run actions (scripts) on database entries
  • Implement the first action that checks and recalculates the $ value of the token amount, saving it to the database if it is wrong (sketched below)
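
A rough sketch of such an action script, assuming a transfer model with amount, tokenSymbol, and usdValue fields and a hypothetical getUsdPrice helper:

```typescript
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

// Hypothetical price lookup; in practice this would call CoinMarketCap.
declare function getUsdPrice(tokenSymbol: string): Promise<number>;

async function fixUsdValues(): Promise<void> {
  // Only touch entries where the calculation previously failed.
  const broken = await prisma.transfer.findMany({ where: { usdValue: 0 } });
  for (const transfer of broken) {
    const price = await getUsdPrice(transfer.tokenSymbol);
    await prisma.transfer.update({
      where: { id: transfer.id },
      data: { usdValue: Number(transfer.amount) * price },
    });
  }
}

fixUsdValues().finally(() => prisma.$disconnect());
```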

Testing details

  • Add unit tests where applicable
  • Manually test action on local setup
  • Manually test action on testnet

Acceptance Criteria

  • All unit & e2e tests passing
  • All manual tests were successful
  • Ability to check and fix the $ value for all transfers

Implement endpoint to get all transfers for a given `sender` address

Implement the following endpoint:

GET /api/sender/{senderAddress}/transfers?status={status}&page={page}&limit={limit}

Implementation details

  • add endpoint to current controllers
  • status (optional): Transaction status (pending/executed/failed)
  • page (required): Page number for pagination
  • limit (required): Number of items per page

Testing details

  • the indexer fetches and returns all transfers from the given sender ordered by time
  • the indexer fetches and returns all transfers from the given sender ordered by time and filtered by status

Acceptance Criteria

  • endpoint behaves as expected
  • tests on transfer.service pass

Add shared configuration support

Refactor the configuration-loading logic so it loads configuration from a provided URL and takes an array of RPC endpoints for each domain from an ENV parameter.

Implementation details

  • Define TS types for shared configuration format
  • Refactor getSygmaConfig() to load configuration from the URL and then append RPC endpoints from the array provided as an ENV variable (see the sketch after this list)
  • Remove getConfigFromSSM() and any other deprecated config-related code
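
A hedged sketch of getSygmaConfig under these assumptions (the shared-config shape and the SHARED_CONFIG_URL / RPC_URLS variable names are illustrative; assumes Node 18+ global fetch):

```typescript
interface RawDomain { id: number; name: string; }          // simplified shared-config types
interface Domain extends RawDomain { rpcUrls: string[]; }

async function getSygmaConfig(): Promise<Domain[]> {
  const response = await fetch(process.env.SHARED_CONFIG_URL!);
  const { domains } = (await response.json()) as { domains: RawDomain[] };

  // RPC endpoints per domain come from the environment, not the public config.
  const rpcUrls = JSON.parse(process.env.RPC_URLS ?? "{}") as Record<string, string[]>;

  return domains.map((d) => ({ ...d, rpcUrls: rpcUrls[d.id] ?? [] }));
}
```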

Testing details

  • Add unit test for getSygmaConfig function

Acceptance Criteria

  • Passing unit tests

Add deployment pipeline for Indexer

From the sygma-explorer-indexer repository we need to deploy two different services:

  • indexer service
  • API service
  • (there also needs to be a working database that both services will connect to)

Details

We need to support testnet (on each push to main) and mainnet (on release) environments.

Both services use the same root Dockerfile, just with different start commands:

  • API service: node ./build/index.js
  • Indexer: node ./build/indexer/index.js

Display failed deposit error reason in case of liquidity problems

Story

As a bridge user or SDK developer
I want to see the reason my deposit (execution) failed when it failed because of insufficient liquidity
So that I panic less.

Background

As a protocol provider, we can't really prevent someone from sending a transaction with an amount bigger than the available liquidity, so at the least we want to show users that their transaction failed for that reason.

Details

Please show something like "Transaction failed because of insufficient liquidity; please be patient, your transaction will be retried soon".

Scenarios

Scenario:
Given I am a bridge user
When I send a deposit with an amount higher than the liquidity on the destination network
And it fails
Then I see that it failed for a specific reason and will be fixed soon

Implementation details

You need to check the transaction logs when a transaction fails and search for a specific log that occurs only when a handler fails due to insufficient liquidity. This should be pretty straightforward for EVM but could be tricky for Substrate.
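
For the EVM side, a hedged sketch (assuming ethers v5 and that the Bridge contract emits a FailedHandlerExecution event carrying low-level revert data; the event shape and revert string are assumptions to verify against the deployed contracts):

```typescript
import { ethers } from "ethers";

function isLiquidityFailure(
  receipt: ethers.providers.TransactionReceipt,
  bridge: ethers.Contract
): boolean {
  for (const log of receipt.logs) {
    let parsed: ethers.utils.LogDescription;
    try {
      parsed = bridge.interface.parseLog(log);
    } catch {
      continue; // log from another contract or an unknown event
    }
    if (parsed.name !== "FailedHandlerExecution") continue;
    // Decode the low-level revert data leniently and look for the
    // (assumed) OpenZeppelin ERC20 balance error string.
    const reason = ethers.utils.toUtf8String(
      parsed.args.lowLevelData,
      ethers.utils.Utf8ErrorFuncs.ignore
    );
    if (reason.includes("transfer amount exceeds balance")) return true;
  }
  return false;
}
```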

Testing details

  • Unit tests
  • Manual tests

Acceptance criteria

  • If an execution failed because of liquidity problems, the corresponding error should be displayed

Restructure repository

Restructure the repository for maintainability and scalability reasons.

Implementation details

Restructure the repo into a monorepo with two packages:

  • explorer-api - Fastify application that exposes the defined explorer API
  • explorer-indexer - Node application that indexes, processes, and saves Sygma on-chain events to the database
  • move all the GitHub Actions workflows that relate to the explorer and indexer

With this approach, we can scale explorer-api separately from explorer-indexer, which is a very likely scenario.

Please follow the monorepo setup defined by the Engineering Handbook.

Refactoring the indexing codebase is not in the scope of this issue, so just remove the indexer-worker code and migrate the indexer code to the new explorer-indexer package.

Testing details

  1. Make sure that the yarn file is set up properly, so you can build and run individual packages

Acceptance Criteria

  • Repository is set up as a monorepo with two packages: explorer-api and explorer-indexer

Fix feeAmount format

Fix feeAmount format (remove commas)

Update and create tests for the latest indexing logic

Implementation details

  • Update the current evm indexer tests to the latest version of the EvmIndexer class
  • Create tests for the evm indexing logic
  • Create tests for the utility functions being used

Testing details

  • All tests must pass

Update contracts ABI information

Update ABI based on new published version of solidity contracts.

Acceptance Criteria

  • passing tests
  • working testnet deployment

Add endpoint for filtering transfers by dates

Our analytics team needs to fetch transfers by date.

Implementation details

Add a new endpoint that allows fetching transfers filtered by date-time. It should allow two-way bounded filtering, where both bounds are optional. Here is a dummy example:

../?fromDate={url_encoded_ISO_8601}&toDate={url_encoded_ISO_8601}

Use ISO 8601 for time format.
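
A sketch of the handler (Fastify + Prisma; the route path and the timestamp field name are illustrative assumptions, and server/prisma stand for the app's instances):

```typescript
import { FastifyInstance } from "fastify";
import { PrismaClient } from "@prisma/client";

declare const server: FastifyInstance;
declare const prisma: PrismaClient;

server.get("/api/transfers/by-date", async (request) => {
  const { fromDate, toDate } = request.query as { fromDate?: string; toDate?: string };

  return prisma.transfer.findMany({
    where: {
      // Both bounds are optional; include only the ones provided.
      timestamp: {
        ...(fromDate ? { gte: new Date(fromDate) } : {}),
        ...(toDate ? { lte: new Date(toDate) } : {}),
      },
    },
    orderBy: { timestamp: "asc" },
  });
});
```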

Testing details

  • Add unit tests

Acceptance Criteria

  • Have functioning endpoint for fetching transfers filtered by date

More detailed context on failure

Story

As a user
I want to understand the reason why my transaction failed on execution
So that I can feel like my assets are safe

As an on-duty developer
I want to quickly understand the reason why transaction failed on execution
So that I can take appropriate actions

Background

We had a few incidents on PHA routes where users had their funds locked for short periods because liquidity was missing on the destination chain. On the Explorer this is displayed as a generic failed transaction, where it would be a much better user experience if we showed the actual reason why this happened.

Details

Rough sketches of how something like this could look: index_liq_change_1, index_liq_change_2.

We are providing additional context on failed transactions. We could possibly even use this to automate retries based on the failure context in the future.

Implementation details

We should start with something simple, such as only recognizing the situation where there is missing liquidity on the destination chain. Still, we should design a solution that is extensible enough to hold different failure contexts.

Testing details

  • Add unit tests
  • Test on devnet/testnet

Acceptance criteria

  • Indexer API expanded to save and provide information on failure context
  • Explorer UI displays additional context for failed transactions

Update current transfer by ID endpoint

Implementation details

  • Update the current transfer-by-ID endpoint to use the new data model
  • update tests

Testing details

  • endpoint returns transfer by id

Acceptance Criteria

  • implementation is updated
  • tests are passing

Move all the setup to docker-compose and remove only-mongo compose file

Currently you need to modify the hosts file to run the replicas and make them accessible outside the container. This setup is only needed for the docker-compose.only-mongo.yml file. We should have everything inside the containers and not maintain different compose files.

Implementation details

  • remove the only-mongo docker compose file
  • update the readme
  • test that indexing and the API work using the main docker-compose

Acceptance Criteria

  • all the setup for the repo works from a single docker compose file

Create Swagger API for documentation and mocked UI testing

To make the Explorer code more extendable and testable, we should follow the TDD approach: first create Swagger documentation that describes the full indexer API, and then use it in the Explorer UI.

Implementation details

First, we need to agree on the API design: define the URL schema, methods, and what data each endpoint returns (mostly done here).
Then create Swagger documentation from this information, along with a server that can mock these requests so it can be used in Explorer UI local development.

Acceptance Criteria

  • Swagger documentation and mock server added to the indexer repo

Add seeding for database

Add a yarn action for adding seed data to the database so it is easier to test locally.

Implementation details

Check out this guide

Testing details

Manually test that the database is populated once the seeding action is run

Acceptance Criteria

Functioning data seeding command

Refactor deployment strategy

Currently we are using the pm2 package to start the indexer and API processes separately. This is not really compatible with our setup, as the application is going to be deployed to AWS, so there is no reason to use a package like this. The best approach here is to prepare separate Docker images for the indexer and the API so we replicate the same strategy as our other services.

Implementation details

  • create a docker image for the indexer application
  • create a docker image for the api application
  • refactor the docker compose file so it facilitates local setup using the new docker images

Testing details

  • manually start both docker images
  • manually test the local setup

Acceptance Criteria

  • working docker images for: indexer and api
  • working local setup

Index time for deposit and execution separately

We want to save the times of deposit and execution as two separate entries in the database. This way we can calculate the actual delta from information in the database.

Implementation details

  • Add an additional timestamp field
  • Refactor the indexing logic so we save deposit and execution times separately

Testing details

  • Test locally that the indexer is appropriately saving time information

Acceptance Criteria

  • The indexer saves information on deposit and execution times in the database
  • Passing unit tests

Add health endpoint

Add a /health endpoint for easy AWS deployment setup

Implementation details

The endpoint just needs to return a 200 OK response.
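
With Fastify this is only a couple of lines (sketch; server stands for the app's Fastify instance):

```typescript
import { FastifyInstance } from "fastify";

declare const server: FastifyInstance;

// AWS health checks only need a 200 response; no body inspection required.
server.get("/health", async (_request, reply) => reply.code(200).send({ status: "ok" }));
```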

Acceptance Criteria

  • Working health endpoint

Healthcheck does not work properly

When an error occurs, the indexer will stop, but since the healthcheck endpoint just checks whether the DB is up, the service will not be restarted.
