
blockchain-http


This is an Erlang application that serves up the Helium blockchain as stored by the blockchain-etl service and schema. The two applications rely on the schema being compatible in order to work together.

Developer Usage

  • Clone this repository

  • Create a .env file by copying .env.template and editing it to reflect your Postgres read-only and read-write access URLs (see the sketch after this list)

  • Run make release in the top level folder

  • Run make start to start the application. Logs will be at _build/default/rel/blockchain_http/log/*.
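As a rough sketch, a filled-in .env might look like the following. The variable names are taken from the Docker examples below; the authoritative list of keys is whatever .env.template contains, and the values here are placeholders only.

# illustrative values only -- copy .env.template and substitute your own connection strings
DATABASE_RO_URL=postgresql://user:pass@localhost:5432/helium_blockchain
DATABASE_RW_URL=postgresql://user:pass@localhost:5432/helium_blockchain
DATABASE_RO_POOL_SIZE=10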

Once started, the application will serve the blockchain through a number of routes. Documentation for these routes will be added soon.
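Assuming the service listens on port 8080 (the port published in the Docker examples below) and that the /v1/hotspots route referenced in the issues further down is available, a quick sanity check might look like:

curl http://localhost:8080/v1/hotspots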

Installing Ubuntu Required Packages

If running on Ubuntu, you will need the following packages installed before running make release:

wget https://packages.erlang-solutions.com/erlang-solutions_2.0_all.deb
sudo dpkg -i erlang-solutions_2.0_all.deb
sudo apt-get update
sudo apt install esl-erlang=1:23.2.3-1 cmake libsodium-dev libssl-dev
sudo apt install build-essential

WARNING

This application does NOT serve over TLS and does NOT rate-limit or access-control clients. Please run this service behind a load balancer that terminates SSL and provides rate and access control.

Using Docker

Building the Docker Image

docker build -t helium/api .

Running the Docker Container

docker run -d --init \
--restart unless-stopped \
--publish 8080:8080/tcp \
--name api \
--mount type=bind,source=$HOME/api_data,target=/var/data \
-e DATABASE_RO_URL=postgresql://user:[email protected]:5432/helium_blockchain \
-e DATABASE_RW_URL=postgresql://user:[email protected]:5432/helium_blockchain \
-e DATABASE_RO_POOL_SIZE=10 \
helium/api

Updating Docker

Navigate to your copy of the blockchain-http repository.

cd /path/to/blockchain-http

Stop the Docker container.

docker stop api

Remove the existing Docker container.

docker rm api

Update the repository.

git pull

Rebuild the Docker image.

docker build -t helium/api .

Run the updated Docker container.

docker run -d --init \
--restart unless-stopped \
--publish 8080:8080/tcp \
--name api \
--mount type=bind,source=$HOME/api_data,target=/var/data \
-e DATABASE_RO_URL=postgresql://user:[email protected]:5432/helium_blockchain \
-e DATABASE_RW_URL=postgresql://user:[email protected]:5432/helium_blockchain \
-e DATABASE_RO_POOL_SIZE=10 \
helium/api

blockchain-http's People

Contributors

densone, georgica, jadeallenx, jamesdobson, jontow, lthiery, madninja, riobah, vagabond, vihu


blockchain-http's Issues

Include location of hotspots in consensus transaction data

It would be awesome if the lat/lon of all 16 consensus group hotspots were included in the API response for a txn of type consensus_group_v1. I'm trying to plot them all, but I think right now the only way to do it is to make 16 separate requests, one per hotspot ID, to get their lat/lons.

Right now that transaction type returns a members array containing each hotspot's address as a string. It would be awesome if each element in members had at least:

  • latitude
  • longitude
  • hotspot address

instead of just the hotspot address string.

Remove Score from the API

Now that score is no longer used for PoC or for CG selection, remove the PoC based score from the API.

Disable cursor expiration on account list

Like with hotspots: Remove the requirement that all accounts are retrieved before the next invalidating block arrives. While it enables a consistent snapshot of all accounts at the reported height, most client interactions can't complete the paged account list fast enough.

Decoupling like with hotspots means reporting the block height for each account as it is returned, which allows clients to determine the height at which the account data is valid. The returned account list will have a mix of block heights.

  • Update API docs to match

Missing stake_validator, transfer_stake and unstake_validator txs

Need apis for validator metrics

Need apis for validator metrics for both testnet and mainnet:

  • provide the total number of validators broken down by stake status (staked, cooldown, in consensus, online, offline)
  • amount of HNT staked (individual nodes and total)
  • HNT paid out to validators (individual nodes and total)
  • consensus group members and size
  • state of validator node (online, offline, in consensus, cooldown)
  • date of initial stake
  • amount of HNT in cooldown
  • total staked vs total supply vs total circulating supply

We'd leverage the mainnet APIs to provide visuals in a validator section of Explorer.

Hotspots should always have reward_scale field (with a default value instead of the field not being present)

I'm not sure what the correct default value should be, but there are some cases where a hotspot doesn't have a reward_scale value. It's probably better if the field is always there with a default value than if it sometimes doesn't show up. Either 0 or 1 would probably make sense as a default value, or possibly null?

The two cases I've noticed where a hotspot doesn't have the field:

Duplicated results in account rewards API

/accounts/14cWRnJk7oZDeRSfo9yS3jpWfQmqZxNEzxQoygkoPBixLVSQaTg/rewards?min_time=2020-06-05T00:00:00.000000&max_time=2020-06-16T07:00:00.000000 returns one page of 100 results and a second page with 2 results, where the first result on the second page duplicates the last result of the first page. Changing the end of the date range to 2020-06-16T00:00:00.000000 returns a first page with 1 result and a second page with 99 results, with no duplicates.

Get count of a given transaction type in a given address's activity

It would be really handy to be able to get a count of a given transaction type for a given account.

I don't know if this would be the right way to do it, but I'm picturing something like:

https://api.helium.io/v1/accounts/:address/activity/add_gateway_v1/sum
https://api.helium.io/v1/accounts/:address/activity/assert_location_v1/sum

My specific request is so I can make a "maker dashboard" which shows how many hotspots each official maker account address has paid for (add_gateway_v1 & assert_location_v1 transactions):
[screenshot: maker dashboard showing counts of add_gateway_v1 and assert_location_v1 transactions per maker account]

And I'm thinking this could also be useful for other things, e.g. showing a visual breakdown of a hotspot/account's transaction types.

Right now the screenshot above is achieved by using the filtering in helium-js to pull activity filtered by transaction type, then displaying the length of the array.

Because I don't need any of the data in the transactions themselves, it feels wasteful to pull them all when just displaying the count.

Not sure if it'd be doable or harder, but it could also work nicely to have a generic route, something like https://api.helium.io/v1/accounts/:address/activity/sum, that would return an object mapping each transaction type to the number of transactions of that type.

Search hotspots by name, geography

We'd like to support text search for hotspots by fields like animal name and geography (city, state, country). We use Algolia for this currently, but it's getting expensive. Because the contents of the columns being searched are limited, using Postgres search shouldn't be terrible performance-wise.

Set Score to 1.0

As a precursor to #136 so we can give downstream consumers time to adjust, set all scores to 1.0 on API responses.

HNT Emitted endpoint

We'd like to start showing HNT emissions on the explorer page to provide insight into individual earnings (i.e. network emissions down = hotspot earnings down) and whether the network keeps pace with the 5MM per month emission schedule.

Time bucket this endpoint as we do with other earnings-related endpoints.

inconsistent results from API queries

Having a problem with hotspot rewards API calls where the rewards for a longer period don't equal the sum of the rewards for the shorter periods within it. For example:
The API returns rewards for 10/01 to 10/27: 33108296896
The API returns rewards for 10/27 to 10/28: 849902422
Sum of these: 33958199318
But the API returns rewards for 10/01 to 10/28: 34036803710, which is 78604392 greater than the sum of the individual periods.

[https://api.helium.io/v1/hotspots/1162fRqWoUGd8DUexz92d3ZsVaZre4wjXFTWEmLjK4WnXkmZMss/rewards/sum/?min_time=2020-10-01T00:00:00&max_time=2020-10-27T00:00:00]
{"data":{"sum":"33108296896","min_time":"2020-10-01T00:00:00Z","max_time":"2020-10-27T00:00:00Z"}}

[https://api.helium.io/v1/hotspots/1162fRqWoUGd8DUexz92d3ZsVaZre4wjXFTWEmLjK4WnXkmZMss/rewards/sum/?min_time=2020-10-27T00:00:00&max_time=2020-10-28T00:00:00]
{"data":{"sum":"849902422","min_time":"2020-10-27T00:00:00Z","max_time":"2020-10-28T00:00:00Z"}}

[https://api.helium.io/v1/hotspots/1162fRqWoUGd8DUexz92d3ZsVaZre4wjXFTWEmLjK4WnXkmZMss/rewards/sum/?min_time=2020-10-01T00:00:00&max_time=2020-10-28T00:00:00]
{"data":{"sum":"34036803710","min_time":"2020-10-01T00:00:00Z","max_time":"2020-10-28T00:00:00Z"}}

Witness Count: Valid Witnesses

The current route shows how many witnesses a Hotspot has, but does not necessarily separate valid witnesses from invalid ones.

We know that only valid witnesses contribute to earnings, so we should show only the relevant data to the user.

route to summarize account balance changes over a time period

For each wallet address, summarize how much HNT the wallet has received (via mining and payments) and how much HNT was debited (via burns, payments, asserts, and transaction fees).

We will need a daily value, weekly value, and monthly value.

Here's the UI we want to create:
[screenshot: proposed chart of daily HNT credits and debits]

Green is a credit of HNT, blue is a debit of HNT. You'll also see the exact HNT value in the top left of the chart.

cc @allenan

validation reason

It would be helpful for users of the API and app developers to see a reason for is_valid = false. Potentially add a field or change is_valid to be a tuple, although I'm not sure if the latter makes sense.

cc: @cokes518

Sort transactions optional param

Sometimes it could be useful to get transactions in chronological order (oldest first) instead of the default newest-to-oldest order.

For example, to find when a hotspot was first added, I have to go to the hotspot activity and page through transactions until I get to the beginning of time. This could be part of what's returned from the hotspot list endpoint (/hotspots) as well, but it doesn't appear to be.

Another example: if I'm trying to rederive a balance at a certain date, I'll want to look at the account transactions in reverse order.
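A hypothetical request shape for this (the order parameter name is purely illustrative, not an existing option):

# hypothetical: return activity oldest-first instead of the default newest-first
curl 'https://api.helium.io/v1/hotspots/:address/activity?order=asc'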

Feature: Provide an endpoint to find PoC Challenge txns

Summary

It would be helpful to be able to find PoC Challenge transactions on the basis of an Onion Key Hash alone. The ETL schema supposedly already has the data to serve such requests. An API endpoint for searching transactions this way would be great.

Background / Need

I am researching possible witnessing failures in the p2p network due to a possible DDoS ("The Thundering Herd") problem wherein well-heard hotspots induce so much traffic towards the challenger in a PoC that they crash the challenger, preventing it from recording witness receipts and even from checking back in within the timeout interval. This effectively erases the fact that the challenge happened at all.

The one piece of data that can help diagnose whether this is an issue is the Onion Key Hash, which can be calculated by examining the payload of a LoRa packet sent in response to the challenge.

Feature: earnings buckets

We want to be able to answer questions like "what was the total HNT earned by all hotspots on the network between two timestamps?" and "what was the median hotspot earning between those timestamps?". We also want to be able to separate out earnings by "buckets" between those timestamps. So for a given hotspot, we could chart a bar graph for each day's earnings within the past week. Ideally the design of this API will be flexible enough to serve different use cases.

[Request] assert_location_v1 transactions could return geocode data

I'm not sure how feasible it is, but it would be great if transactions of type assert_location_v1 could return the same "geocode" info as /hotspots/:address returns:
[screenshot: geocode fields returned by /hotspots/:address]

so that I could use the city/state/country/street names in things like meta tags or UIs in the frontend without having to make another request.

Come to think of it, it would be great if other transaction types returned the same thing, e.g. poc_receipts_v1 and poc_request_v1; it could be useful to display city/street names when showing challenges and their participants in the app or in Explorer.

Clear Individual witnesses after 7 days instead of resetting whole list

To better reflect the most recent witnesses of a hotspot, I would propose that witness lists not clear once every two weeks, but that an individual witness remain visible for 7 days (~11,340 blocks) after it last witnessed. Numerous people are confused by having all witnesses drop at once, and displaying that a hotspot has no witnesses is not accurate, especially if it successfully beaconed with multiple witnesses shortly before its list cleared.

I'm not sure exactly where this list is maintained (in Explorer, I imagine) or when the deletion process usually runs, but as long as there is a way to see each witness's most recent timestamp (block number) of when it witnessed a hotspot, entries could be set to delete after 11,340 or some other number of blocks. I'm not sure how often this would need to run to check the lists; at least once a day I'd think, or simply every time a hotspot is challenged.

Add optional parameters min and max time to hotspots/:address/challenges route

It would be really helpful if, with an API call, a user were able to get challenges within a certain time period; currently the hotspots/:address/challenges route does not allow filtering results by time period.

Could optional query parameters min_time and max_time be added to the hotspot challenges route, similar to what is present in hotspots/:address/rewards?
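A sketch of the proposed usage, mirroring the min_time/max_time parameters already accepted by the rewards routes (these parameters do not exist on the challenges route today):

curl 'https://api.helium.io/v1/hotspots/:address/challenges?min_time=2020-10-01T00:00:00&max_time=2020-10-28T00:00:00'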

Add lat/lng bounding box param to hotspots API

With the number of hotspots on the network growing at a fast clip, it's getting too expensive to fetch and store all hotspots in memory. Algolia's geospatial search, while fast, is pricey, only gives us a limited number of hotspots within a radius, and isn't integrated with helium-js to return Hotspot objects. Therefore it would be great to allow our existing hotspots route to take a bounding box, which would be the currently viewed map area, and return a cursor-paginated list of hotspots within that bounding box.

I'd propose a bounding box be defined as the SW + NE coordinates to be consistent with Mapbox. If indexes are added to the lat/lng columns, then it should be relatively efficient to do a BETWEEN query:

WHERE lat BETWEEN :minLat AND :maxLat AND lng BETWEEN :minLng AND :maxLng

We can choose to address the wrapping issue at 180 degrees longitude if we want, or just do the above as a first pass at a solution, given that we don't expect many (if any) queries over that part of the Pacific.
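A hypothetical request shape for the proposed bounding box (the swlat/swlon/nelat/nelon parameter names are illustrative only):

# SW corner followed by NE corner of the currently viewed map area
curl 'https://api.helium.io/v1/hotspots?swlat=37.70&swlon=-122.52&nelat=37.83&nelon=-122.35'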

Add hotspot earnings in account-rewards-sum API

Currently, the account rewards sum API (https://api.helium.io/v1/accounts/:address/rewards/sum) only returns account earnings for a given timeframe. It would help if the API could be modified to return the rewards sum for all hotspots in an account, broken down at the hotspot level, along with the account earnings for the given timeframe.

This would eliminate the need to make multiple API calls at the hotspot level if the account-level API can break down hotspot earnings.

Current output:
{
  "data": {
    "max_time": "2020-08-29T00:00:00Z",
    "min_time": "2020-08-27T00:00:00Z",
    "sum": "10000"
  }
}

Suggested future output:

{
  "data": {
    "max_time": "2020-08-29T00:00:00Z",
    "min_time": "2020-08-27T00:00:00Z",
    "sum": "10000",
    "hotspots": [
      {
        "address": "hotspot_address1",
        "name": "name-of-hotspot1",
        "sum": "3000"
      },
      {
        "address": "hotspot_address2",
        "name": "name-of-hotspot2",
        "sum": "7000"
      }
    ]
  }
}

Parse Binary Chain vars to be more readable

The API should parse binary chain variables and present them in a more legible way for clients.

https://api.helium.io/v1/transactions/vnEqwbKtFfFxXgYI_9L5Th0LRVkpJlsX-sQzZTh2VwY

{
  "data": {
    "version_predicate": 0,
    "vars": {
      "witness_redundancy": 4,
      "poc_witnesses_percent": 0.2124,
      "poc_reward_decay_rate": 0.8,
      "poc_path_limit": 1,
      "poc_challengees_percent": 0.0531,
      "hip17_res_9": "2,1,2",
      "hip17_res_8": "2,1,4",
      "hip17_res_7": "2,5,20",
      "hip17_res_6": "1,25,100",
      "hip17_res_5": "1,100,400",
      "hip17_res_4": "1,250,800",
      "hip17_res_3": "2,100000,100000",
      "hip17_res_2": "2,100000,100000",
      "hip17_res_12": "2,100000,100000",
      "hip17_res_11": "2,100000,100000",
      "hip17_res_10": "2,1,1",
      "hip17_res_1": "2,100000,100000",
      "hip17_res_0": "2,100000,100000",
      "hip17_interactivity_blocks": 3600,
      "density_tgt_res": 4
    },
    "unsets": [],
    "type": "vars_v1",
    "time": 1608142069,
    "proof": "MEUCIFSiomdFyceMis_PyYCExxDzn26Pl3fnbuagWHgG5ZkqAiEAyAKH6Z3eQPk8KbOZZc9S7sRrTeW1raloE4NyIFXPagg",
    "nonce": 50,
    "master_key": null,
    "key_proof": "",
    "height": 635109,
    "hash": "vnEqwbKtFfFxXgYI_9L5Th0LRVkpJlsX-sQzZTh2VwY",
    "cancels": []
  }
}

Rewards and stats for all hotspots

In addition to the rewards endpoints for individual hotspots and accounts provided here: #129, we need an additional endpoint that exposes rewards and stats across all hotspots.

I'd imagine the route would be: /v1/hotspots/rewards/stats and would take the same params as the other endpoints with regard to max/min time and bucket.

The one potentially tricky point is that in calculating the stats, the rewards should be considered per hotspot.

For example, if there were 5 hotspots in the entire network that were producing rewards for the week of 11/22, and their earnings for that week were:

  • hotspot A: 100 HNT
  • hotspot B: 500 HNT
  • hotspot C: 300 HNT
  • hotspot D: 100 HNT
  • hotspot E: 200 HNT

If we requested min_time=2020-11-22 and max_time=2020-11-29 and bucket=week, we'd expect to see:

  • total = 1200 HNT
  • median = 200 HNT

Note that the median here is taken over each hotspot's total earnings for the week (sorted: 100, 100, 200, 300, 500), so it is hotspot E's weekly total as compared to each of the other hotspots' weekly totals.
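As a quick sanity check of the example figures (a sketch using standard shell tools; the per-hotspot weekly totals are sorted before taking the middle value):

printf '%s\n' 100 500 300 100 200 | sort -n |
  awk '{a[NR]=$1; s+=$1} END {print "total:", s, "median:", a[int((NR+1)/2)]}'
# prints: total: 1200 median: 200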

Expose account's staked balance

The staked balance is currently subtracted from the balance value on an account. balance will continue to act as an account's liquid balance going forward. In addition, we should expose a staked_balance value.

Add `last_poc_challenge` to the hotspot API endpoint

Add the block height returned in last_poc_challenge to each hotspot record for the endpoint https://api.helium.io/v1/hotspots. This will assist in filtering out non-interactive hotspots in accordance with the HIP17 implementation.

Apis needed for mainnet validator-related explorer items

For validator-related explorer items for mainnet, we need the following data:

  • number of active validators
  • amount of HNT staked
  • total HNT paid out to validators
  • consensus group size
  • total staked vs total supply vs total circulating supply

Hotspot Beacon Rate

For each hotspot, bucketed over time, report how many beacons the hotspot was a challengee for.

This should unblock the app team for launch.

Eventually we'll follow this up with whether the Beacon had witnesses and valid witnesses, and the full monty: challenge pass/fail rate.

Add 'Scale' to Location output

Per #171 the reward_scale is visible for the hotspot. The final reward_scale is the product of all of its higher-resolution scales. This piece of business logic is complicated, as the density target of neighbouring hexes also needs to be taken into account.

For more transparency it would be helpful if the location API could accept as input any h3 hex, at any resolution, that is a parent of a hotspot, and return the scale (clipped/unclipped) in the response.

Additional apis related to unstaking

  • ability to track unstaking when it happens
  • track the amount of HNT in the unbonding period

We'd look to potentially provide a graph showing how much HNT is leaving the locked-up system.

503 first byte timeout on hotspot/activity URL

Provide summary of total amt of DC created by burning HNT

We'd like to know the total amount of DC generated over a period of time as well as how much HNT was burned to generate those data credits.

It should be a daily, weekly, monthly, and all-time summary of DC generated and HNT burned.

Helium SDK

I've played around with the current v1 API and hit quite a few roadblocks that make developing an app cumbersome, because it requires a lot of network requests and heavy client-side processing to get the data you want.

TL;DR: An API should be designed with implementation in mind, not to deliver a big dump of data that can then be parsed on the client.

I'm a fan of designing an API so that it only delivers the minimum data needed for the current request. I don't think the hotspots API makes much sense for any application other than one that would literally display every hotspot's stats on the same page. It would make more sense to me to just have the hotspots API deliver name and address, ideally as a map (I know y'all are switching to use algolia for search, but if a community developer wanted a simple search bar, a map of hotspot names should be fine). Then, you can use the address to request more information about a given hotspot.

Same with the blocks API. There doesn't seem to be a need to have an endpoint that delivers all blocks with all block data.

It also depends on what your goals are for these endpoints. Are they meant to just power Explorer, or are there plans to have an SDK that makes it easy for the community to build out apps for their own needs? If the latter, it should be designed in a way where the patterns are easy to understand and it's possible to "drill down" to get more refined data.

For example, say I want to find the last time a hotspot's witness witnessed the hotspot (say I had a hotspot page that showed its witnesses and next to each there is a "View Last Witness" button). Currently, I would need to call /hotspots/:address/challenges, loop through all of the challenges, and within each challenge, loop through all of the witnesses and keep doing so until I find the witness address I'm looking for. But ideally, all of that work should be done on the server that delivers the response. I should be able to call /hotspots/:address/witnesses/:address/challenges/offset/1 and be given the challenge in which this witness was last involved. If I change 1 to 5, it increases the offset and gives me the last 5. This allows the developer to use the Helium SDK/API to get specific data without heavy client-side processing or making multiple network requests.

Add locations/:h3index route for geo details

Would be awesome to be able to get geo details for a given h3 index for example at /locations/:h3index, like in the response for a hotspot:

geocode:
  cityId: "c2FuIGZyYW5jaXNjb2NhbGlmb3JuaWF1bml0ZWQgc3RhdGVz"
  longCity: "San Francisco"
  longCountry: "United States"
  longState: "California"
  longStreet: "Spear Street"
  shortCity: "SF"
  shortCountry: "US"
  shortState: "CA"
  shortStreet: "Spear St"

This came up for displaying nicer city/state/country info for the assert location transaction (helium/explorer#136), and I'm sure it could be used in other places as well.

Get rewards bucketed by given timezone offset

I'm not sure how hard or feasible this would be, but it would be really useful if we could get the rewards sent back based on a timezone offset we pass into the request. And if one isn't specified it could return in UTC like it does now.

I think this would make it a lot easier to show users the earnings over certain time periods in a way that easily lines up with how they think about days. E.g. it can be kind of disorienting when Sitebot stops adding rewards to "Today" and it starts counting them for "Tomorrow".

So if we could for example take the user's local timezone offset and pass that as part of the request to get earnings bucketed by day at -8:00 offset, that would be great.

But maybe this wouldn't be feasible to add on the backend, or perhaps there's a client-side way we can do this even given the UTC buckets... I'm not sure. Timezone math is confusing me :)
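If it does end up on the backend, a hypothetical request shape might be the following (the utc_offset parameter name is illustrative only; bucket is the parameter already discussed in the rewards/stats issues above):

# hypothetical: daily buckets aligned to a -08:00 offset instead of UTC
curl 'https://api.helium.io/v1/hotspots/:address/rewards/sum?min_time=2020-11-22T00:00:00&max_time=2020-11-29T00:00:00&bucket=day&utc_offset=-08:00'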
