
Helium Python Helper

A helper module used across various Nebra repos to reduce duplicated code.

This package is used in a number of Nebra software repos and is available on both the main PyPI and Test PyPI indexes.

Helium Hardware Definitions

from hm_pyhelper.hardware_definitions import variant_definitions

This module defines a GPIO map for all of the different hardware combinations supported by the Nebra Helium Hotspot software.

All numbers below are GPIO/BCM numbers, not physical pin numbers.

Note: Light hotspot software will also work on all models listed as type "full".

Nebra Hotspots

| Model | ENV Identifier | SPI Bus | Reset Pin | Status LED | Button | Type | Cellular | Notes |
|---|---|---|---|---|---|---|---|---|
| Nebra Indoor Hotspot Gen 1 | NEBHNT-IN1 | 1.2 | 38 | 25 | 26 | Full | False | CM3 based |
| Nebra Outdoor Hotspot Gen 1 | NEBHNT-OUT1 | 1.2 | 38 | 25 | 24 | Full | True | CM3 based |
| Nebra Pi 0 Light Hotspot S | NEBHNT-LGT-ZS | 1.2 | 22 | 24 | 23 | Light | False | SPI based Ethernet |
| Nebra Pi 0 Light Hotspot X | NEBHNT-LGT-ZX | 1.2 | 22 | 24 | 23 | Light | False | USB based Ethernet |
| Nebra Beaglebone Light Hotspot | NEBHNT-BBB | 1.0 | 60 | 31 | 30 | Light | False | In planning |
| Nebra Pocket Beagle Light Hotspot | NEBHNT-PBB | 1.0 | 60 | 31 | 30 | Light | False | In planning |
| Nebra Hotspot HAT ROCK Pi 4 Indoor | NEBHNT-HHRK4 | 32766.0 | 149 | 156 (physical pin 18) | 154 (physical pin 16) | Full | False | In planning |
| Nebra Hotspot HAT ROCK Pi 4 Outdoor | NEBHNT-HHRK4-OUT | 32766.0 | 149 | 156 (physical pin 18) | 154 (physical pin 16) | Full | True | In planning |
| Nebra Hotspot HAT RPi | NEBHNT-HHRPI | 0.0 | 22 | 24 | 23 | Full | False | Should be compatible with Pi 3+ and 4 |
| Nebra Hotspot HAT RPi LIGHT | NEBHNT-HHRPL | 0.0 | 22 | 24 | 23 | Light | False | Light is compatible with all 40-pin headers |
| Nebra Hotspot HAT Tinkerboard 2 | NEBHNT-HHTK | 2.0 | 167 | 163 | 162 | Full | False | Light would be compatible on TK1 |

Third Party Hotspots

We may be adding support for other vendors' hotspots to use our software soon. Here are the variables for those.

These would also depend on their SoCs being supported by Balena.

| Model | SOC/SBC | ENV Identifier | SPI Bus | Reset Pin | Status LED | Button | Type | Cellular | Notes |
|---|---|---|---|---|---|---|---|---|---|
| Rak Hotspot Miner | BCM2711 (Pi4 2GB RAM) | COMP-RAKHM | 0.0 | 17 | 20 | 21 | Full | False | Only compatible with V2 hotspots with ECC key |
| OG Helium Hotspot | BCM2711 (Pi4 2GB RAM) | COMP-HELIUM | 0.0 | 17 | 20 | 21 | Full | False | |
| Syncrobit Hotspot 1 (Pi) | | | | | | | Full | False | |
| Syncrobit Hotspot 2 (RK) | | | | | | | Full | False | |
| Bobcat Miner 300 | | | | | | | Full | False | |
| SenseCAP M1 | BCM2711 (Pi4 2GB RAM) | COMP-SENSECAPM1 | 0.0 | 17 | 20 | 21 | Full | False | |

DIY Hotspots

The following DIY options are also supported for light hotspot software only.

Please note, DIY Hotspots do not earn HNT.

| Model | SOC/SBC | ENV Identifier | SPI Bus | Reset Pin | Status LED | Button | Type | Cellular | Notes |
|---|---|---|---|---|---|---|---|---|---|
| Pi Supply IoT LoRa Gateway HAT | RPi | DIY-PISLGH | 0.0 | 22 | | | Light | False | Any Pi with 40-pin header |
| RAK2287 | RPi | DIY-RAK2287 | 0.0 | 17 | | | Light | False | Any Pi with 40-pin header |

hardware_definitions

variant_definitions

A dictionary of all known Nebra hotspot variants. Not all variants are fully supported.

get_variant_attribute(variant_name, variant_attribute)

Returns the value of an attribute from a specific variant. Raises UnknownVariantException or UnknownVariantAttributeException if invalid parameters are supplied.
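The lookup behaviour can be sketched with a local stand-in dictionary. This is a self-contained mimic, not the package source: the real module exposes variant_definitions and these exception classes from hm_pyhelper.hardware_definitions, and the sample variant entry below is abbreviated.

```python
# Self-contained sketch of get_variant_attribute semantics.
# The real definitions live in hm_pyhelper.hardware_definitions.

class UnknownVariantException(Exception):
    pass

class UnknownVariantAttributeException(Exception):
    pass

# Stand-in for the real variant_definitions dictionary (abbreviated).
variant_definitions = {
    'NEBHNT-IN1': {'FRIENDLY': 'Nebra Indoor Hotspot Gen 1', 'RESET': 38},
}

def get_variant_attribute(variant_name, variant_attribute):
    # Unknown variant name -> UnknownVariantException
    if variant_name not in variant_definitions:
        raise UnknownVariantException(variant_name)
    variant = variant_definitions[variant_name]
    # Known variant but unknown attribute -> UnknownVariantAttributeException
    if variant_attribute not in variant:
        raise UnknownVariantAttributeException(variant_attribute)
    return variant[variant_attribute]

print(get_variant_attribute('NEBHNT-IN1', 'RESET'))  # 38
```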

logger

from hm_pyhelper.logger import get_logger
LOGGER = get_logger(__name__)
LOGGER.debug("message to log")

miner_param

retry_get_region(region_override, region_filepath)

Return the region from envvar region_override or from the contents of region_filepath

from hm_pyhelper.miner_param import retry_get_region
print(retry_get_region("US915", "/invalid/path"))
# US915

# echo "EU868" > /var/pktfwd/region
print(retry_get_region("", "/var/pktfwd/region"))
# EU868

gateway-mfr-rs (gateway_mfr)

This helper module brings in the armv6 build of gateway-mfr-rs (gateway_mfr) which allows us to program the ECC secure element chips in production.

We have a GitHub Action in this repo, update-gateway-mfr-rs.yml, which checks the upstream repo for new release versions every Sunday at midnight. If a new release is found, it opens a PR in this repo to update the version number. The action also supports workflow_dispatch, so you can trigger it manually from the Actions tab.

LockSingleton

LockSingleton prevents concurrent access to a resource across threads.

Methods

LockSingleton()

Creates a new LockSingleton object.

acquire(timeout = DEFAULT_TIMEOUT)

Waits until the resource is available. DEFAULT_TIMEOUT = 2 seconds

release()

Release the resource.

locked()

Checks whether the resource is currently locked.

Usage

from time import sleep

# LockSingleton and ResourceBusyError are provided by hm_pyhelper
# (exact import path may differ).
from hm_pyhelper.lock_singleton import LockSingleton, ResourceBusyError

lock = LockSingleton()

try:
    # try to acquire the resource; raises ResourceBusyError on timeout
    lock.acquire()

    # do some work
    print("Starting work...")
    sleep(5)
    print("Finished work!")

    # release the resource
    lock.release()
except ResourceBusyError:
    print("The resource is busy now.")

@lock_ecc decorator

@lock_ecc(timeout=DEFAULT_TIMEOUT, raise_resource_busy_exception=True)

This is a convenience decorator wrapping LockSingleton.

  • timeout: timeout value. DEFAULT_TIMEOUT = 2 seconds.
  • raise_resource_busy_exception: set True to raise an exception on timeout or other error; otherwise the error is only logged.

Usage

import subprocess

# lock_ecc, gateway_mfr_path and log_stdout_stderr are provided by
# hm_pyhelper (exact import paths omitted here).

@lock_ecc()
def run_gateway_mfr():
    return subprocess.run(
        [gateway_mfr_path, "key", "0"],
        capture_output=True,
        check=True
    )

gateway_mfr_result = run_gateway_mfr()
log_stdout_stderr(gateway_mfr_result)

helium/miner RPC

Send RPC commands to the miner container.

create_add_gateway_txn

Return a blockchain_txn_add_gateway_v1_pb2 signed by the gateway and formatted the same way as the smartphone app expects.

Usage:

# When using the miner container, use the JSON-RPC client:
client = MinerClient()

# When using the gateway-rs container, use the gRPC client instead:
# client = GatewayClient()

result = client.create_add_gateway_txn('owner_address', 'payer_address', 'gateway_address')

gateway_address is optional and will only be used to validate the returned payload if supplied.

Testing

To run tests:

poetry install --with dev
poetry run pytest --cov=hm_pyhelper --cov-fail-under=90

Referencing a branch for development

It is sometimes convenient to use recent changes in hm-pyhelper before an official release. To do so, first double check that you've added any relevant dependencies to the install_requires section of setup.py. Then add the following lines to the project's Dockerfile.

RUN pip3 install setuptools wheel
RUN pip3 install --target="$OUTPUTS_DIR" git+https://github.com/NebraLtd/hm-pyhelper@BRANCH_NAME

Releasing

To release, use the Github new release flow.

  1. Create a new tag in format vX.Y.Z. You can use a previously tagged commit, but this is not necessary.
  2. Make sure the tag you created matches the value in setup.py.
  3. Select master as the target branch. If you do not select the master branch, the tag should be in format vX.Y.Z-rc.N.
  4. Title: Release vX.Y.Z.
  5. Body:

Note: you can create the release notes automatically by selecting the "Auto-generate release notes" option on the releases page.

## What's Changed
* Foo
* Bar

**Full Changelog**: https://github.com/NebraLtd/hm-pyhelper/compare/v0.0.A...v0.0.Z

Release strategy

The automated GitHub Actions in this repo do the following:

  • all pushes / PRs, regardless of branch, trigger a build of the wheels and python package which are released as build artifacts (see below section)
  • pushes to master with an updated version number in setup.py are published to Test PyPI as well as uploaded as build artifacts (if the version number is not properly updated and duplicates a previous one, the push to Test PyPI will fail)
  • any tagged releases on master branch (see releasing process above) are built and published to PyPI as well as being uploaded as build artifacts

Test release artifacts

Note that artifacts (wheels and source) are uploaded to the GitHub Actions artifacts even when the build fails or isn't pushed to PyPI/Test PyPI due to not being on the master branch.

For example, this failed build has artifacts uploaded here.

These artifacts can be useful for testing releases without needing to bump version numbers.

Contributors

ccrisan, dependabot[bot], ilyastrodubtsev, kashifpk, kerrryu, kevinwassermann94, marvinmarnold, mr-bump, muratursavas, posterzh, pritamghanghas, radicale, robputt, shawaj, vpetersson


Issues

bump gateway-mfr-rs to latest version

Bump to latest https://github.com/helium/gateway-mfr-rs/releases/tag/v0.1.5

Also: is there a reason we are building this every time we build hm-pyhelper? Would it make more sense to build it only once per gateway-mfr-rs release, so hm-pyhelper builds can just pull in the pre-built release?

That would speed up the hm-pyhelper builds considerably. We could run a periodic (weekly, perhaps?) action that checks the gateway-mfr-rs repo, builds it, and creates a release with the file attached.

Include gateway-mfr-rs

We will need to perform various ECC operations inside the miner in various places. To simplify this, let's include gateway-mfr-rs in this package.

There are build instructions here.

The acceptance criteria is that you should be able to just run gateway-mfr after having installed this python package, so we need to include the ARM64 binary inside the package.

This should all be done in GitHub Actions.

Document release process

The release process for this repo is not obvious unless you've done it before. Let's add documentation in the README so any developer can easily contribute. Most recently, the process I attempted failed to push a release to PyPI. These were the steps I took.

  1. Open PR and tag commit: git tag v0.0.x -a
  2. Rebase and merge
  3. Create a new release
  • Add description of recent changes
  • Add link to PyPi release

I then tried deleting the tag, pulling master locally, and retagging it. After pushing the new v0.8.9 tag, PyPI still didn't release.

Add parameters to support USB concentrators

For the 5G software, we need to add a parameter to the variants file that indicates USB or SPI, i.e. the connection method of the concentrator.

This will then be picked up in hm-pktfwd to determine what to put in the config.json.

fix: is_rockpi and is_raspberry_pi should not break outside Balena

User Story
As a user, I should not hit an exception if running outside Balena.

Notes
We currently detect hardware by inspecting the BALENA_DEVICE_TYPE environment variable, created by Balena. In test cases, this will raise an exception.

    def is_raspberry_pi():
        """
        Pulled from
        https://www.balena.io/docs/reference/base-images/devicetypes/
        """
        device_type = os.getenv('BALENA_DEVICE_TYPE')
        device_type_match = [
            'raspberry-pi2',
            'raspberrypi3',
            'raspberrypi3-64',
            'raspberrypi4-64',
            'nebra-hnt',
            'raspberrypicm4-ioboard'
        ]
    
        for device in device_type_match:
>           if device in device_type:
E           TypeError: argument of type 'NoneType' is not iterable

Acceptance Criteria

  • is_rockpi and is_raspberry_pi should not raise exceptions if invoked outside Balena. Return False in both cases.
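A minimal sketch of the fix for is_raspberry_pi (the same None guard would apply to is_rockpi); the device-type list is taken from the traceback above:

```python
import os

def is_raspberry_pi():
    """
    Pulled from
    https://www.balena.io/docs/reference/base-images/devicetypes/
    """
    # BALENA_DEVICE_TYPE is unset outside Balena (e.g. in tests);
    # return False instead of iterating over None and raising TypeError.
    device_type = os.getenv('BALENA_DEVICE_TYPE')
    if device_type is None:
        return False

    device_type_match = [
        'raspberry-pi2',
        'raspberrypi3',
        'raspberrypi3-64',
        'raspberrypi4-64',
        'nebra-hnt',
        'raspberrypicm4-ioboard',
    ]
    return any(device in device_type for device in device_type_match)

print(is_raspberry_pi())
```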

MAC address lookup returns invalid JSON

If no MAC address is found, we currently return null:

> $ curl -s 192.168.202.90/initFile.txt | base64 -d
{"VA": "NEBHNT-OUT1", "FR": "868", "E0": "00:BD:27:EF:6D:E7", "W0": null, "RPI": "00000000589f8f22", "OK": "112mPWkGW55kcbQTgbtJvgAKMSTrEhHgavrdF1Cbu8FU85tTL4Nc", "PK": "112mPWkGW55kcbQTgbtJvgAKMSTrEhHgavrdF1Cbu8FU85tTL4Nc", "PF": false, "ID": "204f46e27d5a44bc65fac817ebc77e5b"}

This is invalid JSON:

> $ curl -s 192.168.202.90/initFile.txt | base64 -d | jq
parse error: Invalid numeric literal at line 1, column 2

Add TPM to variants

We need to add TPM option to variants file for 5g units

This will be consumed by the miner and gwmfr containers to initialise the TPM

rak has no led

MNTD miners by rak have no LED

Not sure about other versions of RAK

Case for increasing linter max-line-length from 80 to 120

Issue

The current linter max line length of 80 characters causes unnecessary code fragmentation. Sometimes, code rewritten to please the linter is actually harder to understand, which is contrary to why we lint in the first place.

Example:

Dicts with descriptive key names and medium sized values:

request_payload = {
    ADD_GATEWAY_TXN_NAME_KEY: payload[ADD_GATEWAY_TXN_PAYLOAD_KEY]['destination_name'],
    'serial_number': payload[SERIAL_NUMBER_KEY]
}

Line 2 of the above is > 80 chars so for the linter it would need to be re-written as:

request_payload = {
    ADD_GATEWAY_TXN_NAME_KEY:
    payload[ADD_GATEWAY_TXN_PAYLOAD_KEY]['destination_name'],
    'serial_number': payload[SERIAL_NUMBER_KEY]
}

But this one becomes confusing to read as one key value pair is on 2 lines while the next one is on a single line, so we format it further like:

request_payload = {
    ADD_GATEWAY_TXN_NAME_KEY:
    payload[ADD_GATEWAY_TXN_PAYLOAD_KEY]['destination_name'],
    
    'serial_number':
    payload[SERIAL_NUMBER_KEY]
}

This results in code that is both readable and linter-acceptable. However, it is still far less readable (especially for dicts, where the key and the start of the value are expected to be on the same line) than the initial version.

To avoid such scenarios, devs are tempted to use shorter variable/constant/key names, which again hurts readability.

Solution

Most modern IDEs have at least 120 characters of code area visible, even with sidebars open. Older editors like Vim don't suffer from this at all, as they have more screen real estate available. Increasing max-line-length to 120 is a good balance between readability and fewer code-formatting woes.
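If adopted, the change is a one-line linter-config tweak. This assumes flake8 configured via setup.cfg; the exact linter and config file used in this repo may differ:

```ini
# setup.cfg (or .flake8 / tox.ini, whichever this repo uses)
[flake8]
max-line-length = 120
```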

Don't run tests for `provision_key()`

Right now, we call get_gateway_mfr_test_result() to determine whether the device has been provisioned. While this works, it is marginally slower than simply calling get_public_keys_rust(), which yields roughly the same result.

Suggestions for refactoring and development flow speedup.

Issue

We usually define constants in code for key names and then use those constants throughout the repository. This saves us from having to change multiple locations when a key name needs to be renamed or a different name/key needs to be used. However, some of these key names are used across repositories, with each repository defining its own constants or hard-coding the values. This means changes are required in multiple repositories, each going through the whole PR → review → approve → merge → (optional) release process.

What this results in is an unnecessary waste of time when only a single field name is changed.

Because of this issue there has been resistance to adopting proper and consistent key naming across repositories. Some examples:

  • Usage of lower-case and upper-case key names within the same dict
  • Friendly key names and legacy key names in things like the /initFile.txt payload in hm-diag.
  • The same thing being given different names in different repositories, like the serial number in the initFile payload vs. HPT vs. the registration request to hm-dashboard.

Solution

Use a names package within hm-pyhelper. We should create a package named names (or something else like constants) where we store the names that are used across repositories. An example would be ADD_GATEWAY_TXN_PAYLOAD_KEY, which is currently called transaction in hm-dashboard and the backend DB, while hm-diag, hm-pyhelper and hpt refer to it as destination_add_gateway_txn.

If such names are defined in one central place then they can be used everywhere and changing them in the future becomes trivial except for when database migrations are involved.

Please note that these are only for names/keys that are used across repos. Things residing inside a single repo can still be handled as the developer sees fit.

@NebraLtd/developers looking for your opinion on this.
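A sketch of what such a shared-names module could look like. The module name, layout, and key values here are illustrative, not an actual part of the package:

```python
# Hypothetical hm_pyhelper/constants.py -- names and values below are
# illustrative examples only, based on the proposal above.

# Keys shared across hm-diag, hm-dashboard, hpt, etc.
ADD_GATEWAY_TXN_PAYLOAD_KEY = 'destination_add_gateway_txn'
SERIAL_NUMBER_KEY = 'serial_number'

# A consumer repo would then do, e.g.:
# from hm_pyhelper.constants import ADD_GATEWAY_TXN_PAYLOAD_KEY
request_payload = {
    ADD_GATEWAY_TXN_PAYLOAD_KEY: '...txn...',
    SERIAL_NUMBER_KEY: '00000000589f8f22',
}
print(sorted(request_payload))
```

Renaming a shared key then means editing one constant in hm-pyhelper and bumping the dependency, rather than touching every repo.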

Future considerations

For tasks that require changes in multiple repos and involve multiple developers working on parts of the task/epic; having a meeting of involved developers and agreeing on common names for data fields etc before implementation is advised. This just helps avoid unnecessary renaming later.

Create boiler plate python module

Notes
Objective is to abstract away reusable code. Identify shared components between hm-config/diag/etc

Acceptance criteria

  • setup module with README explaining how to import
  • publish to pypi
  • waiting for miner keys to exist. read from API instead of waiting for file to load.
  • miner keys parsing
  • diagnostics related things: MAC address
  • migrate https://github.com/NebraLtd/helium-hardware-definitions to hm-pyhelper (#3)

Read firmware version from miner API

  • Add functionality to read firmware version from Miner API
  • Check what the latest GA version is
  • Create new issues in other repos that need to be updated to reflect this new logic.

Once done, adopt this in the relevant repos (replaces NebraLtd/hm-diag#89).

Pisces P100

See the links below for a working setup on the Pisces P100.

Miner - radicale/hm-miner@90bb1c3
Diag - https://github.com/radicale/hm-diag
Config - https://github.com/radicale/hm-config
Helium miner software - https://github.com/radicale/helium-miner-software

Definition (from here):

# Pisces P100 Hotspot
'COMP-PISCESP100': {
    'FRIENDLY': 'Pisces P100',
    'SPIBUS': 'spidev0.0',
    'ECCBUS': 'i2c-0',
    'KEY_STORAGE_BUS': '/dev/i2c-0',
    'RESET': 23,
    'MAC': 'eth0',
    'STATUS': 17,
    'BUTTON': 22,
    'ECCOB': True,
    'TYPE': 'Full',
    'CELLULAR': False,
    'FCC_IDS': [],
    'CONTAINS_FCC_IDS': [],
    'IC_IDS': [],
    'CONTAINS_IC_IDS': []
},

Fix rockpi

if is_rockpi():
    extra_args = ['--path', '/dev/i2c-7']
    command.extend(extra_args)

This currently outputs gateway_mfr test --path /dev/i2c-7, but it needs to be gateway_mfr --path /dev/i2c-7 test.
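A minimal sketch of the ordering fix, assuming the command is assembled as a list before being passed to subprocess (the gateway_mfr_path value and build_command helper are illustrative, not the actual code):

```python
gateway_mfr_path = '/usr/bin/gateway_mfr'  # illustrative path

def build_command(is_rockpi):
    command = [gateway_mfr_path]
    # Global options like --path must come *before* the subcommand,
    # otherwise they are parsed as arguments to `test`.
    if is_rockpi:
        command.extend(['--path', '/dev/i2c-7'])
    command.append('test')
    return command

print(build_command(True))
# ['/usr/bin/gateway_mfr', '--path', '/dev/i2c-7', 'test']
```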

gateway_mfr isn't properly included

I just deployed the latest hm-diag, which includes hm-pyhelper. However, it isn't working at this point.

>>> from hm_pyhelper.miner_param import get_public_keys_rust
>>> get_public_keys_rust()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3.8/site-packages/hm_pyhelper/miner_param.py", line 15, in get_public_keys_rust
    run_gateway_mfr_keys = subprocess.run(
  File "/usr/lib/python3.8/subprocess.py", line 493, in run
    with Popen(*popenargs, **kwargs) as process:
  File "/usr/lib/python3.8/subprocess.py", line 858, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/usr/lib/python3.8/subprocess.py", line 1704, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: '/usr/lib/python3.8/site-packages/hm_pyhelper/gateway_mfr'

Finestra miner teardown

Observations so far.

  • Using a Raspberry Pi 4b 4gb
  • Seeed wm1302 concentrator
  • Appears to be using OpenBalena
  • Does not have an exposed Ethernet port (but later versions apparently do)

(Teardown photos attached: 20220121_151026, 20220121_150413, 20220121_151012)

is_rockpi isn't called in provision_key()

We need to refactor the provision_key() function to use the run_gateway_mfr() function rather than invoking gateway_mfr by itself, to make things DRY and also work on the RockPi.

Right now, because we're running gateway_mfr directly, we don't call the is_rockpi() function, and therefore don't take the different I2C interface into account.
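The refactor could be sketched as follows. This routes provisioning through the same helper that already knows about the RockPi's different I2C bus; all function bodies here are illustrative stand-ins, not the actual package code:

```python
# Sketch of the DRY refactor: provision_key() reuses run_gateway_mfr()
# instead of invoking gateway_mfr directly.

def is_rockpi():
    return True  # stand-in; the real check inspects BALENA_DEVICE_TYPE

def run_gateway_mfr(args):
    command = ['gateway_mfr']
    if is_rockpi():
        # Global options must precede the subcommand (see "Fix rockpi")
        command.extend(['--path', '/dev/i2c-7'])
    command.extend(args)
    return command  # the real helper would subprocess.run(command, ...)

def provision_key():
    # Because this goes through run_gateway_mfr(), the RockPi I2C path
    # handling now applies to provisioning automatically.
    return run_gateway_mfr(['provision'])

print(provision_key())
# ['gateway_mfr', '--path', '/dev/i2c-7', 'provision']
```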

Change SPIBUS?

Perhaps we should change SPIBUS to LORA_BUS as with 5g we will be using USB concentrators.

We could also use the full address like /dev/spidev1.0 or /dev/ttyACM0 for example

Related to #28

Test third party miners thoroughly

Once we have fully functional software for a third-party miner, we should do some thorough testing, perhaps similar to the deployment checks that @NebraLtd/tech-support do on the main repo.

In any case, we should check various functionality and document what needs to be tested before it goes live.

Use dbus or jsonrpc to create miner generated AddGateway transaction

Notes

Acceptance criteria

  • Method should accept a specific wallet to generate the transaction for
  • Throw an exception to indicate error
  • Reusable logic added to hm-pyhelper to generate AddGateway transaction
  • Confirm fee and amount are static values, and what their exact values should be
  • Returned format should match
{
  "address": "11TL62V8NYvSTXmV5CZCjaucskvNR1Fdar1Pg4Hzmzk5tk2JBac",
  "fee": 65000,
  "mode": "full",
  "owner": "14GWyFj9FjLHzoN3aX7Tq7PL6fEg4dfWPY8CrK8b9S5ZrcKDz6S",
  "payer": "138LbePH4r7hWPuTnK6HXVJ8ATM2QU71iVHzLTup1UbnPDvbxmr",
  "staking fee": 4000000,
  "txn": "CrkBCiEBrlImpYLbJ0z0hw5b4g9isRyPrgbXs9X+RrJ4pJJc9MkSIQA7yIy7F+9oPYCTmDz+v782GMJ4AC+jM+VfjvUgAHflWSJGMEQCIGfugfLkXv23vJcfwPYjLlMyzYhKp+Rg8B2YKwnsDHaUAiASkdxUO4fdS33D7vyid8Tulizo9SLEL1lduyvda9YVRCohAa5SJqWC2ydM9IcOW+IPYrEcj64G17PV/kayeKSSXPTJOMCEPUDo+wM="
}

Rework documentation

Need to rework docs surrounding variants to make sure it matches the variants file
