facebookresearch / labgraph

LabGraph is a Python framework for rapidly prototyping experimental systems for real-time streaming applications. It is particularly well-suited to real-time neuroscience, physiology and psychology experiments.

License: MIT License


labgraph's Introduction

LabGraph

LabGraph is a streaming framework built by the Facebook Reality Labs Research team at Facebook. LabGraph is also mentioned in this story. More information can be found here.

Quick Start

Method 1 - using PyPI (Recommended)

Prerequisites:

  • Python 3.6+ (Python 3.8 recommended)
  • Mac (Big Sur, Monterey), Windows, and Linux (CentOS 7, CentOS 8, Ubuntu 20.04; Python 3.6/3.8)
  • Based on PyPA manylinux support, the following Linux systems are also supported: Fedora 32+, Mageia 8+, openSUSE 15.3+, Photon OS 4.0+ (3.0+ with updates), Ubuntu 20.04+

pip install labgraph

For more documentation on installation and troubleshooting, please use this link.

Method 2 - building from source code

Prerequisites:

cd labgraph
python setup.py install

Method 3 - using Docker

Prerequisites:

Setup:

docker login
docker build -t IMAGE_NAME:VERSION .
docker images
docker run -it -d IMAGE_ID
docker ps -a
docker exec -it CONTAINER_ID bash

Testing:

To make sure things are working you can run the example:

python -m labgraph.examples.simple_viz

You can also run the test suite as follows:

python -m pytest --pyargs labgraph

or

export LC_ALL=C.UTF-8
export LANG=en_US.utf-8
. test_script.sh

Now go to the index and documentation to learn more!

License: LabGraph is MIT licensed, as found in the LICENSE file.

labgraph's People

Contributors

bennaaym, catoen, celikbasak, dasfaust, djfurie, dtemir, fzchriha, gaurang-1402, jfresearcheng, kristinlam, mofe64, nik-sm, patrickmineault, pdamodara, sean-vu, yunusbcr, zdon-official, zina-kamel


labgraph's Issues

WebSocket Connection Interrupted on LabGraph Monitor

πŸ› Bug

WebSocket connection is interrupted when trying to display the real-time messages on Labgraph Monitor (frontend)

To Reproduce

Steps to reproduce the behavior:

  1. python3 labgraph/examples/labgraph_monitor_example.py
  2. cd extensions/prototypes/labgraph_monitor
  3. yarn start
  4. Click on the RandomMessage node

Expected behavior

After a few seconds the real-time data freezes; checking the terminal where the WebSocket server is running shows the following error:
- WS_SERVER_NODE: Raised an exception: RuntimeError("Task <Task pending name='Task-162949' coro=<WSAPIServerNode.ws_server_publisher() running at /Users/fatimazahra.chriha/Desktop/mlh/spring2022/labgraph/labgraph/websockets/ws_server/ws_api_node_server.py:125>> got Future <Future pending> attached to a different loop")
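This "attached to a different loop" error typically means an asyncio object (a Future or Queue) was created on one event loop and then awaited on another. A minimal, LabGraph-free sketch of the usual fix: hand work to the loop that owns the connection via asyncio.run_coroutine_threadsafe (the publish coroutine below is a hypothetical stand-in for the real WebSocket send, not the actual server code):

```python
import asyncio
import threading

def start_loop() -> asyncio.AbstractEventLoop:
    # Run a dedicated event loop in a worker thread, like a server would.
    loop = asyncio.new_event_loop()
    threading.Thread(target=loop.run_forever, daemon=True).start()
    return loop

async def publish(msg: str) -> str:
    await asyncio.sleep(0)  # stand-in for the real WebSocket send
    return "sent:" + msg

loop = start_loop()
# Schedule the coroutine onto the loop that owns the connection; the
# returned concurrent.futures.Future is safe to wait on from any thread.
# Creating an asyncio.Future here and awaiting it on `loop` is what
# triggers the "attached to a different loop" RuntimeError.
future = asyncio.run_coroutine_threadsafe(publish("hello"), loop)
print(future.result(timeout=5))
loop.call_soon_threadsafe(loop.stop)
```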


Environment


 - LabGraph Version:
 - OS: Mac
 - Python version: Python 3.8.13

ChatGPT-like conversational UI in localhost

πŸš€ Feature

Support a ChatGPT-like conversational UI on the local machine.

Key UI features include:

  1. Question typing box
  2. Question and Answer display

Other useful features include:

  1. Thumbs up/down option next to the answer
  2. Add a new chat
  3. Facebook authentication for user login

Additional context

  1. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/speechgpt
  2. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  3. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  4. Add the proper license header.

LabGraph Monitor Performance Metrics - YAML generation

πŸš€ Feature

Some performance metrics would be important for users to understand the real-time performance of the created graph:

  1. Latency: Average time delays between nodes; Average time delays for the critical path of the graph; Average time delays for node computation
  2. Throughput: Size of data passed per second (e.g. in bytes, kb or mb)
  3. Data rate: Number of messages passed per second

This feature would help users identify the performance bottlenecks of the graph.

Users can choose in the backend whether performance monitoring is activated.
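For concreteness, the three metrics above could be computed from per-message timestamps roughly as follows. This is a sketch only: MessageRecord and its field names are hypothetical, not an existing LabGraph API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MessageRecord:
    sent_at: float       # publisher timestamp, seconds (hypothetical field)
    received_at: float   # subscriber timestamp, seconds (hypothetical field)
    num_bytes: int       # serialized message size

def latency_s(records: List[MessageRecord]) -> float:
    """Average per-message delay between two nodes."""
    return sum(r.received_at - r.sent_at for r in records) / len(records)

def throughput_bps(records: List[MessageRecord]) -> float:
    """Bytes passed per second over the observed window."""
    window = max(r.received_at for r in records) - min(r.sent_at for r in records)
    return sum(r.num_bytes for r in records) / window

def data_rate_hz(records: List[MessageRecord]) -> float:
    """Messages passed per second over the observed window."""
    window = max(r.received_at for r in records) - min(r.sent_at for r in records)
    return len(records) / window
```

The critical-path and per-node-computation latencies would build on the same per-message timestamps, aggregated per edge and per node.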

Additional context

  1. An existing application can be found here: https://github.com/facebookresearch/labgraph/tree/main/extensions/yaml_support
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/yaml_support
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add the proper license header, for example:
#!/usr/bin/env python3
# Copyright 2004-present Facebook. All Rights Reserved.

Microphone integration with LabGraph

πŸš€ Feature

Support integration of the computer's built-in microphone with LabGraph. An example of device integration can be found here.

Additional context

  1. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/microphone
  2. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  3. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  4. Add the proper license header.

LabGraph Monitor Improvement - Add input/output message information in YAML generation

πŸš€ Feature

Add input/output message information (e.g. data type and message name) in YAML generation (backend).

Additional context

  1. An existing application can be found here: https://github.com/facebookresearch/labgraph/tree/main/extensions/yaml_support
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/yaml_support
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add the proper license header.

Graphviz Not Installed

πŸ“š Documentation

I had an error running python -m extensions.graphviz_support.graphviz_support.tests.test_lg_graphviz_api from the root directory, even after running python setup.py install for the extension:

$ python -m extensions.graphviz_support.graphviz_support.tests.test_lg_graphviz_api
E......
======================================================================
ERROR: test_build_graph (__main__.TestLabgraphGraphvizAPI)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/damir/miniconda3/envs/labgraph/lib/python3.8/site-packages/graphviz-0.19.1-py3.8.egg/graphviz/backend/execute.py", line 85, in run_check
    proc = subprocess.run(cmd, **kwargs)
  File "/home/damir/miniconda3/envs/labgraph/lib/python3.8/subprocess.py", line 489, in run
    with Popen(*popenargs, **kwargs) as process:
  File "/home/damir/miniconda3/envs/labgraph/lib/python3.8/subprocess.py", line 854, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/home/damir/miniconda3/envs/labgraph/lib/python3.8/subprocess.py", line 1702, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: PosixPath('dot')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/damir/Git/labgraph/extensions/graphviz_support/graphviz_support/tests/test_lg_graphviz_api.py", line 94, in test_build_graph
    build_graph("Demo", nodes, output_file_name, output_file_format)
  File "/home/damir/Git/labgraph/extensions/graphviz_support/graphviz_support/generate_graphviz/generate_graphviz.py", line 154, in build_graph
    graph_viz.render()
  File "/home/damir/miniconda3/envs/labgraph/lib/python3.8/site-packages/graphviz-0.19.1-py3.8.egg/graphviz/_tools.py", line 172, in wrapper
    return func(*args, **kwargs)
  File "/home/damir/miniconda3/envs/labgraph/lib/python3.8/site-packages/graphviz-0.19.1-py3.8.egg/graphviz/rendering.py", line 119, in render
    rendered = self._render(*args, **kwargs)
  File "/home/damir/miniconda3/envs/labgraph/lib/python3.8/site-packages/graphviz-0.19.1-py3.8.egg/graphviz/_tools.py", line 172, in wrapper
    return func(*args, **kwargs)
  File "/home/damir/miniconda3/envs/labgraph/lib/python3.8/site-packages/graphviz-0.19.1-py3.8.egg/graphviz/backend/rendering.py", line 317, in render
    execute.run_check(cmd,
  File "/home/damir/miniconda3/envs/labgraph/lib/python3.8/site-packages/graphviz-0.19.1-py3.8.egg/graphviz/backend/execute.py", line 88, in run_check
    raise ExecutableNotFound(cmd) from e
graphviz.backend.execute.ExecutableNotFound: failed to execute PosixPath('dot'), make sure the Graphviz executables are on your systems' PATH

----------------------------------------------------------------------
Ran 7 tests in 0.010s

I was able to resolve the issue by following WillKoehrsen/Data-Analysis#36 and running sudo apt install graphviz on Ubuntu 20.04.
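A small guard like the following could surface the missing binary as an actionable error up front instead of the deep traceback above (the install hints are suggestions, not an official requirement list):

```python
import shutil

def check_graphviz_installed() -> str:
    """Return the path to the Graphviz "dot" binary, or raise with hints."""
    path = shutil.which("dot")
    if path is None:
        raise RuntimeError(
            "Graphviz executables not found on PATH. Install them first, "
            "e.g. 'sudo apt install graphviz' (Ubuntu) or "
            "'brew install graphviz' (Mac)."
        )
    return path
```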

Fatima is having the same issue right now; we are trying to troubleshoot it on Mac.

How does logging work in LabGraph?

❓ Questions and Help

Question

LabGraph includes an HDF5Logger node, but the documentation does not explain how to use this node. How should the HDF5Logger be added to a Graph?

What I have tried

Using the simple_viz.py example, I have attempted to add a logger to the Demo graph:

class Demo(lg.Graph):
    AVERAGED_NOISE: AveragedNoise
    PLOT: Plot
    LOGGER: lg.HDF5Logger

    def setup(self) -> None:
        logger_config = lg.LoggerConfig(
            output_directory=os.getcwd(),
            streams_by_logging_id=self.AVERAGED_NOISE._get_streams())
        self.LOGGER.configure(logger_config)

        self.AVERAGED_NOISE.configure(
            AveragedNoiseConfig(sample_rate=SAMPLE_RATE,
                                num_features=NUM_FEATURES,
                                window=WINDOW))
        self.PLOT.configure(
            PlotConfig(refresh_rate=REFRESH_RATE, num_bars=NUM_FEATURES))

    def connections(self) -> lg.Connections:
        return ((self.AVERAGED_NOISE.OUTPUT, self.PLOT.INPUT),)

    def process_modules(self) -> Tuple[lg.Module, ...]:
        return (self.AVERAGED_NOISE, self.PLOT)

This graph runs normally (no errors), but it does not create an HDF5 file for logging. What is missing?

Support webcam camera streaming from PC to phone using mediapipe

πŸš€ Feature

Support webcam camera streaming from PC to phone using mediapipe

Additional context

  1. An example application can be found here: https://github.com/facebookresearch/labgraph/tree/main/devices/webcam
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/android/labgraph_vision
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add the proper license header.


LSL support

πŸš€ Feature

Support LabStreamingLayer sources

Motivation

LabGraph is meant to be used for processing streaming biosignals in real time. However, it doesn't currently have drivers for biosignal hardware (ExG, fNIRS, etc.), which means that users have to implement their own nodes to stream external data. This is a barrier to experimenting with LabGraph.

Pitch

LabStreamingLayer is a widely-used transport layer to stream biosignal data. It is supported by over 50 biosignal devices. An LSL sub node could be implemented in much the same way that the ZMQ sub node is implemented.

Alternatives

ZeroMQ-based streaming is a partial solution to streaming biosignal data into LG, however it still requires writing adapters to grab data, as most biosignal hardware does not directly stream to ZeroMQ.

Additional context

I can implement a minimalistic LSL sub node with tests, however I'd like to know if that fits with current roadmap expectations.

Replay VRS file (image data) and generate hand pose

πŸš€ Feature

Hand tracking enables the use of hands as an input method for AR/VR applications. When using hands as input modality, it delivers a new sense of presence, enhances social engagement, and delivers more natural interactions with fully tracked hands and articulated fingers.

MediaPipe can be used for real-time hand tracking. LabGraph can be used to record the data captured from a webcam. It could also be extended to other downstream applications.

This task is to replay a VRS file (image data) and generate hand poses. It requires the completion of a previous task: #90

Additional context

  1. MediaPipe can be found here: https://google.github.io/mediapipe/solutions/hands
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/webcam/
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add the proper license header.

Serial Port Support in LabGraph

πŸš€ Feature

Sensor and device integration is an important way to accelerate researchers' development. Serial port integration will allow new devices to be automatically usable in LabGraph. "Serial port" here refers to pyserial: https://pypi.org/project/pyserial/
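pyserial handles the raw byte I/O; the main logic such a node would add is framing the byte stream into messages. Here is that framing logic alone, sketched over plain bytes so it runs without hardware (the newline-delimited protocol is an assumption; real devices vary):

```python
from typing import Iterator

def frames(chunks: Iterator[bytes], delimiter: bytes = b"\n") -> Iterator[bytes]:
    # Reassemble delimiter-terminated frames from arbitrarily-sized chunks,
    # as produced by repeated serial reads. A trailing partial frame (no
    # delimiter yet) stays buffered and is not emitted.
    buffer = b""
    for chunk in chunks:
        buffer += chunk
        while delimiter in buffer:
            frame, buffer = buffer.split(delimiter, 1)
            yield frame
```

With real hardware, chunks would come from repeated serial.Serial(port, baudrate).read(n) calls, and each yielded frame would be published as a LabGraph message.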

Additional context

  1. Create the relevant files using the example from https://github.com/facebookresearch/labgraph/tree/main/labgraph/zmq_node
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/protocols/
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml

LabGraph Monitor Performance Metrics - Frontend Monitor Support

πŸš€ Feature

Some performance metrics would be important for users to understand the real-time performance of the created graph:

  1. Latency: Average time delays between nodes; Average time delays for the critical path of the graph; Average time delays for node computation
  2. Throughput: Size of data passed per second (e.g. in bytes, kb or mb)
  3. Data rate: Number of messages passed per second

This feature covers designing and implementing the frontend support for performance metrics in LabGraph Monitor.

Users can choose in the backend whether performance monitoring is activated.

Additional context

  1. An existing application can be found here: https://github.com/facebookresearch/labgraph/tree/main/extensions/prototypes/labgraph_monitor
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/prototypes/labgraph_monitor
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add the proper license header.

Array dimension problem with HDF5 Logger when it is used with np.ndarray

πŸ› Bug

The HDF5 Logger records only a portion of the NumPy array when it has two or more dimensions.

To Reproduce

Steps to reproduce the behavior:

  1. Create RandomMessage.py as,
import labgraph as lg
import numpy as np
# A data type used in streaming, see docs: Messages
class RandomMessage(lg.Message):
    timestamp: float
    data: np.ndarray

and save it in the same folder as simple_viz.py. Import it in simple_viz.py via from RandomMessage import RandomMessage.
!!! If you don't do this, it creates a pickling problem.

  2. Change NUM_FEATURES = 100 to NUM_FEATURES = 3 for easy tracking.

  3. Change timestamp=time.time(), data=np.random.rand(self.config.num_features) to add a second dimension to the data: timestamp=time.time(), data=np.random.rand(self.config.num_features, 2)

  4. Change all_data = np.stack([message.data for message in self.state.messages]) to all_data = np.stack([message.data[:, 0] for message in self.state.messages]) for consistency.

  5. Add a logger to simple_viz.py as:

# import os 
# from typing import Dict

    def logging(self) -> Dict[str, lg.Topic]:
        return {"AVERAGED_NOISE.GENERATOR.OUTPUT": self.AVERAGED_NOISE.GENERATOR.OUTPUT}


# Entry point: run the Demo graph
if __name__ == "__main__":
    
    options = lg.RunnerOptions(logger_config=lg.LoggerConfig(output_directory=os.getcwd(), recording_name='Data'))                                                                                       
    runner = lg.ParallelRunner(graph=Demo(), options=options)                                
    runner.run()

  6. Check Data.h5:
import h5py

filename = "Data.h5"
data = h5py.File(filename,'r')

AVERAGED_NOISE_GENERATOR_OUTPUT = data['AVERAGED_NOISE.GENERATOR.OUTPUT']

for message in AVERAGED_NOISE_GENERATOR_OUTPUT:
    print(message) # message[0] for timestamp, message[1] for data  

data.close()

Output:

(1.63596626e+09, array([0.41416002, 0.39325703, 0.66150395]))
(1.63596626e+09, array([0.62143319, 0.41737914, 0.61858237]))
(1.63596626e+09, array([0.44333838, 0.24507895, 0.20770642]))
(1.63596626e+09, array([0.32552889, 0.54726282, 0.05445356]))
(1.63596626e+09, array([0.36576081, 0.38641926, 0.8574152 ]))
(1.63596626e+09, array([0.97234542, 0.00512511, 0.00118584]))
(1.63596626e+09, array([0.11100949, 0.01632484, 0.42616371]))
...
...

Expected behavior

(1.63596626e+09, some random numpy array with shape of (3,2))
(1.63596626e+09, some random numpy array with shape of (3,2))
(1.63596626e+09, some random numpy array with shape of (3,2))
(1.63596626e+09, some random numpy array with shape of (3,2))
(1.63596626e+09, some random numpy array with shape of (3,2))
...
...

Environment


 - LabGraph Version (e.g., 1.0): 1.0.2 (installed via pip)
 - OS (e.g., Linux/Mac/Windows): Ubuntu 20.04
 - Python version: 3.6.15 

Integrate SpeechGPT with Lgpilot

πŸš€ Feature

This issue covers the integration of issues #97 and #99, so that users can display LabGraph code in the UI.

Additional context

  1. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/lgpilot
  2. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  3. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  4. Add the proper license header.

Real-time hand tracking using WebCam

πŸš€ Feature

Hand tracking enables the use of hands as an input method for AR/VR applications. When using hands as input modality, it delivers a new sense of presence, enhances social engagement, and delivers more natural interactions with fully tracked hands and articulated fingers.

MediaPipe can be used for real-time hand tracking. LabGraph can be used to record the data captured from a webcam. It could also be extended to other downstream applications.

This task is to use MediaPipe for real-time hand tracking and LabGraph for data logging (HDF5 format).

Additional context

  1. MediaPipe can be found here: https://google.github.io/mediapipe/solutions/hands
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/webcam/
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add the proper license header.

LabGraph Monitor Improvement - Graphviz for LabGraph graphs

πŸš€ Feature

We would like to be able to generate a graphviz visualization of the LabGraph topology. This will be a function that looks something like:
def generate_graphviz(graph: df.Graph, output_file: str) -> None:
The graph topology itself consists of the following:

  • Methods (these are the nodes in the graph)
  • Publishers (these are directed edges leaving a method)
  • Subscribers (these are directed edges entering a method)
  • Nodes (these are logical groupings of methods which always share the same process – technically not part of the topology, but the grouping may be useful to call out)
  • Topics
    Topics determine what the directed edges are. Any two connected topics will back the same stream. Each stream will have one publisher and one or more subscribers; the edges are all from that publisher to each subscriber.

We can inspect these properties of a graph by looking at these private attributes:
__topics__: The topics
__streams__: The streams
__methods__: The methods (publishers, subscribers, and "transformers" which are just methods with both @publisher and @subscriber decorators)
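As a rough, dependency-free sketch, generate_graphviz could emit DOT text directly, so nothing beyond the standard library is needed. The topology is passed here as plain lists/dicts (a hypothetical shape) rather than introspected from __topics__, __streams__, and __methods__ as described above:

```python
from typing import Dict, List

def generate_graphviz(methods: List[str], edges: Dict[str, List[str]],
                      output_file: str) -> str:
    """Write a DOT digraph: one node per method, one edge per pub->sub pair."""
    lines = ["digraph labgraph {"]
    for method in methods:
        lines.append(f'  "{method}";')
    # Each stream has one publisher and one or more subscribers; emit an
    # edge from the publisher to each subscriber.
    for publisher, subscribers in edges.items():
        for subscriber in subscribers:
            lines.append(f'  "{publisher}" -> "{subscriber}";')
    lines.append("}")
    dot = "\n".join(lines)
    with open(output_file, "w") as f:
        f.write(dot)
    return dot
```

Node groupings could be layered on top with DOT "subgraph cluster_*" blocks, one per logical node.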

Additional context

  1. An existing application can be found here: https://github.com/facebookresearch/labgraph/tree/main/extensions/graphviz_support
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/graphviz_support
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add the proper license header.

Pytest causes bus error on test_hang.

πŸ› Bug

When running pytest on the whole LabGraph module, via python -m pytest --pyargs -v labgraph --ignore=labgraph/devices, the test labgraph/runners/tests/test_exception.py::test_hang fails on [ProcessPhase.STOPPING] every time with a SIGABRT. However, running these tests alone, via python -m pytest --pyargs -v labgraph.runners.tests.test_exception, passes.

To Reproduce

Steps to reproduce the behavior:

  1. Build the Docker image using the Dockerfile, e.g., docker build . -t labgraph
  2. Run the tests inside a container: docker run -it labgraph bash
  3. Run the test suite with gdb debugging: gdb -ex r --args python3.9 -m pytest --pyargs -v labgraph --ignore=labgraph/devices

Since I ran with GDB, I was able to run a backtrace after the SIGABRT (find log file attached below):

labgraph/runners/tests/test_exception.py::test_hang[ProcessPhase.STARTING] [Detaching after fork from child process 2271]
[Detaching after fork from child process 2272]
PASSED                                                                                 [ 72%]
labgraph/runners/tests/test_exception.py::test_hang[ProcessPhase.READY] [Detaching after fork from child process 2339]
[Detaching after fork from child process 2340]
PASSED                                                                                    [ 73%]
labgraph/runners/tests/test_exception.py::test_hang[ProcessPhase.RUNNING] [Detaching after fork from child process 2407]
[Detaching after fork from child process 2408]
PASSED                                                                                  [ 73%]
labgraph/runners/tests/test_exception.py::test_hang[ProcessPhase.STOPPING] [Detaching after fork from child process 2490]
[Detaching after fork from child process 2491]

Thread 28 "python3.9" received signal SIGABRT, Aborted.
[Switching to Thread 0x7fff6bfff700 (LWP 1440)]
0x00007ffff711e387 in __GI_raise (sig=sig@entry=6) at ../nptl/sysdeps/unix/sysv/linux/raise.c:55
55        return INLINE_SYSCALL (tgkill, 3, pid, selftid, sig);
(gdb) bt
#0  0x00007ffff711e387 in __GI_raise (sig=sig@entry=6) at ../nptl/sysdeps/unix/sysv/linux/raise.c:55
#1  0x00007ffff711fa78 in __GI_abort () at abort.c:90
#2  0x00007ffff4218a95 in __gnu_cxx::__verbose_terminate_handler () at ../../../../libstdc++-v3/libsupc++/vterminate.cc:95
#3  0x00007ffff4216a06 in __cxxabiv1::__terminate (handler=<optimized out>) at ../../../../libstdc++-v3/libsupc++/eh_terminate.cc:38
#4  0x00007ffff4216a33 in std::terminate () at ../../../../libstdc++-v3/libsupc++/eh_terminate.cc:48
#5  0x00007ffff4216c53 in __cxxabiv1::__cxa_throw (obj=0x7fff64000940, tinfo=0x7ffff44a41f0 <typeinfo for std::runtime_error>, dest=
    0x7ffff422b1f0 <std::runtime_error::~runtime_error()>) at ../../../../libstdc++-v3/libsupc++/eh_throw.cc:87
#6  0x00007ffff4b9454a in cthulhu::Framework::validate() () from /home/builder/.local/lib/python3.9/site-packages/cthulhubindings.cpython-39-x86_64-linux-gnu.so
#7  0x00007ffff4bea5d0 in cthulhu::StreamConsumerIPC::update() () from /home/builder/.local/lib/python3.9/site-packages/cthulhubindings.cpython-39-x86_64-linux-gnu.so
#8  0x00007ffff4bea1f5 in cthulhu::StreamConsumerIPC::StreamConsumerIPC(cthulhu::StreamInterfaceIPC*, std::function<bool (cthulhu::StreamConfigIPC const&)> const&, std::function<bool (cthulhu::StreamSampleIPC const&)> const&, bool)::{lambda()#1}::operator()() const ()
   from /home/builder/.local/lib/python3.9/site-packages/cthulhubindings.cpython-39-x86_64-linux-gnu.so
#9  0x00007ffff4bebb2a in void std::__invoke_impl<void, cthulhu::StreamConsumerIPC::StreamConsumerIPC(cthulhu::StreamInterfaceIPC*, std::function<bool (cthulhu::StreamConfigIPC const&)> const&, std::function<bool (cthulhu::StreamSampleIPC const&)> const&, bool)::{lambda()#1}>(std::__invoke_other, cthulhu::StreamConsumerIPC::StreamConsumerIPC(cthulhu::StreamInterfaceIPC*, std::function<bool (cthulhu::StreamConfigIPC const&)> const&, std::function<bool (cthulhu::StreamSampleIPC const&)> const&, bool)::{lambda()#1}&&) () from /home/builder/.local/lib/python3.9/site-packages/cthulhubindings.cpython-39-x86_64-linux-gnu.so
#10 0x00007ffff4bebadf in std::__invoke_result<cthulhu::StreamConsumerIPC::StreamConsumerIPC(cthulhu::StreamInterfaceIPC*, std::function<bool (cthulhu::StreamConfigIPC const&)> const&, std::function<bool (cthulhu::StreamSampleIPC const&)> const&, bool)::{lambda()#1}>::type std::__invoke<cthulhu::StreamConsumerIPC::StreamConsumerIPC(cthulhu::StreamInterfaceIPC*, std::function<bool (cthulhu::StreamConfigIPC const&)> const&, std::function<bool (cthulhu::StreamSampleIPC const&)> const&, bool)::{lambda()#1}>(std::__invoke_result&&, (cthulhu::StreamConsumerIPC::StreamConsumerIPC(cthulhu::StreamInterfaceIPC*, std::function<bool (cthulhu::StreamConfigIPC const&)> const&, std::function<bool (cthulhu::StreamSampleIPC const&)> const&, bool)::{lambda()#1}&&)...) ()
   from /home/builder/.local/lib/python3.9/site-packages/cthulhubindings.cpython-39-x86_64-linux-gnu.so
#11 0x00007ffff4beba8c in void std::thread::_Invoker<std::tuple<cthulhu::StreamConsumerIPC::StreamConsumerIPC(cthulhu::StreamInterfaceIPC*, std::function<bool (cthulhu::StreamConfigIPC const&)> const&, std::function<bool (cthulhu::StreamSampleIPC const&)> const&, bool)::{lambda()#1}> >::_M_invoke<0ul>(std::_Index_tuple<0ul>) ()
   from /home/builder/.local/lib/python3.9/site-packages/cthulhubindings.cpython-39-x86_64-linux-gnu.so
#12 0x00007ffff4beba62 in std::thread::_Invoker<std::tuple<cthulhu::StreamConsumerIPC::StreamConsumerIPC(cthulhu::StreamInterfaceIPC*, std::function<bool (cthulhu::StreamConfigIPC const&)> const&, std::function<bool (cthulhu::StreamSampleIPC const&)> const&, bool)::{lambda()#1}> >::operator()() ()
   from /home/builder/.local/lib/python3.9/site-packages/cthulhubindings.cpython-39-x86_64-linux-gnu.so
#13 0x00007ffff4beba46 in std::thread::_State_impl<std::thread::_Invoker<std::tuple<cthulhu::StreamConsumerIPC::StreamConsumerIPC(cthulhu::StreamInterfaceIPC*, std::function<bool (cthulhu::StreamConfigIPC const&)> const&, std::function<bool (cthulhu::StreamSampleIPC const&)> const&, bool)::{lambda()#1}> > >::_M_run() ()
   from /home/builder/.local/lib/python3.9/site-packages/cthulhubindings.cpython-39-x86_64-linux-gnu.so
#14 0x00007ffff4f42860 in execute_native_thread_routine () from /home/builder/.local/lib/python3.9/site-packages/cthulhubindings.cpython-39-x86_64-linux-gnu.so
#15 0x00007ffff7bc6ea5 in start_thread (arg=0x7fff6bfff700) at pthread_create.c:307
#16 0x00007ffff71e6b0d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:111

It looks like the culprit is cthulhu::Framework::validate(), whose uncaught exception leads to the SIGABRT. This can be traced back to:

void Framework::validate() {
  auto memoryPool = Framework::instance().memoryPool();
  if (memoryPool != nullptr && !Framework::instance().memoryPool()->isValid()) {
    throw std::runtime_error("Framework is not valid");
  }
}

log.txt

Expected behavior

Test suite should not SIGABRT.

Environment

  • LabGraph Version (e.g., 1.0): 2.0.0 (pperanich/upgrade-python-version branch)
  • OS (e.g., Linux/Mac/Windows): Linux (inside manylinux2014 container)
  • Python version: 3.6-3.10

Human Validation and improvement on Code to Node function

πŸš€ Feature

This feature is to validate the results generated from #96 and make the relevant improvements.

Additional context

  1. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/lgpilot/
  2. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  3. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  4. Add the proper license header.

python executable not found on windows

On Windows, there is no binary named python3.exe or python36.exe (only python.exe), so this fails to find the executable:

for executable in ("python3", "python" + PYTHON_VERSION):
    if not shutil.which(executable):
        continue
    if _config_var(executable, "py_version_short") != PYTHON_VERSION:
        continue
    return _config_var(executable, key)
raise Exception("Could not find Python " + PYTHON_VERSION + " on PATH")

python used to be in this list, but it was removed in eda02c8. Can it be added back? The correct python version (3.6 or 3.8, never 2.x) should still be found by line 24.
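A sketch of the lookup with bare "python" restored as a fallback. The _config_var version check from the snippet above is elided here, and PYTHON_VERSION is assumed to be set elsewhere by the build script:

```python
import shutil

PYTHON_VERSION = "3.8"  # assumed; the real build script sets this elsewhere

def find_python(version: str) -> str:
    # Bare "python" goes last so the version-specific names still win on
    # Linux/Mac, while Windows (which ships only python.exe) is not skipped.
    for executable in ("python3", "python" + version, "python"):
        path = shutil.which(executable)
        if path is not None:
            return path
    raise Exception("Could not find Python " + version + " on PATH")
```

The version check would still need to run on the returned executable, as in the original loop, so a 2.x interpreter named python is rejected.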

LabGraph Support in Mac M1

πŸš€ Feature

Support LabGraph in Mac M1.

Motivation

Based on Wikipedia, the Apple M1 is an ARM-based system on a chip (SoC) designed by Apple Inc. as a central processing unit (CPU) and graphics processing unit (GPU) for its Macintosh computers and iPad Pro tablets. It also marks the third change to the instruction set used by Macintosh computers, 14 years after Apple switched Macs from PowerPC to Intel in 2006.

Additional context

Potential changes may need to be made for both Cthulhu and LabGraph.

Data Generation and Parser from OpenGPT

πŸš€ Feature

Generate a data set for training using OpenGPT, and create a parser to separate the code from the context.

The starting points would be filtering techniques: e.g. Kalman Filter and IIR filter.

Convert:

An infinite impulse response (IIR) filter is a type of digital filter that can be used to implement a variety of different frequency response characteristics. It is characterized by its recursive structure, which allows it to have an impulse response that extends indefinitely into the past. Here is some sample code that demonstrates how to implement an IIR filter using a direct form II structure:

import numpy as np

class IIRFilter:
    def __init__(self, b, a):
        self.b = b
        self.a = a
        self.z = np.zeros(max(len(b), len(a)))

    def filter(self, x):
        y = np.zeros_like(x)
        for n in range(len(x)):
            y[n] = self.b[0] * x[n]
            for m in range(1, len(self.b)):
                if n-m >= 0:
                    y[n] += self.b[m] * x[n-m]
            for m in range(1, len(self.a)):
                if n-m >= 0:
                    y[n] -= self.a[m] * y[n-m]
            self.z[0] = x[n]
            self.z[1:] = self.z[:-1]
        return y

to
result[0] = "An infinite impulse response (IIR) filter is a type of digital filter that can be used to implement a variety of different frequency response characteristics. It is characterized by its recursive structure, which allows it to have an impulse response that extends indefinitely into the past. Here is some sample code that demonstrates how to implement an IIR filter using a direct form II structure:"

result[1] =

import numpy as np

class IIRFilter:
    def __init__(self, b, a):
        self.b = b
        self.a = a
        self.z = np.zeros(max(len(b), len(a)))

    def filter(self, x):
        y = np.zeros_like(x)
        for n in range(len(x)):
            y[n] = self.b[0] * x[n]
            for m in range(1, len(self.b)):
                if n-m >= 0:
                    y[n] += self.b[m] * x[n-m]
            for m in range(1, len(self.a)):
                if n-m >= 0:
                    y[n] -= self.a[m] * y[n-m]
            self.z[0] = x[n]
            self.z[1:] = self.z[:-1]
        return y

The example above is a transformer node.
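A parser for this task could be as simple as splitting on fenced code blocks, assuming the generated answers fence code with triple backticks (a common but not guaranteed convention in GPT output). A hedged sketch:

```python
import re

def split_code_blocks(text: str):
    """Split a GPT-style answer into alternating prose and code segments.

    Assumes code is fenced with triple backticks, optionally followed by a
    language tag. Returns a list of {"kind": "text"|"code", "body": ...}.
    """
    # The capture group keeps the code bodies in the split result:
    # even indices are prose, odd indices are code.
    parts = re.split(r"```(?:\w+)?\n(.*?)```", text, flags=re.DOTALL)
    result = []
    for i, part in enumerate(parts):
        part = part.strip()
        if part:
            result.append({"kind": "code" if i % 2 == 1 else "text", "body": part})
    return result
```

For the IIR example above, result[0] would hold the explanatory sentence and result[1] the filter class source.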

For this issue, we would like to prioritize the following nodes (priority high to low):

  1. Transformer Node (e.g. IIRFilter)
  2. Source Node (e.g. Microphone integration)
  3. UI (e.g. Matplotlib-based plotting)

Additional context

  1. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/lgpilot
  2. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  3. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  4. Add a proper license header.

Real-time hand tracking logging via WebCam in VRS format

πŸš€ Feature

Hand tracking enables the use of hands as an input method for AR/VR applications. When using hands as input modality, it delivers a new sense of presence, enhances social engagement, and delivers more natural interactions with fully tracked hands and articulated fingers.

MediaPipe can be used for real-time hand tracking. LabGraph can be used to record the data captured from a webcam. It could also be extended to other downstream applications.

This task is to use VRS as the data logging format in LabGraph. It depends on the completion of a previous task: #89

Additional context

  1. pyVRS can be found [here](https://github.com/facebookresearch/vrs/tree/main/pyvrs)
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/webcam/
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add a proper license header.

Microphone integration with speechGPT

πŸš€ Feature

This issue covers the integration of issue 99 and issue 100, so that users can use their voice, captured via a microphone, as input to speechGPT.

Additional context

  1. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/speechgpt
  2. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  3. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  4. Add a proper license header.

Hand Tracking Visualization

πŸš€ Feature

Hand tracking enables the use of hands as an input method for the Oculus Quest headsets. When using hands as input modality, it delivers a new sense of presence, enhances social engagement, and delivers more natural interactions with fully tracked hands and articulated fingers.

We can use LabGraph to record the data captured from Quest headset and visualize the data (e.g. in Unity). Oculus Hand Tracking API can be found here.

One set of Quest 2 and Link Cable could be provided for a US-based user who has contributed to LabGraph (subject to review/approval).

This task is a follow-up to #81, focused on visualizing the obtained data.

Additional context

  1. Existing application can be found [here](https://developer.oculus.com/documentation/unity/unity-handtracking/)
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/quest2/visualization
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add a proper license header.

Test suite crashes while running test_process_manager.py

πŸ› Bug

When running the test suite with python -m pytest --pyargs labgraph, pytest crashes at test_process_manager.py. Strangely, the tests complete successfully when test_process_manager.py is run independently.

To Reproduce

Steps to reproduce the behavior:

  1. Run the full test suite: python -m pytest --pyargs labgraph. Here's the output on my system:
====================================================== test session starts ======================================================
platform linux -- Python 3.6.8, pytest-3.10.1, py-1.10.0, pluggy-0.13.1
rootdir: /home/snel, inifile:
plugins: mock-2.0.0
collected 115 items                                                                                                             

miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/_cthulhu/tests/test_clock.py ...                            [  2%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/_cthulhu/tests/test_cthulhu.py ..                           [  4%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/events/tests/test_event_generator.py ............           [ 14%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/events/tests/test_event_generator_node.py .....             [ 19%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/graphs/tests/test_config.py ...                             [ 21%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/graphs/tests/test_graph.py ..                               [ 23%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/graphs/tests/test_group.py ................                 [ 37%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/graphs/tests/test_harness.py .....                          [ 41%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/graphs/tests/test_module.py ..                              [ 43%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/graphs/tests/test_node.py ....                              [ 46%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/graphs/tests/test_publisher.py ..                           [ 48%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/graphs/tests/test_subscriber.py ..                          [ 50%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/loggers/tests/test_loggers.py .                             [ 51%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/messages/tests/test_message.py ....................         [ 68%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_aligner.py ss                            [ 70%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_cpp.py .                                 [ 71%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_exception.py .........                   [ 79%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_launch.py .                              [ 80%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py ...........Aborted (core dumped)
  2. Run the tests in labgraph/runners/tests with verbose output: python -m pytest --pyargs -v labgraph.runners.tests. Here's the output on my system:
====================================================== test session starts ======================================================
platform linux -- Python 3.6.8, pytest-3.10.1, py-1.10.0, pluggy-0.13.1 -- /home/snel/miniconda3/envs/labgraph/bin/python
cachedir: .pytest_cache
rootdir: /home/snel, inifile:
plugins: mock-2.0.0
collected 31 items                                                                                                              

miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_aligner.py::test_slow_align_interval SKIPPED [  3%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_aligner.py::test_align_two_streams SKIPPED [  6%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_cpp.py::test_cpp_graph PASSED            [  9%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_exception.py::test_local_throw[MainThrowerNode] PASSED [ 12%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_exception.py::test_local_throw[BackgroundThrowerNode] PASSED [ 16%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_exception.py::test_local_throw[PublisherThrowerNode] PASSED [ 19%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_exception.py::test_local_throw[PublisherSubscriberThrowerNode] PASSED [ 22%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_exception.py::test_parallel_throw[PublisherThrowerGraph] PASSED [ 25%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_exception.py::test_parallel_throw[SubscriberThrowerGraph] PASSED [ 29%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_exception.py::test_parallel_throw[MainThrowerGraph] PASSED [ 32%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_exception.py::test_parallel_throw[BackgroundThrowerGraph] PASSED [ 35%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_exception.py::test_logger_throw PASSED   [ 38%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_launch.py::test_launch PASSED            [ 41%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_normal PASSED   [ 45%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_crash[ProcessPhase.STARTING] PASSED [ 48%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_crash[ProcessPhase.READY] PASSED [ 51%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_crash[ProcessPhase.RUNNING] PASSED [ 54%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_crash[ProcessPhase.STOPPING] PASSED [ 58%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_exception[ProcessPhase.STARTING] PASSED [ 61%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_exception[ProcessPhase.READY] PASSED [ 64%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_exception[ProcessPhase.RUNNING] PASSED [ 67%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_exception[ProcessPhase.STOPPING] PASSED [ 70%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_hang[ProcessPhase.STARTING] PASSED [ 74%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_hang[ProcessPhase.READY] PASSED [ 77%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_hang[ProcessPhase.RUNNING] Aborted (core dumped)
  3. Run only test_process_manager.py: python -m pytest --pyargs -v labgraph.runners.tests.test_process_manager. Here's the output on my system:
====================================================== test session starts ======================================================
platform linux -- Python 3.6.8, pytest-3.10.1, py-1.10.0, pluggy-0.13.1 -- /home/snel/miniconda3/envs/labgraph/bin/python
cachedir: .pytest_cache
rootdir: /home/snel, inifile:
plugins: mock-2.0.0
collected 13 items                                                                                                              

miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_normal PASSED   [  7%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_crash[ProcessPhase.STARTING] PASSED [ 15%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_crash[ProcessPhase.READY] PASSED [ 23%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_crash[ProcessPhase.RUNNING] PASSED [ 30%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_crash[ProcessPhase.STOPPING] PASSED [ 38%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_exception[ProcessPhase.STARTING] PASSED [ 46%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_exception[ProcessPhase.READY] PASSED [ 53%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_exception[ProcessPhase.RUNNING] PASSED [ 61%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_exception[ProcessPhase.STOPPING] PASSED [ 69%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_hang[ProcessPhase.STARTING] PASSED [ 76%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_hang[ProcessPhase.READY] PASSED [ 84%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_hang[ProcessPhase.RUNNING] PASSED [ 92%]
miniconda3/envs/labgraph/lib/python3.6/site-packages/labgraph/runners/tests/test_process_manager.py::test_hang[ProcessPhase.STOPPING] PASSED [100%]

================================================== 13 passed in 37.99 seconds ===================================================

In steps 1 and 2, pytest crashes while running test_process_manager.py. In step 3, we run only the tests in test_process_manager.py, and no crash occurs.

The verbose output in step 2 shows that the crash happens in the test_hang function of test_process_manager.py when the hang_phase is set to ProcessPhase.RUNNING. Here's a link to that function:

@pytest.mark.parametrize(
    "hang_phase",
    (
        ProcessPhase.STARTING,
        ProcessPhase.READY,
        ProcessPhase.RUNNING,
        ProcessPhase.STOPPING,
    ),
)
@local_test
def test_hang(hang_phase: ProcessPhase) -> None:
    """
    Tests that we can run multiple processes where one of them hangs.
    """
    manager = ProcessManager(
        processes=(
            ProcessInfo(
                module=__name__,
                name="proc1",
                args=(
                    "--manager-name",
                    "test_manager",
                    "--shutdown",
                    "HANG",
                    "--last-phase",
                    hang_phase.name,
                ),
            ),
            ProcessInfo(
                module=__name__,
                name="proc2",
                args=("--manager-name", "test_manager", "--shutdown", "NORMAL"),
            ),
        ),
        name="test_manager",
        startup_period=TEST_STARTUP_PERIOD,
        shutdown_period=TEST_SHUTDOWN_PERIOD,
    )
    with pytest.raises(ProcessManagerException) as ex:
        manager.run()
    assert ex.value.failures == {"proc1": ProcessFailureType.HANG, "proc2": None}

Does anyone know what would cause this crash?

Expected behavior

The full test suite should run to completion. Also, the results of running one test script should match the results of running that same script within the full test suite.

Environment

  • LabGraph Version (e.g., 1.0): 1.0.0 (installed via pip)
  • OS (e.g., Linux/Mac/Windows): CentOS 8
  • Python version: 3.6.8

Support phone streaming object detection result to cloud

πŸš€ Feature

Support phone streaming object detection result to cloud

Additional context

Example application can be found [here](https://github.com/facebookresearch/labgraph/tree/main/devices/webcam)
The code should be added under https://github.com/facebookresearch/labgraph/tree/main/android/labgraph_vision
Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
Add a proper license header.

LabGraph Monitor API Improvement

πŸš€ Feature

LabGraph Monitor currently requires the user to define the WebSocket connections when generating the YAML file passed to the frontend. Example from labgraph_monitor_example.py (link):

    def connections(self) -> lg.Connections:
        return (
            (self.NOISE_GENERATOR.NOISE_GENERATOR_OUTPUT, self.ROLLING_AVERAGER.ROLLING_AVERAGER_INPUT),
            (self.NOISE_GENERATOR.NOISE_GENERATOR_OUTPUT, self.AMPLIFIER.AMPLIFIER_INPUT),
            (self.NOISE_GENERATOR.NOISE_GENERATOR_OUTPUT, self.ATTENUATOR.ATTENUATOR_INPUT),
            (self.NOISE_GENERATOR.NOISE_GENERATOR_OUTPUT, self.SERIALIZER.SERIALIZER_INPUT_1),
            (self.ROLLING_AVERAGER.ROLLING_AVERAGER_OUTPUT, self.SERIALIZER.SERIALIZER_INPUT_2),
            (self.AMPLIFIER.AMPLIFIER_OUTPUT, self.SERIALIZER.SERIALIZER_INPUT_3),
            (self.ATTENUATOR.ATTENUATOR_OUTPUT, self.SERIALIZER.SERIALIZER_INPUT_4),
            (self.SERIALIZER.SERIALIZER_OUTPUT, self.WS_SERVER_NODE.topic),
        )

Can we simplify the API to the following instead?

    def connections(self) -> lg.Connections:
        return (
            (self.NOISE_GENERATOR.NOISE_GENERATOR_OUTPUT, self.ROLLING_AVERAGER.ROLLING_AVERAGER_INPUT),
            (self.NOISE_GENERATOR.NOISE_GENERATOR_OUTPUT, self.AMPLIFIER.AMPLIFIER_INPUT),
            (self.NOISE_GENERATOR.NOISE_GENERATOR_OUTPUT, self.ATTENUATOR.ATTENUATOR_INPUT),
        )
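One way to support the simplified API would be a helper that appends the serializer and WebSocket wiring automatically. The sketch below is purely illustrative: connections are modeled as (source, destination) strings rather than real lg.Topic objects, and the SERIALIZER/WS_SERVER_NODE naming is an assumption about the monitor's internals, not the actual LabGraph API:

```python
def with_monitor_plumbing(user_connections,
                          serializer="SERIALIZER",
                          ws_node="WS_SERVER_NODE"):
    """Return the user's connections plus auto-generated monitor wiring.

    Every distinct output that appears as a source is routed into a
    numbered serializer input, and the serializer output is routed to the
    websocket server topic.
    """
    connections = list(user_connections)
    # collect distinct sources in first-seen order
    sources = []
    for src, _dst in user_connections:
        if src not in sources:
            sources.append(src)
    for i, src in enumerate(sources, start=1):
        connections.append((src, f"{serializer}.INPUT_{i}"))
    # serializer output feeds the websocket server topic
    connections.append((f"{serializer}.OUTPUT", f"{ws_node}.topic"))
    return tuple(connections)
```

With this expansion rule the user would only list the three domain connections, and the monitor would derive the rest; whether intermediate node outputs (e.g. the averager's) should also be serialized is a design question the real implementation would have to answer.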

Additional context

  1. Existing application can be found [here](https://github.com/facebookresearch/labgraph/tree/main/extensions/yaml_support)
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/yaml_support
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add a proper license header, for example:
#!/usr/bin/env python3
# Copyright 2004-present Facebook. All Rights Reserved.

Time intervals of messages are not equally separated for fixed sample rate.

❓ Questions and Help

Timestamps of messages are not equally separated at a fixed sample rate. How can I get equal time separation between samples?

To Reproduce

Steps to reproduce the behavior:
In simple_viz.py, change line 49 as follows:

async def generate_noise(self) -> lg.AsyncPublisher:
    tmp = 0
    while True:
        now = time.time()
        print((now - tmp) * 1000)
        tmp = now
        yield self.OUTPUT, RandomMessage(
            timestamp=now, data=np.random.rand(self.config.num_features)
        )
        await asyncio.sleep(1 / self.config.sample_rate)

With SAMPLE_RATE = 10.0, I get these time intervals (in ms) as output:

1634770261291.764
101.32384300231934
102.19454765319824
102.87952423095703
102.3719310760498
101.06134414672852
103.06859016418457
102.8130054473877
104.36487197875977
102.22911834716797
101.91011428833008
102.6158332824707
103.40309143066406
102.59652137756348
...
...

How can I have 100.0 ms time interval between samples?

P.S. For low sampling rates this is not a big deal, but for high-frequency sensors it becomes important.
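The drift comes from sleeping a constant 1/sample_rate after each iteration, which adds the loop body's own execution time to every interval. Sleeping until an absolute deadline instead keeps the average interval at the nominal period, though individual intervals still jitter by the event loop's scheduling granularity. A sketch of the idea, independent of LabGraph:

```python
import asyncio
import time

async def fixed_rate_ticks(sample_rate: float, n: int):
    """Yield n timestamps at a fixed rate without cumulative drift.

    Instead of sleeping a constant period after each iteration, sleep only
    the time remaining until the next absolute deadline, so per-iteration
    overhead does not accumulate.
    """
    period = 1.0 / sample_rate
    next_deadline = time.perf_counter()
    for _ in range(n):
        yield time.perf_counter()
        next_deadline += period
        # sleep only the remaining time until the deadline (never negative)
        await asyncio.sleep(max(0.0, next_deadline - time.perf_counter()))
```

The same pattern can be applied inside generate_noise by tracking a deadline variable and awaiting the remaining time rather than a fixed 1 / self.config.sample_rate.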

Apache Ant causes 404 error during docker build

πŸ› Bug

docker build fails with an HTTP 404 error when trying to download Apache Ant

To Reproduce

Steps to reproduce the behavior:

  1. Navigate to the root directory of the labgraph repository.
  2. Run docker build .

Expected behavior

In the following line, Docker attempts to download Apache Ant 1.10.9 from a link that no longer exists:

RUN wget https://downloads.apache.org/ant/binaries/apache-ant-1.10.9-bin.zip

Instead, we should download Apache Ant 1.10.9 from the archive:

RUN wget https://archive.apache.org/dist/ant/binaries/apache-ant-1.10.9-bin.zip

Environment

  • LabGraph Version (e.g., 1.0): 1.0.0, commit 97e0f18
  • OS (e.g., Linux/Mac/Windows): Ubuntu 18.04.5 LTS
  • Python version: 3.6

Message-based I/O Support in LabGraph

πŸš€ Feature

Sensor and device integration is an important part of accelerating researchers' development. This Message-based I/O integration will allow new devices to be automatically usable in LabGraph. Message-based I/O here refers to cobs-python: https://pypi.org/project/cobs/

Additional context

  1. Create relevant files using the example from https://github.com/facebookresearch/labgraph/tree/main/labgraph/zmq_node
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/protocols/
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
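For reference, COBS framing (the technique cobs-python implements) removes all zero bytes from a payload so that 0x00 can serve as an unambiguous packet delimiter on a byte stream. Below is a compact pure-Python sketch of the idea; the extension itself should use the cobs package rather than this hand-rolled version, which handles the 254-byte-run edge case non-canonically (though it still round-trips):

```python
def cobs_encode(data: bytes) -> bytes:
    """Consistent Overhead Byte Stuffing: produce zero-free output."""
    out = bytearray()
    idx = 0
    while True:
        block = data[idx:idx + 254]      # one code byte covers at most 254 bytes
        zero_pos = block.find(0)
        if zero_pos == -1:
            out.append(len(block) + 1)   # code = block length + 1 (never 0)
            out.extend(block)
            idx += len(block)
            if len(block) < 254:
                break
        else:
            out.append(zero_pos + 1)
            out.extend(block[:zero_pos])
            idx += zero_pos + 1          # skip the zero byte itself
    return bytes(out)

def cobs_decode(data: bytes) -> bytes:
    out = bytearray()
    idx = 0
    while idx < len(data):
        code = data[idx]
        out.extend(data[idx + 1:idx + code])
        idx += code
        if code < 255 and idx < len(data):
            out.append(0)                # an implied zero follows short blocks
    return bytes(out)
```

Framing a message is then just cobs_encode(payload) + b"\x00", and a receiver can split the incoming stream on zero bytes.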

Real-time Hand Tracking and Logging

πŸš€ Feature

Hand tracking enables the use of hands as an input method for the Oculus Quest headsets. When using hands as input modality, it delivers a new sense of presence, enhances social engagement, and delivers more natural interactions with fully tracked hands and articulated fingers.

We can use LabGraph to record the data captured from Quest headset and visualize the data (e.g. in Unity). Oculus Hand Tracking API can be found here.

One set of Quest 2 and Link Cable could be provided for a US-based user who has contributed to LabGraph (subject to review/approval).

This project provides an opportunity for a contributor to learn more about Oculus SDK and python programming.

Additional context

  1. Existing application can be found [here](https://developer.oculus.com/documentation/unity/unity-handtracking/)
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/quest2
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add a proper license header.

No module named 'buck_ext'

πŸ› Bug

Upgrading LabGraph with,

python3.6 -m pip install labgraph -U

fails with a "No module named 'buck_ext'" error.

Environment

 - LabGraph Version (e.g., 1.0): 1.0.2 (installed via pip)
 - OS (e.g., Linux/Mac/Windows): Ubuntu 20.04
 - Python version: 3.6.15 

Support object detection on Android phone using Mediapipe and a mock video stream feed

πŸš€ Feature

Support object detection on Android phone using Mediapipe and a mock video stream feed

Additional context

Example application can be found [here](https://github.com/facebookresearch/labgraph/tree/main/devices/webcam)
The code should be added under https://github.com/facebookresearch/labgraph/tree/main/android/labgraph_vision
Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
Add a proper license header.

Support LabGraph Monitor using a free online service

πŸš€ Feature

Create instructions for hosting the LabGraph Monitor service using a free online service, Reference.

This would help reduce local compute usage for users and simplify the use of LabGraph Monitor.

Additional context

  1. Existing application can be found [here](https://github.com/facebookresearch/labgraph/tree/main/extensions/prototypes/labgraph_monitor)
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_monitor
    Note that the application would be moved out of the prototypes folder.
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml

Determine Source Node in Subscriber?

❓ Questions and Help

I have a publisher.

class NoiseGenerator(lg.Node):
    OUTPUT = lg.Topic(RandomMessage)
    config: NoiseGeneratorConfig

    @lg.publisher(OUTPUT)
    async def generate_noise(self) -> lg.AsyncPublisher:
        while True:
            yield self.OUTPUT, RandomMessage(
                timestamp=time.time(),
                data=np.random.rand(self.config.num_features)
            )
            await asyncio.sleep(1 / self.config.sample_rate)

I have a subscriber.

class Serializer(lg.Node):
  INPUT_1 = lg.Topic(RandomMessage)
  config: SerializerConfig
  state: DataState
  
  @lg.subscriber(INPUT_1)
  def add_message_1(self, message: RandomMessage) -> None:
      self.state.data_1 = {
          "timestamp": message.timestamp,
          "numpy": list(message.data),
      }

They are connected through the graph.

def connections(self) -> lg.Connections:
    return (
        ...
        (self.NOISE_GENERATOR.OUTPUT, self.SERIALIZER.INPUT_1),
        ...
    )

Question: is there a way to determine, within def add_message_1(), which publisher my subscriber INPUT_1 is subscribed to?

@lg.subscriber(INPUT_1)
def add_message_1(self, message: RandomMessage) -> None:
    name_of_the_publisher = self.INPUT_1.source_name  # should be "NOISE_GENERATOR/OUTPUT"
    self.state.data_1 = {
        "publisher": name_of_the_publisher,
        "timestamp": message.timestamp,
        "numpy": list(message.data),
    }
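I don't know of a built-in way to introspect the upstream publisher from inside a subscriber method. A common workaround is to have the publisher stamp its own identity into the message. A LabGraph-free sketch of the idea, where RandomMessage and the source value are illustrative stand-ins, not the actual API:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RandomMessage:
    # hypothetical stand-in for an lg.Message subclass with an extra
    # provenance field, set by the publishing node when it yields
    timestamp: float
    data: List[float]
    source: str = "NOISE_GENERATOR/OUTPUT"

def add_message_1(state: dict, message: RandomMessage) -> None:
    # the subscriber reads the publisher's identity off the message itself
    state["data_1"] = {
        "publisher": message.source,
        "timestamp": message.timestamp,
        "numpy": list(message.data),
    }
```

The cost is one extra field per message, but it works today and survives graph rewiring, since the source travels with the data.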

Update library dependency for LabGraph Monitor and LabGraph Viz

Description

Update library dependency as defined in the dependabot alerts. Reference link for LabGraph Monitor and LabGraph Viz.

Additional context

  1. Add relevant test plans for LabGraph Monitor and LabGraph Viz.
  2. Specify the relevant information: LabGraph Version (e.g., 2.0); OS (e.g., Linux/Mac/Windows)/ Python version
  3. Update/Add the relevant github action supports.

Support for macOS

πŸš€ Feature

Support for building on macOS would be highly appreciated.

Motivation

macOS is one of the primary dev platforms for many developers.

More information

As such, I gave it a try, and after a few issues with buck/java8, python setup.py build started to proceed, but I am now stuck on linking errors with gettext.

It seems other projects have faced this, and so far none of the remedies have worked.

Real-time hand pose visualization using mock hand pose estimation

πŸš€ Feature

Hand tracking enables the use of hands as an input method for AR/VR applications. When using hands as input modality, it delivers a new sense of presence, enhances social engagement, and delivers more natural interactions with fully tracked hands and articulated fingers.

MediaPipe can be used for real-time hand tracking. LabGraph can be used to record the data captured from a webcam. It could also be extended to other downstream applications.

This task is to create mock hand-pose data and visualize it in real time. It depends on the completion of a previous task: #90

Additional context

  1. MediaPipe can be found [here](https://google.github.io/mediapipe/solutions/hands)
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/webcam/
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add a proper license header.
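For the mock data, a generator only needs to match MediaPipe Hands' output shape: 21 landmarks, each an (x, y, z) point in normalized image coordinates. A hedged sketch with an arbitrary circular layout that sways over time, so a visualization has something to animate:

```python
import math
import random

NUM_LANDMARKS = 21  # MediaPipe Hands reports 21 landmarks per hand

def mock_hand_pose(t: float, jitter: float = 0.01):
    """Generate one frame of fake hand-pose data.

    Returns 21 (x, y, z) landmarks in [0, 1] image coordinates. The
    circular layout is arbitrary; only the shape matches MediaPipe output.
    """
    cx = 0.5 + 0.2 * math.sin(t)  # hand center sways left/right over time
    cy = 0.5
    landmarks = []
    for i in range(NUM_LANDMARKS):
        angle = 2 * math.pi * i / NUM_LANDMARKS
        x = cx + 0.1 * math.cos(angle) + random.uniform(-jitter, jitter)
        y = cy + 0.1 * math.sin(angle) + random.uniform(-jitter, jitter)
        z = random.uniform(-jitter, jitter)
        landmarks.append((x, y, z))
    return landmarks
```

A mock source node would publish one such frame per tick, letting the visualization side of the task be developed before the real webcam + MediaPipe pipeline from #90 lands.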

Create Code to Node function in LabGraph

πŸš€ Feature

Create a function to convert code to a LabGraph node.

The starting points would be filtering techniques: e.g. Kalman Filter and IIR filter.

Convert

import numpy as np

class IIRFilter:
    def __init__(self, b, a):
        self.b = b
        self.a = a
        self.z = np.zeros(max(len(b), len(a)))

    def filter(self, x):
        y = np.zeros_like(x)
        for n in range(len(x)):
            y[n] = self.b[0] * x[n]
            for m in range(1, len(self.b)):
                if n-m >= 0:
                    y[n] += self.b[m] * x[n-m]
            for m in range(1, len(self.a)):
                if n-m >= 0:
                    y[n] -= self.a[m] * y[n-m]
            self.z[0] = x[n]
            self.z[1:] = self.z[:-1]
        return y

to

class IIRFilterNode(lg.Node):
    INPUT = lg.Topic(InputMessage)
    OUTPUT = lg.Topic(OutputMessage)

    def setup(self):
        self.b = self.config.b
        self.a = self.config.a
        self.z = np.zeros(max(len(self.b), len(self.a)))

    @lg.subscriber(INPUT)
    @lg.publisher(OUTPUT)
    def filter(self, message: InputMessage):
        x = message.x
        y = np.zeros_like(x)
        for n in range(len(x)):
            y[n] = self.b[0] * x[n]
            for m in range(1, len(self.b)):
                if n-m >= 0:
                    y[n] += self.b[m] * x[n-m]
            for m in range(1, len(self.a)):
                if n-m >= 0:
                    y[n] -= self.a[m] * y[n-m]
            self.z[0] = x[n]
            self.z[1:] = self.z[:-1]
        yield self.OUTPUT, y

Additional context

  1. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/extensions/lgpilot
  2. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  3. Add GitHub Actions support, reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  4. Add proper license header.
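Independent of the LabGraph wrapper, the filter's core difference equation can be sanity-checked in isolation. The sketch below is plain Python (no NumPy needed for the check), and the coefficients describe a hypothetical first-order low-pass chosen only for illustration:

```python
def iir_filter(x, b, a):
    # Direct-form difference equation:
    #   y[n] = sum_m b[m] * x[n-m]  -  sum_{m>=1} a[m] * y[n-m]
    y = [0.0] * len(x)
    for n in range(len(x)):
        y[n] = b[0] * x[n]
        for m in range(1, len(b)):
            if n - m >= 0:
                y[n] += b[m] * x[n - m]
        for m in range(1, len(a)):
            if n - m >= 0:
                y[n] -= a[m] * y[n - m]
    return y

# Hypothetical first-order low-pass: y[n] = 0.5 * x[n] + 0.5 * y[n-1]
# An impulse input should decay geometrically: 0.5, 0.25, 0.125, ...
out = iir_filter([1.0, 0.0, 0.0, 0.0], b=[0.5], a=[1.0, -0.5])
```

A check like this makes a convenient unit test for whatever node the conversion function generates.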

Audio-based Mediapipe Buckification for Android Application

πŸš€ Feature

MediaPipe is an open-source framework for building pipelines that perform inference over arbitrary sensory data, such as video or audio. Using MediaPipe, such a perception pipeline can be built as a graph of modular components. It is currently built using Bazel.

Buck is a multi-language build system developed and used by Meta Platforms, Inc. It was designed for building small, reusable modules consisting of code and resources within a monorepo. It supports many programming languages, including C++, Swift, Shell, Java, Kotlin, Python, Lua, OCaml, Rust and Go. It can produce binary outputs for a variety of target platforms, including iOS, Android, .NET and Java VM runtimes. This project will focus on Buck2.

This project targets buckifying MediaPipe (i.e. changing MediaPipe's build system from Bazel to Buck), with a focus on supporting Audio Classification.
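For orientation, buckifying a target means translating its Bazel rule into the Buck equivalent. A hypothetical translation might look like the sketch below; the target name, sources, and attributes are illustrative, not taken from MediaPipe's actual BUILD files:

```python
# Bazel (original, illustrative):
# cc_library(
#     name = "audio_classifier",
#     srcs = ["audio_classifier.cc"],
# )

# Buck equivalent (illustrative) -- note the cxx_ prefix on the rule name:
cxx_library(
    name = "audio_classifier",
    srcs = ["audio_classifier.cc"],
    visibility = ["PUBLIC"],
)
```

The bulk of the work is usually in mapping Bazel-only features (workspace rules, external repositories, select() configurations) onto Buck2 idioms rather than in the rule renames themselves.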

Additional context

  1. Example application can be found here.
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/android/labgraph_audio
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support, reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add proper license header.

Using Black for Code Formatter?

πŸš€ Feature

I think it would be a good idea to use Black (https://github.com/psf/black) as a code formatter to set a consistent code style.

Motivation

I just think it would be a good idea to set a standard across the entire project.

I ran `black .` on the entire repository and got the output below. It looks like it would be especially useful for making the extensions more readable.
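If Black is adopted, the style can be pinned in the repository so every contributor gets identical output. A minimal configuration might look like this (the line length and target versions below are illustrative defaults, not a decided policy):

```toml
# pyproject.toml
[tool.black]
line-length = 88
target-version = ["py36", "py38"]
extend-exclude = '''
# Directories Black should skip (illustrative):
/(\.git|build|dist)/
'''
```

A CI step running `black --check .` would then keep the tree formatted going forward.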

Skipping .ipynb files as Jupyter dependencies are not installed.
You can fix this by running ``pip install black[jupyter]``
reformatted extensions/graphviz_support/graphviz_support/errors/errors.py
reformatted extensions/graphviz_support/graphviz_support/tests/demo_graph/amplifier.py
reformatted extensions/graphviz_support/graphviz_support/tests/demo_graph/attenuator.py
reformatted extensions/graphviz_support/graphviz_support/tests/demo_graph/noise_generator.py
reformatted extensions/graphviz_support/graphviz_support/graphviz_node/graphviz_node.py
reformatted extensions/graphviz_support/graphviz_support/tests/demo_graph/demo.py
reformatted buck_ext.py
reformatted extensions/graphviz_support/graphviz_support/tests/demo_graph/rolling_averager.py
reformatted extensions/graphviz_support/graphviz_support/generate_graphviz/generate_graphviz.py
reformatted extensions/graphviz_support/graphviz_support/tests/test_lg_graphviz_api.py
reformatted extensions/labgraph_viz/labgraph_viz/application/application.py
reformatted extensions/labgraph_viz/labgraph_viz/examples/application_example.py
reformatted extensions/labgraph_viz/labgraph_viz/examples/scatter_plot_example.py
reformatted extensions/labgraph_viz/labgraph_viz/examples/line_plot_example.py
reformatted extensions/labgraph_viz/labgraph_viz/examples/spatial_plot_example.py
reformatted extensions/labgraph_viz/labgraph_viz/examples/heat_map_example.py
reformatted extensions/labgraph_viz/labgraph_viz/plots/bar_plot.py
reformatted extensions/labgraph_viz/labgraph_viz/plots/heat_map.py
reformatted extensions/labgraph_viz/labgraph_viz/plots/color_map.py
reformatted extensions/psychopy_example/psychopy_example/__main__.py
reformatted extensions/yaml_support/labgraph_monitor/aliases/aliases.py
reformatted extensions/psychopy_example/setup.py
reformatted extensions/yaml_support/labgraph_monitor/labgraph_monitor.py
reformatted extensions/yaml_support/labgraph_monitor/lg_monitor_node/lg_monitor_message.py
reformatted extensions/yaml_support/labgraph_monitor/lg_monitor_node/lg_monitor_node.py
reformatted extensions/yaml_support/labgraph_monitor/server/lg_monitor_server.py
reformatted extensions/yaml_support/labgraph_yaml_parser/loader/base_loader.py
reformatted extensions/psychopy_example/psychopy_example/components.py
reformatted extensions/yaml_support/labgraph_yaml_parser/loader/errors/errors.py
reformatted extensions/yaml_support/labgraph_yaml_parser/model/base_model.py
reformatted extensions/yaml_support/labgraph_yaml_parser/loader/python_file_loader.py
reformatted extensions/yaml_support/labgraph_monitor/examples/labgraph_monitor_example.py
reformatted extensions/yaml_support/labgraph_yaml_parser/serializer/base_serializer.py
reformatted extensions/yaml_support/labgraph_yaml_parser/serializer/yaml_serializer.py
reformatted extensions/labgraph_viz/labgraph_viz/plots/line_plot.py
reformatted extensions/yaml_support/labgraph_yaml_parser/tests/tests_code/test_py_code.py
reformatted extensions/yaml_support/labgraph_yaml_parser/tests/tests_code/test_lg_unit_node.py
reformatted extensions/yaml_support/labgraph_monitor/generate_lg_monitor/generate_lg_monitor.py
reformatted extensions/yaml_support/setup.py
reformatted extensions/yaml_support/labgraph_yaml_parser/model/lg_unit_model.py
reformatted extensions/yaml_support/labgraph_monitor/tests/test_lg_monitor_api.py
reformatted labgraph/devices/protocols/lsl/lsl_message.py
reformatted labgraph/devices/protocols/lsl/lsl_poller_node.py
reformatted extensions/yaml_support/labgraph_yaml_parser/_parser/lg_units_parser.py
reformatted labgraph/devices/protocols/lsl/lsl_sender_node.py
reformatted extensions/yaml_support/labgraph_monitor/server/serializer_node.py
reformatted extensions/yaml_support/labgraph_yaml_parser/tests/test_lg_yaml_api.py
reformatted labgraph/examples/rate.py
reformatted labgraph/examples/zmq_source.py
reformatted labgraph/events/tests/test_event_generator_node.py
reformatted labgraph/devices/protocols/lsl/tests/test_lsl_node.py
reformatted labgraph/examples/simulation.py
reformatted labgraph/examples/simple_viz_zmq.py
reformatted labgraph/runners/tests/test_process_manager.py
reformatted setup.py
reformatted labgraph/runners/tests/test_exception.py
reformatted signal_processing/synthetic/generators/tests/test_sine_wave_generator.py
reformatted test.py
reformatted labgraph/runners/process_manager.py
reformatted signal_processing/utils/types/time_series.py

All done! ✨ 🍰 ✨
60 files reformatted, 238 files left unchanged.

LabGraph Monitor Improvement - Support graph attributes in YAML

Use a method similar to the one in a related PR to support the LabGraph topology in LabGraph Monitor. This will be a function that looks something like:
def generate_labgraph_monitor(graph: lg.Graph) -> None:
The graph topology itself consists of the following:

  • Methods (these are the nodes in the graph)
  • Publishers (these are directed edges leaving a method)
  • Subscribers (these are directed edges entering a method)
  • Nodes (these are logical groupings of methods which always share the same process – technically not part of the topology, but the grouping may be useful to call out)
  • Topics
    Topics determine what the directed edges are. Any two connected topics will back the same stream. Each stream will have one publisher and one or more subscribers; the edges are all from that publisher to each subscriber.

We can inspect these properties of a graph by looking at these private attributes:
__topics__: The topics
__streams__: The streams
__methods__: The methods (publishers, subscribers, and "transformers" which are just methods with both @publisher and @subscriber decorators)
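As a rough illustration of the target output, the topology gathered from those attributes could be serialized into YAML along these lines. This is a stdlib-only sketch; the dictionary shape, key names, and method/topic names are assumptions for illustration, not the actual LabGraph introspection API:

```python
def topology_to_yaml(methods):
    """Serialize a {method_name: {"publishes": [...], "subscribes": [...]}}
    mapping into a small YAML document. All key names are illustrative."""
    lines = ["methods:"]
    for name in sorted(methods):
        info = methods[name]
        lines.append(f"  {name}:")
        lines.append("    publishes: [" + ", ".join(info.get("publishes", [])) + "]")
        lines.append("    subscribes: [" + ", ".join(info.get("subscribes", [])) + "]")
    return "\n".join(lines)

# Example: one transformer method with a publisher edge and a subscriber edge.
doc = topology_to_yaml({"filter": {"publishes": ["OUTPUT"], "subscribes": ["INPUT"]}})
```

A real implementation would walk `__methods__` to populate the mapping and use `__streams__` to draw the publisher-to-subscriber edges.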

Additional context

  1. An existing application can be found at https://github.com/facebookresearch/labgraph/tree/main/extensions/graphviz_support
  2. The code should update/replace the following folder: https://github.com/facebookresearch/labgraph/tree/main/extensions/yaml_support
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support, reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add proper license header, example.

Support integration with SurveyGPT

πŸš€ Feature

Support integration of LabGraph Vision with SurveyGPT

Motivation

LabGraph Vision could provide context information for SurveyGPT through object detection, holistic detection, and similar capabilities (but not limited to these).

Additional context

  1. Example application can be found at https://github.com/facebookresearch/labgraph/tree/main/devices/webcam
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/android/labgraph_vision
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support, reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add proper license header.

Linking issues on Ubuntu 20 for cthulhu

πŸ› Bug

import cthulhubindings fails with error ImportError: cthulhubindings.cpython-36m-x86_64-linux-gnu.so: undefined symbol: shm_unlink

To Reproduce

Steps to reproduce the behavior:

  1. Installed labgraph on Ubuntu 20, with base install of Python 3.6 (from ppa:deadsnakes/ppa)
  2. python3.6 -m labgraph.examples.simple_viz
  3. ImportError: /usr/local/lib/python3.6/dist-packages/labgraph-1.0.0-py3.6-linux-x86_64.egg/cthulhubindings.cpython-36m-x86_64-linux-gnu.so: undefined symbol: shm_unlink

The test suite also fails (all errors) because `import cthulhubindings` fails.

Expected behavior

The example runs without an ImportError.

Environment

 - LabGraph Version (e.g., 1.0): 1.0.0
 - OS (e.g., Linux/Mac/Windows): Ubuntu 20.04.2 LTS
 - Python version: 3.6.14 from `ppa:deadsnakes/ppa`
 - Any other relevant information: N/A

Additional context

It looks like the issue is in linking the relevant realtime libraries. I see `-lrt` in `exported_linker_flags` in the Buck config, but it doesn't look like it is being picked up. Is there a way to increase the verbosity of the Buck output so I can see the flags it passes to g++? I looked through the Buck logs but nothing jumped out.
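One quick way to narrow this down is to check where `shm_unlink` resolves on the affected system. On glibc 2.34+ the POSIX shared-memory functions moved from librt into libc proper, while older glibc (as on many CentOS and Ubuntu 20.04 installs) still needs `-lrt` at link time. This is a diagnostic sketch, not a fix:

```python
import ctypes

# Load the already-running process image (dlopen(NULL)) and probe for the
# symbol. If it resolves here, shm_unlink is reachable without extra link
# flags; if not, the extension must be linked against librt explicitly.
process = ctypes.CDLL(None)
resolves_in_process = hasattr(process, "shm_unlink")
print("shm_unlink resolvable without -lrt:", resolves_in_process)
```

If this prints `False`, the missing `-lrt` explains the undefined-symbol error when the interpreter loads `cthulhubindings`.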

Network Sockets Support in LabGraph

πŸš€ Feature

Sensor and device integration is an important part of accelerating researchers' development. Adding network socket support will allow new devices to be usable in LabGraph automatically. "Network socket" here refers to the Python socket module: https://docs.python.org/3/library/socket.html. A stretch goal is to also support Boost.Asio: https://www.boost.org/doc/libs/1_76_0/doc/html/boost_asio.html
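As a minimal sketch of the stdlib plumbing such an integration would wrap (the payload format below is purely illustrative), a connected socket pair can stand in for a device-side sender and a LabGraph-side poller without binding a real port:

```python
import socket

# socket.socketpair() returns two already-connected sockets, so the
# sender/receiver roundtrip can be exercised without a listening server.
sender, receiver = socket.socketpair()

sender.sendall(b"sample:1.23")   # a device node would write raw samples here
data = receiver.recv(1024)       # a poller node would decode this into a message

sender.close()
receiver.close()
```

A real poller node would wrap the `recv` loop in a LabGraph publisher method and translate each payload into a typed message, analogous to the existing zmq_node example.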

Additional context

  1. Create relevant files using the example from https://github.com/facebookresearch/labgraph/tree/main/labgraph/zmq_node
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/protocols/
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support, reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add proper license header, example.
