
edge-tpu-silva's Introduction

The power of Coral Edge TPU and Ultralytics all in one place: edge-tpu-silva.

edge-tpu-silva is a Python package that simplifies installation of the Coral Edge TPU USB dependencies and ensures compatibility with PyCoral and Ultralytics. It enables object detection, segmentation, and classification on a range of edge devices at higher FPS (real-time inference speed).

Coral USB Accelerator Exclusivity:

The edge-tpu-silva library is purpose-built for seamless integration with the Coral USB Accelerator. This powerful hardware accelerator is not just a requirement but a strategic choice, unlocking the library's full potential for superior object detection, segmentation and classification.

Discover the Coral USB Accelerator and experience a tailored approach to edge computing with the edge-tpu-silva library.

The edge-tpu-silva package is only compatible with Python versions below 3.10. Install a supported Python version if yours is not compatible.

Example: for Raspberry Pi, click the link for instructions on installing a specific Python version using pyenv.

Note: Python 3.6 to 3.9 is recommended; the link above explains how to install a specific Python version.
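Before installing, you can check whether your interpreter falls in the supported range. This is a minimal sketch; the `supported` helper is hypothetical and not part of the package:

```python
import sys

def supported(version=None):
    """Return True if the (major, minor) tuple lies in edge-tpu-silva's
    supported range of Python 3.6 through 3.9."""
    if version is None:
        version = sys.version_info[:2]
    return (3, 6) <= version <= (3, 9)

if not supported():
    print("Unsupported Python %d.%d; install 3.6-3.9 (e.g. with pyenv)."
          % sys.version_info[:2])
```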

Note: Make sure your Raspberry Pi is up to date. To do so, run the commands below in a terminal.

sudo apt-get update
sudo apt-get upgrade

Run the bash code below in your terminal to create and activate a new virtual environment named .venv. Ensure you are in the specific directory you want this environment to be installed.

python3 -m venv .venv
source .venv/bin/activate

Step 1: Install edge-tpu-silva

To install edge-tpu-silva, run the following pip command in the Python environment you activated above:

pip install edge-tpu-silva

Step 2: Run Setup Command

This table provides an overview of the compatibility of the system with different devices and operating systems.

Compatibility     Setup Command
Raspberry Pi 5    silvatpu-linux-setup
Raspberry Pi 4    silvatpu-linux-setup
Raspberry Pi 3    silvatpu-linux-setup
Jetson Nano       silvatpu-linux-setup
Windows           Not supported
macOS             Not supported

To configure the setup tools for your system, run the setup command in the terminal after Step 1 is complete.

Example: if you are on a Raspberry Pi 5, run the command below following Step 1.

silvatpu-linux-setup

The command installs the standard Edge TPU runtime for Linux, which runs the device at a reduced clock frequency. Alternatively, you can install a runtime that runs the device at maximum speed, but be cautious of increased power consumption and device heat. If unsure, stick with the reduced frequency for safety. To install the maximum-frequency runtime, pass the speed option to the setup command:

silvatpu-linux-setup --speed max

You cannot have both versions of the runtime installed at the same time, but you can switch by simply installing the alternate runtime as shown above.

Caution: Using the USB Accelerator at maximum clock frequency can make it dangerously hot. To prevent burn injuries, keep it out of reach or operate it at a reduced clock frequency.

Note: Connect the Coral USB Accelerator through a USB 3.0 port for faster speed. If the accelerator was already plugged in during installation and setup, disconnect and reconnect it to ensure proper configuration.

To unleash the power of object detection, segmentation, and classification with this library, you'll need an Edge TPU-compatible .tflite model. These models should be exported using Ultralytics, ensuring a seamless integration with the edge-tpu-silva library.

NOTE: The imgsz value specified during YOLO export must match the imgsz value passed to any of the process functions. Consistency between these settings is crucial for optimal performance.
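Since the downloadable models encode their export size in a filename prefix (e.g. 192_yolov8n_full_integer_quant_edgetpu.tflite, as seen in the tracebacks further down this page), a small sanity check can catch an imgsz mismatch before inference. The `check_imgsz` helper below is hypothetical, not part of edge-tpu-silva:

```python
import os
import re

def check_imgsz(model_path, imgsz):
    """Return True if the model filename's leading size prefix
    (e.g. '192_' in '192_yolov8n_full_integer_quant_edgetpu.tflite')
    matches the imgsz argument; True also when there is no prefix."""
    name = os.path.basename(model_path)
    m = re.match(r"(\d+)_", name)
    if m is None:
        return True  # no size prefix to compare against
    return int(m.group(1)) == imgsz

# A mismatch here would mean the export imgsz and inference imgsz disagree.
print(check_imgsz("192_yolov8n_full_integer_quant_edgetpu.tflite", 192))
```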

Smaller models will run faster but may have lower accuracy, while larger models will run slower but typically have higher accuracy. Explore the capabilities of edge computing with below models using edge-tpu-silva library.

Download Link     Process         Base Model        imgsz   Object Classes
Download Model    Detection       yolov8n.pt        240     COCO128
Download Model    Segmentation    yolov8n-seg.pt    240     COCO128
Download Model    Detection       yolov8n.pt        192     COCO128
Download Model    Segmentation    yolov8n-seg.pt    192     COCO128
Download Model    Classification  yolov8n-cls.pt    640     ImageNet
Download Model    Detection       yolov9c.pt        240     COCO128

NOTE: The YOLOv9 model, particularly the YOLOv9c.pt version, is substantial in size, which leads to its TensorFlow Lite version also being quite large. As a result, its processing speed on an Edge TPU is comparatively slower.

Usage

To perform object detection using the process_detection function, you can follow this example:

from edge_tpu_silva import process_detection

# Run the object detection process
outs = process_detection(
    model_path='path/to/your/model.tflite',
    input_path='path/to/your/input/video.mp4',
    imgsz=192,
)

# process_detection is a generator; iterate to run it frame by frame.
for _, _ in outs:
    pass

Running process_detection in the Terminal: Using the Entry Point "silvatpu"

To perform object detection with the process_detection function from the command line, you can use the user-friendly entry point silvatpu. Here's an example command:

silvatpu -p det -m path/to/model.tflite -i path/to/input/video.mp4 -z 192 -t 0.5 -v True
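Reading the example command against the parameter table further down, the short CLI flags appear to map onto the Python keyword arguments as follows. This mapping is inferred from the examples on this page, not taken from official documentation:

```python
# Inferred mapping from silvatpu CLI flags to process_* keyword arguments;
# treat it as an assumption based on the example commands above.
FLAG_TO_PARAM = {
    "-p": "process",     # det | seg (selects detection or segmentation)
    "-m": "model_path",
    "-i": "input_path",
    "-z": "imgsz",
    "-t": "threshold",
    "-v": "verbose",
}

def to_kwargs(argv):
    """Fold a flat ['-m', 'model.tflite', ...] flag/value list into a dict."""
    pairs = zip(argv[::2], argv[1::2])
    return {FLAG_TO_PARAM[flag]: value for flag, value in pairs}

print(to_kwargs(["-m", "path/to/model.tflite", "-z", "192"]))
```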

To perform object segmentation using the process_segmentation function, you can follow this example:

from edge_tpu_silva import process_segmentation

# Run the object segmentation process
outs = process_segmentation(
    model_path='path/to/your/model.tflite',
    input_path='path/to/your/input/video.mp4',
    imgsz=192,
)

# process_segmentation is a generator; iterate to run it frame by frame.
for _, _ in outs:
    pass

Running process_segmentation in the Terminal: Using the Entry Point "silvatpu"

To perform object segmentation with the process_segmentation function from the command line, you can use the user-friendly entry point silvatpu. Here's an example command:

silvatpu -p seg -m path/to/model.tflite -i path/to/input/video.mp4 -z 192 -t 0.5 -v True

Process detection, segmentation and classification Function Input Parameters

Parameter    Description                                                           Default Value
model_path   Path to the detection/segmentation/classification model (.tflite).    -
input_path   File path of the image/video to process, or a camera index (0|1|2).   -
imgsz        Image size used for inference.                                        -
threshold    Confidence threshold for detected objects.                            0.4
verbose      Print results to the terminal.                                        True
show         Display the processed frame in a window.                              False
classes      Filter predictions to a set of class IDs.                             None
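For reference, a call that spells out every parameter from the table might look like the following sketch. The paths are placeholders, and threshold, verbose, show, and classes are set to their documented defaults:

```python
# Keyword arguments mirroring the parameter table above;
# threshold/verbose/show/classes carry the documented default values.
kwargs = dict(
    model_path="path/to/your/model.tflite",
    input_path=0,        # a camera index (0|1|2) also works, per the table
    imgsz=192,
    threshold=0.4,
    verbose=True,
    show=False,
    classes=None,
)

# With the package installed, this would be passed straight through:
# from edge_tpu_silva import process_detection
# outs = process_detection(**kwargs)
```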

Process detection, segmentation and classification Function Output

Each process function yields the following output:

Output Parameter Description
objs_lst List of objects detected in frame.
fps Frames per second (fps) of the processed frame.

Example usage:

from edge_tpu_silva import process_detection

# Run the object detection process
outs = process_detection(
    model_path='path/to/your/model.tflite',
    input_path='path/to/your/input/video.mp4',
    imgsz=192,
)

for objs_lst, fps in outs:
    # Access the output parameters as needed
    print(f"Processed frame with {len(objs_lst)} objects. FPS: {fps}")
    print("List of object predictions in frame:")
    print(objs_lst)

Contributions are welcome! If you find any issues or have suggestions for improvements, please open an issue or submit a pull request.

Python Package Index Maintainer(s) (c) [2024] David Nyarko

edge-tpu-silva's People

Contributors

davidnyarko123


edge-tpu-silva's Issues

Raspberry Pi (Bullseye)

Trying to set up edge-tpu-silva on my Bullseye OS, I get this error message:

(screenshot of the error)

It works perfectly fine on Bookworm, but I want to use Bullseye instead of Bookworm.

ValueError: Failed to load delegate from libedgetpu.so.1

As happened to you in the video, I am facing the same issue:

Loading 192_yolov8n_full_integer_quant_edgetpu.tflite for TensorFlow Lite Edge TPU inference...
Traceback (most recent call last):
  File "/home/edgeai/edgeai/.venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 160, in load_delegate
    delegate = Delegate(library, options)
  File "/home/edgeai/edgeai/.venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 119, in __init__
    raise ValueError(capture.message)
ValueError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/edgeai/edgeai/example.py", line 6, in <module>
    for _, _ in outs:
  File "/home/edgeai/edgeai/.venv/lib/python3.9/site-packages/edge_tpu_silva/silva/silva_detect.py", line 38, in process_detection
    outs = model.predict(
  File "/home/edgeai/edgeai/.venv/lib/python3.9/site-packages/ultralytics/engine/model.py", line 445, in predict
    self.predictor.setup_model(model=self.model, verbose=is_cli)
  File "/home/edgeai/edgeai/.venv/lib/python3.9/site-packages/ultralytics/engine/predictor.py", line 297, in setup_model
    self.model = AutoBackend(
  File "/home/edgeai/edgeai/.venv/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/edgeai/edgeai/.venv/lib/python3.9/site-packages/ultralytics/nn/autobackend.py", line 337, in __init__
    interpreter = Interpreter(model_path=w, experimental_delegates=[load_delegate(delegate)])
  File "/home/edgeai/edgeai/.venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 162, in load_delegate
    raise ValueError('Failed to load delegate from {}\n{}'.format(
ValueError: Failed to load delegate from libedgetpu.so.1

The thing is, I couldn't fix it. Do you know how to solve it? I changed the threshold, and I used images of exactly 240x240 and 192x192 for the 240 and 192 models, but nothing works. Thank you in advance.

Does this run on the Coral M.2 interface?

Will this run on M.2 Edge TPU devices? I can run YOLOv5 on my Coral M.2 on a Pi 5. With edge-tpu-silva I can also run it and get the FPS, but objs_lst comes back empty, so I cannot draw bounding boxes.

CSI camera path

How do I pass the path of the CSI camera (which is connected to a Raspberry Pi 5) to your input_path variable? I am not able to do that; please help.
Also, could you give me insight into the segmentation fault that occurred in your video as well?
(screenshot)

Failed to load delegate from libedgetpu.so.1

(.venv) rpi5@raspberrypi:~/silva $ silvatpu -p det -m 240_yolov8n_full_integer_quant_edgetpu.tflite -i Working_Man_7.mp4 -z 192 -t 0.5
Loading 240_yolov8n_full_integer_quant_edgetpu.tflite for TensorFlow Lite Edge TPU inference...
Traceback (most recent call last):
  File "/home/rpi5/silva/.venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 160, in load_delegate
    delegate = Delegate(library, options)
  File "/home/rpi5/silva/.venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 119, in __init__
    raise ValueError(capture.message)
ValueError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/rpi5/silva/.venv/bin/silvatpu", line 8, in <module>
    sys.exit(silvatpu())
  File "/home/rpi5/silva/.venv/lib/python3.9/site-packages/edge_tpu_silva/main.py", line 97, in silvatpu
    for _, _ in outs:
  File "/home/rpi5/silva/.venv/lib/python3.9/site-packages/edge_tpu_silva/silva/silva_detect.py", line 38, in process_detection
    outs = model.predict(
  File "/home/rpi5/silva/.venv/lib/python3.9/site-packages/ultralytics/engine/model.py", line 446, in predict
    self.predictor.setup_model(model=self.model, verbose=is_cli)
  File "/home/rpi5/silva/.venv/lib/python3.9/site-packages/ultralytics/engine/predictor.py", line 297, in setup_model
    self.model = AutoBackend(
  File "/home/rpi5/silva/.venv/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/rpi5/silva/.venv/lib/python3.9/site-packages/ultralytics/nn/autobackend.py", line 337, in __init__
    interpreter = Interpreter(model_path=w, experimental_delegates=[load_delegate(delegate)])
  File "/home/rpi5/silva/.venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 162, in load_delegate
    raise ValueError('Failed to load delegate from {}\n{}'.format(
ValueError: Failed to load delegate from libedgetpu.so.1

(.venv) rpi5@raspberrypi:~/silva $
