
Comments (11)

randomstring commented on June 12, 2024

Interesting discussion here on Chief Delphi about the tradeoffs of tunable pipeline parameters for FPS, accuracy, and maximum sensing distance.

https://www.chiefdelphi.com/t/limelight-2023-1-megatag-performance-boost-my-mistake/423943

In particular, the Crop feature, which limits the area of the image searched for AprilTags, was very good at increasing the FPS (and also potentially eliminating error). We need to see if PhotonVision has this feature. We should also investigate whether multiple pipelines with different crop levels work better for different distances.

For example, we could have one pipeline for when we are in the community (close to the AprilTags) that crops to the part of the image where the tags will appear and is optimized for closer tags, and a second pipeline optimized for longer range when we are outside the community; a sketch of this switching logic follows.
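
A minimal sketch of the pipeline-switching idea. PhotonLib does expose setPipelineIndex, but the pipeline indices and the 2-meter threshold here are placeholders, not values from this thread:

```java
import org.photonvision.PhotonCamera;

public class PipelineSwitcher {
    // Hypothetical pipeline indices configured in the PhotonVision UI:
    // 0 = cropped, tuned for close-range tags; 1 = full-frame, tuned for long range.
    private static final int CLOSE_PIPELINE = 0;
    private static final int FAR_PIPELINE = 1;

    /** Pick a pipeline based on a rough estimate of distance to the nearest tag. */
    public static void selectPipeline(PhotonCamera camera, double distanceToTagMeters) {
        // The 2.0 m cutoff is a placeholder; tune it empirically on the field.
        camera.setPipelineIndex(distanceToTagMeters < 2.0 ? CLOSE_PIPELINE : FAR_PIPELINE);
    }
}
```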

from 2023-robot-code.

randomstring commented on June 12, 2024

If I'm reading the documents correctly, the LL computes a camera (or robot) pose on the LL itself, while PhotonVision only returns the pose of each detected AprilTag, so the estimated robot pose needs to be calculated on the roboRIO. The LL's MegaTag feature, which combines multiple tags into one pose estimate, looks like a big win for getting a more accurate pose.
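
A minimal sketch of the roboRIO-side pose math using PhotonLib's PhotonPoseEstimator (present in the 2023 PhotonLib releases); the camera name and robot-to-camera transform below are assumptions, not values from this thread:

```java
import java.io.IOException;
import java.util.Optional;

import edu.wpi.first.apriltag.AprilTagFieldLayout;
import edu.wpi.first.apriltag.AprilTagFields;
import edu.wpi.first.math.geometry.Rotation3d;
import edu.wpi.first.math.geometry.Transform3d;
import edu.wpi.first.math.geometry.Translation3d;
import org.photonvision.EstimatedRobotPose;
import org.photonvision.PhotonCamera;
import org.photonvision.PhotonPoseEstimator;
import org.photonvision.PhotonPoseEstimator.PoseStrategy;

public class VisionPose {
    private final PhotonCamera camera = new PhotonCamera("frontCamera"); // hypothetical name
    private final PhotonPoseEstimator estimator;

    public VisionPose() throws IOException {
        AprilTagFieldLayout layout = AprilTagFieldLayout.loadFromResource(
                AprilTagFields.k2023ChargedUp.m_resourceFile);
        // Placeholder robot-to-camera transform; measure this on the real robot.
        Transform3d robotToCam = new Transform3d(
                new Translation3d(0.3, 0.0, 0.5),
                new Rotation3d(0.0, Math.toRadians(10.0), 0.0));
        estimator = new PhotonPoseEstimator(
                layout, PoseStrategy.MULTI_TAG_PNP, camera, robotToCam);
    }

    /** Call periodically; feed any result into the drivetrain's pose estimator. */
    public Optional<EstimatedRobotPose> update() {
        return estimator.update();
    }
}
```

With the multi-tag strategy, the estimator combines all visible tags into one pose when several are in view, which approximates what MegaTag does on the Limelight itself.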

from 2023-robot-code.

randomstring commented on June 12, 2024

Interesting documentation from LL on optimizing parameters for AprilTag detection accuracy, speed, and range.

https://docs.limelightvision.io/en/latest/apriltags_in_2d.html#tips

[Edit to add AprilTag tips]

Tips from the PhotonVision devs:

  • Low exposure is good: this reduces motion blur.
  • Brightness will depend on the environment.
  • Autoexposure OFF.
  • Angle the camera 10-15 degrees up or down (NOT straight on).

Tips from Limelight:

For ideal tracking, consider the following:

  • Your tags should be as flat as possible.
  • Your Limelight should be mounted above or below tag height and angled up/down. Your target should look as trapezoidal as possible from your camera’s perspective. You don’t want your camera to ever be completely “head-on” with a tag if you want to avoid tag flipping.

There is an interplay between the following variables for AprilTag Tracking:

  • Increasing capture resolution will always increase 3D accuracy and 3D stability. This will also reduce the rate of ambiguity flipping from most perspectives. It will usually increase range. This will reduce pipeline framerate.
  • Increasing detector downscale will always increase pipeline framerate. It will decrease effective range, but in some cases this may be negligible. It will not affect 3D accuracy, 3D stability, or decoding accuracy.
  • Reducing exposure will always improve motion-blur resilience. This is actually really easy to observe. This may reduce range.
  • Reducing the brightness and contrast of the image will generally improve pipeline framerate and reduce range.
  • Increasing sensor gain allows you to increase brightness without increasing exposure. It may reduce 3D stability, and it may reduce tracking stability.

from 2023-robot-code.

LuminousLlama commented on June 12, 2024

Just popped into my head: there is a problem where using two of the exact same camera leads to PhotonVision not working. If I remember correctly, there is a workaround, but this is a critical thing we need to explore, as it interferes with our vision solution.

from 2023-robot-code.

randomstring commented on June 12, 2024

If we run two cameras on a single Orange Pi, we might run into the problem with duplicate camera names. We have the option to either run two Orange Pis with one camera each, or some combination of Limelight hardware and an Orange Pi.
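
For reference, PhotonLib addresses each camera by name, which is why two identical cameras that enumerate identically are a problem. A sketch assuming the hypothetical names "leftCamera" and "rightCamera" have been assigned in the PhotonVision UI:

```java
import org.photonvision.PhotonCamera;

public class Cameras {
    // Each PhotonCamera is looked up by the unique name assigned in PhotonVision,
    // so two physically identical cameras must present distinct identities.
    public static final PhotonCamera LEFT = new PhotonCamera("leftCamera");   // hypothetical name
    public static final PhotonCamera RIGHT = new PhotonCamera("rightCamera"); // hypothetical name
}
```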

from 2023-robot-code.

randomstring commented on June 12, 2024

@davidemassarenti-optio3 and @dylanh12210 solved the duplicate-camera problem by using a software tool to update the USB cameras' serial numbers. I think they changed the serial numbers to "left camera" and "right camera". Can either of you post a link to the software you used so we have a record for future use?

from 2023-robot-code.

davidemassarenti-optio3 commented on June 12, 2024

I think it's this:

https://docs.arducam.com/UVC-Camera/Serial-Number-Tool-Guide/

from 2023-robot-code.

randomstring commented on June 12, 2024

PhotonVision version 2023.3.0: https://github.com/PhotonVision/photonvision/releases/tag/v2023.3.0

from 2023-robot-code.

randomstring commented on June 12, 2024

Based on my research I think we want the camera mounted

  • 22.8" off the ground (this is exactly between the centers of the low and high AprilTags),
  • angled down 10 degrees (important to avoid ambiguity, and down to reduce glare),
  • angled out 30 degrees (this allows us to see tags nearly directly in front of us with a 70deg FOV).

If that's not possible, then mounted low, about 10" from the ground, angled up 10 degrees and angled out 30 degrees. A sketch of expressing this mounting geometry as a robot-to-camera transform follows.
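
A sketch of how the preferred mount could be written as a WPILib robot-to-camera transform. Only the 22.8" height and the 10-degree/30-degree angles come from this comment; the forward/side offsets are made up, and the pitch/yaw signs depend on which side the camera is mounted and should be verified on the robot:

```java
import edu.wpi.first.math.geometry.Rotation3d;
import edu.wpi.first.math.geometry.Transform3d;
import edu.wpi.first.math.geometry.Translation3d;
import edu.wpi.first.math.util.Units;

public class CameraMount {
    public static final Transform3d ROBOT_TO_CAM = new Transform3d(
            new Translation3d(
                    Units.inchesToMeters(12.0),  // forward of robot center (assumed)
                    Units.inchesToMeters(-10.0), // right of robot center (assumed)
                    Units.inchesToMeters(22.8)), // camera height from this comment
            new Rotation3d(
                    0.0,                             // roll
                    Units.degreesToRadians(10.0),    // pitch: tilted down 10 degrees (verify sign)
                    Units.degreesToRadians(-30.0))); // yaw: angled out 30 degrees (verify sign/side)
}
```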

from 2023-robot-code.

davidemassarenti-optio3 commented on June 12, 2024

Glare is a function of the relative position of the camera and the tag. The up/down angle of the camera shouldn't affect reflections.

The angle of the camera could affect the ambiguity, although I wouldn't be too worried about that. We are taking multiple samples, and a sample that teleports us to the other side of the world would be easy to drop.

I think angling the cameras out would be enough. When we care most about tags, in front of the pickup or drop areas, the tags will not be parallel to the cameras anyway. A sketch of that kind of outlier filter follows.
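
A hypothetical version of the "easy to drop" filter described above; the 1-meter threshold and the method name are assumptions, not from the thread:

```java
import edu.wpi.first.math.geometry.Pose2d;

public class VisionFilter {
    /** Reject vision samples that would "teleport" the robot far from its current estimate. */
    public static boolean acceptVisionSample(Pose2d visionPose, Pose2d currentEstimate) {
        double jumpMeters = visionPose.getTranslation()
                .getDistance(currentEstimate.getTranslation());
        return jumpMeters < 1.0; // placeholder threshold; tune against real odometry drift
    }
}
```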

from 2023-robot-code.

randomstring commented on June 12, 2024

Orange Pi vision setup: https://docs.google.com/document/d/17DNCNHxUo31Rh-7VmXXyn-Y25UtGND3NPoGL9gRosaQ/edit

from 2023-robot-code.
