
pytorch-segmentation's Introduction

pytorch-segmentation

Training of semantic segmentation networks with PyTorch

pytorch-segmentation's People

Contributors

dusty-nv


pytorch-segmentation's Issues

Quick tutorial for custom datasets?

Hi @dusty-nv ,

Thanks for all the work you do on the inference repo. I realized you are using this repo to provide the segmentation models.

I was wondering if there is a quick and simple tutorial somewhere that explains the best way to use this repo to train on custom datasets and then use the results with the jetson-inference libs.

Thanks!

PyTorch Version

Hi There,

I'm having some issues trying to use this to train a custom image segmentation model. I see the custom fork of torchvision v0.3.0, but I'm just wondering which version of torch this was written for.

Thanks

Custom training of segmentation model using jetson-inference

I am having an issue converting a custom segmentation network, trained with PyTorch on a ResNet-101 backbone, to ONNX format using the onnx_export.py utility. The model constructor does not accept export_onnx=True in models.segmentation.__dict__[arch](num_classes=num_classes, aux_loss=None, pretrained=False, export_onnx=True). Is it OK to remove this parameter? Also, how do I load the custom-trained ONNX file for inference? @dusty-nv

Training SegNet Deep Scene on personal dataset

Hello,
Thank you for the work you provided. I am now trying to use the network fcn-resnet18-deepscene-576x320 in my research. I have run into trouble because the environment I am testing in has more shadows, so I need to fine-tune the model with a dataset that I gathered myself.
It may sound silly, but how can I retrain your model (or rather, how can I fine-tune it with an extra dataset)? I mainly use Python 3.6.
I appreciate your help.

Best Regards,
erKarim

Dataset folder format for PASCAL VOC dataset

Hi,
I have downloaded the VOCtrainval_11-May-2012 dataset.
Is there any pre-processing needed for this data to run the code available in this repo?
What folder format should be provided?
Is the available code only for training, or can it also do inference?
I am a C++ guy and not familiar with Python coding.
Kindly do help.

Thank You.
