Comments (7)

ahundt commented on June 29, 2024

keras-contrib is where new functionality now goes for Keras until it is ready for prime time:
https://github.com/fchollet/keras/blob/master/CONTRIBUTING.md#pull-requests

I've kept the replies numbered below so we can refer back to them. The best version of the DenseNetFCN model code is on the densenet-atrous branch of ahundt/keras-contrib, and in Keras-FCN.

The most pressing item is (4), since I've got evidence it works in Keras-FCN with ResNets, but it is not DenseNetFCN specific.
The second most pressing, and the one specific to the Tiramisu DenseNetFCN network, is probably (6a) + (1), which are both easy steps.

  1. Nadam seems to be a concrete improvement over Adam; I might try it out if there are better hyperparameters, as they mention in tensorflow/tensorflow#9175. You're definitely right that it doesn't solve all the world's problems. :-)
  2. That's a lot of epochs! How could they make progress for that long?
  3. That mistake has been accounted for in the linked DenseNetFCN.
  4. Pretrained DenseNet ImageNet weights + atrous convolutions look like they might be a solid non-Tiramisu approach, since that worked for ResNet-50. I mention this in keras-team/keras-contrib#63 and have an implementation in https://github.com/aurora95/Keras-FCN/blob/master/models.py#L235; it would require transferring the original DenseNet ImageNet weights or training from scratch.
  5. A larger/better dataset always helps; I've been working on that with COCO in Keras-FCN, and I think tweaking that to work could make a huge difference. (5a) One peculiarity that still needs to be resolved is that a single pixel can be in multiple classes; I was thinking of changing the output to be single class, but adding an option for one-hot encoding so categorical cross-entropy will give more credit for a match in any category. This involves reasonably small changes in Keras-FCN. (5b) I was also thinking of loading the segmentation masks directly from pycocotools rather than from files, like this loop but without numpy.save (see the sketch after this list).
  6. Other datasets might also be good options, and they are easily integrated via what I've already implemented for other datasets, as per the Keras-FCN dataset instructions:
  7. Which page/column of the paper has the ROI thing you mention?
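
A minimal sketch of what (5a) and (5b) could look like together, assuming pycocotools is installed and an instances_*.json annotation file is on disk; the annotation path and num_classes below are placeholders, not values from Keras-FCN:

```python
from pycocotools.coco import COCO
import numpy as np
from keras.utils import to_categorical

# Hypothetical annotation path, for illustration only.
coco = COCO('annotations/instances_train2014.json')

def single_class_mask(coco, img_id):
    """Build a single-channel label mask (one class id per pixel) for one image.

    Overlapping annotations are resolved by letting later annotations overwrite
    earlier ones, so every pixel ends up in exactly one class (the 5a idea).
    """
    info = coco.loadImgs(img_id)[0]
    mask = np.zeros((info['height'], info['width']), dtype=np.uint8)
    for ann in coco.loadAnns(coco.getAnnIds(imgIds=img_id, iscrowd=None)):
        binary = coco.annToMask(ann)             # HxW 0/1 mask for this annotation
        mask[binary == 1] = ann['category_id']   # last writer wins per pixel
    return mask

# Load masks straight from pycocotools instead of saving .npy files (the 5b idea),
# then optionally one-hot encode so categorical cross-entropy credits a match in
# any category; 91 covers the raw COCO category ids (1-90) plus background 0.
img_ids = coco.getImgIds()
labels = single_class_mask(coco, img_ids[0])
one_hot = to_categorical(labels, num_classes=91)
```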

Fahim-F commented on June 29, 2024

Hello,
Excuse me, I want to know about the file "fc-densenet-model.py": does it work or not?

ahundt commented on June 29, 2024

I know this one does: https://github.com/farizrahman4u/keras-contrib/blob/master/keras_contrib/applications/densenet.py
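
For reference, a minimal usage sketch of the DenseNetFCN builder in that file; the keyword arguments here are assumptions taken from the paper's Tiramisu configuration and may differ between branches:

```python
from keras_contrib.applications.densenet import DenseNetFCN

# FC-DenseNet / Tiramisu-style segmentation model; argument names assumed
# from the densenet.py linked above and may vary across branches.
model = DenseNetFCN(
    input_shape=(224, 224, 3),   # height, width, channels (channels_last)
    nb_dense_block=5,            # dense blocks on each side of the bottleneck
    growth_rate=16,              # feature maps added per layer (the "m" parameter)
    nb_layers_per_block=4,       # layers inside each dense block
    classes=12,                  # e.g. CamVid's 12 classes
)
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.summary()
```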

0bserver07 commented on June 29, 2024
  1. Hey! Yes, I would be happy to collaborate, and thanks for bringing these to my attention. I'm not quite familiar with this repo; by upstream do you mean that it will be merged into Keras sometime soon?

  2. And yes, it seems we are both going down the same path of DenseNets plus or minus something, while facing the same issues.

I will look into this and your comments; let me know if there is one specific item that's most pressing that I could also look at for you.

See, I realized that in terms of the model, my previous SegNet implementation with some added complexity can do wonders.

In the meantime, however, I'm implementing Mask R-CNN, and I want to replicate the results this time around 😄 (hopefully).

I'm running into two problems: 1. I can't fit the model into memory 😄 (I think suddenly switching the backend to TF is causing some trouble), and 2. I don't know how to create the custom RoIAlign layer from the paper, which aligns the two tasks with a pixel-wise loss.
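
For the RoIAlign part, one common approximation with a TensorFlow backend is tf.image.crop_and_resize, which covers the bilinear-sampling half of RoIAlign but not the per-bin sampling-point averaging from the paper. A minimal sketch, assuming normalized box coordinates:

```python
import tensorflow as tf

def roi_align_approx(feature_maps, boxes, box_image_indices, crop_size=(7, 7)):
    """Rough RoIAlign stand-in built on bilinear crop-and-resize.

    feature_maps:       [batch, H, W, C] float feature tensor
    boxes:              [num_boxes, 4] as (y1, x1, y2, x2), normalized to [0, 1]
    box_image_indices:  [num_boxes] int32, which batch image each box belongs to
    crop_size:          spatial size of each pooled region
    """
    return tf.image.crop_and_resize(
        feature_maps, boxes, box_image_indices, crop_size, method='bilinear')
```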

All in all, if that works out :), I can guarantee better results on the same tasks.

PS: I just looked at keras-team/keras-contrib#63.

  1. I tried some crazy hyperparameters and some of them did interesting things:
  • SGD with cyclical weight decay: it goes down, down, down, and then back up (see the sketch after this list).
  • Adding more augmentation to the training set and reducing regularization.
  • I tried Adam; anything other than RMSProp was better.
  2. I realized that the authors ran one of their models for 750 epochs! 0.o I mean, I ran one for 350 epochs, but the gradient just turns into an empty tuna can, nothing but smells!
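
For the cyclical-decay bullet above, a minimal sketch of a triangular schedule as a Keras callback; the comment describes cycling the weight decay, and the same triangular pattern is shown here on the learning rate, since that is the value Keras optimizers expose most directly (the period and bounds are made-up placeholders):

```python
from keras.callbacks import LearningRateScheduler

def triangular_schedule(base_lr=1e-4, max_lr=1e-2, period=20):
    """Value starts at max_lr, falls to base_lr at mid-cycle, then climbs back up."""
    def schedule(epoch):
        # phase is 1.0 at the start/end of a cycle (peak) and 0.0 at mid-cycle (trough).
        phase = abs((epoch % period) / (period / 2.0) - 1.0)
        return base_lr + (max_lr - base_lr) * phase
    return schedule

# Hypothetical usage; pass alongside the other callbacks.
cyclical_lr = LearningRateScheduler(triangular_schedule())
# model.fit(x_train, y_train, epochs=350, callbacks=[cyclical_lr])
```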

Oh, and also: the paper has a tiny mistake in the diagram; keep that in mind when calculating the parameter m, which is the growth rate.

ahundt commented on June 29, 2024
Also worth noting: https://github.com/nicolov/segmentation_keras
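
Related to the atrous/dilated idea in (4) above: in Keras 2 a dilated convolution is just Conv2D with a dilation_rate, so no custom layer is needed. A minimal sketch with placeholder sizes:

```python
from keras.layers import Input, Conv2D
from keras.models import Model

inputs = Input(shape=(224, 224, 3))
# Dilated (atrous) 3x3 convolution: same kernel size and parameter count as a
# regular conv, but a wider receptive field; strides must stay at (1, 1).
x = Conv2D(64, (3, 3), dilation_rate=(2, 2), padding='same', activation='relu')(inputs)
model = Model(inputs, x)
model.summary()
```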

0bserver07 commented on June 29, 2024

A lot of good stuff here, I will get to it tonight :)

Fahim-F commented on June 29, 2024

Thanks :)
