
splitbrainauto's People

Contributors

richzhang

splitbrainauto's Issues

How to use the regression loss to train the split-brain network?

I am reimplementing Split-Brain on my own dataset. My network is as follows:
The ab->L path is (input: 84,84,2) -> (42,42,16) -> (21,21,32) -> (21,21,32) -> (11,11,64). The (11,11,64) feature is average-pooled across channels to get an (11,11,1) map. The loss is then the L2 loss between this pooled feature map (11,11,1) and the L channel of the original image downsampled from (84,84,1) to (11,11,1).
I do not know whether my reimplementation is correct. Please help me, thanks a lot!
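A minimal sketch of the regression-path setup described above, assuming PyTorch. The layer sizes, the mean-over-channels pooling, and the bilinear downsampling of the L target all come from the question itself, not from the author's reference implementation, so this only mirrors the asker's description.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # ab -> L regression path as described in the question (sizes are the asker's, not the paper's)
    class AbToL(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(2, 16, 3, stride=2, padding=1),  nn.ReLU(),   # (84,84,2)  -> (42,42,16)
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),   # (42,42,16) -> (21,21,32)
                nn.Conv2d(32, 32, 3, stride=1, padding=1), nn.ReLU(),   # (21,21,32) -> (21,21,32)
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),   # (21,21,32) -> (11,11,64)
            )

        def forward(self, ab):                      # ab: (N, 2, 84, 84)
            f = self.features(ab)                   # (N, 64, 11, 11)
            return f.mean(dim=1, keepdim=True)      # average over channels -> (N, 1, 11, 11)

    model = AbToL()
    ab = torch.randn(4, 2, 84, 84)                  # dummy ab input
    L  = torch.rand(4, 1, 84, 84) * 100             # dummy L channel in [0, 100]
    L_target = F.interpolate(L, size=(11, 11), mode='bilinear', align_corners=False)
    loss = F.mse_loss(model(ab), L_target)          # L2 regression loss against the downsampled L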

Questions about encoding/decoding and network architecture

Hi,
First of all, thank you for this repository, and please excuse me if the following questions are very basic.
I have started studying the splitbrain network model and reading the paper, but I have not yet understood where the encoding part ends and the decoding part starts in deploy_lab.prototxt. It seems to me that the output of the network gives a feature representation, and not the recreated image I expected to find. Could you please point me in the right direction?
Also, it seems to me that if you wanted to replicate the architecture described in your paper, you would need to use two networks. Is that correct?
Thank you in advance.
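For orientation, a conceptual sketch of the split-brain idea as the paper describes it: there is no image-reconstruction decoder; each sub-network is a cross-channel predictor (L -> ab and ab -> L), and the learned representation is the concatenation of the two sub-networks' features. The module and layer names below are hypothetical, assuming PyTorch; this is not the deploy_lab.prototxt architecture.

    import torch
    import torch.nn as nn

    class CrossChannelPredictor(nn.Module):
        """One half of the split-brain network: encodes one channel subset
        and predicts the other. The 'decoder' is just this prediction head."""
        def __init__(self, in_ch, out_ch, feat_ch=64):
            super().__init__()
            self.encoder = nn.Sequential(                 # stand-in for the conv trunk
                nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(),
                nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(),
            )
            self.predict = nn.Conv2d(feat_ch, out_ch, 1)  # per-pixel prediction of the other channels

        def forward(self, x):
            feat = self.encoder(x)
            return feat, self.predict(feat)

    L_to_ab = CrossChannelPredictor(in_ch=1, out_ch=2)    # predicts ab from L
    ab_to_L = CrossChannelPredictor(in_ch=2, out_ch=1)    # predicts L from ab

    img = torch.rand(1, 3, 64, 64)                        # dummy Lab image, channels (L, a, b)
    feat1, ab_pred = L_to_ab(img[:, :1])
    feat2, L_pred  = ab_to_L(img[:, 1:])
    representation = torch.cat([feat1, feat2], dim=1)     # feature used for downstream tasks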

Question Regarding Classification Loss

Hi, I don't quite understand the quantization procedure for the classification loss. The L channel is quantized into 100 bins. The ab channels are quantized to the Q=313 in-gamut values? Can you explain the procedure for identifying those values?

I quantize L_labels as np.digitize(image[:,:,0], np.linspace(0, 100, 100)).
How does it work for ab_labels?
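A hedged sketch of one way to build the ab labels: hard nearest-neighbour assignment of each pixel's (a, b) value to a (313, 2) array of in-gamut ab bin centres. The author's colorization code ships such an array as pts_in_hull.npy; whether splitbrainauto uses exactly that file is an assumption here, and the colorization paper actually uses a soft encoding over the nearest bins rather than hard labels.

    import numpy as np

    # Assumed: (313, 2) array of in-gamut ab bin centres (grid spacing 10),
    # e.g. loaded from the pts_in_hull.npy shipped with the colorization code.
    ab_centres = np.load('pts_in_hull.npy')                            # shape (313, 2)

    def ab_labels(image_lab):
        """Hard-assign each pixel's (a, b) value to its nearest in-gamut bin centre."""
        ab = image_lab[:, :, 1:].reshape(-1, 2)                        # (H*W, 2)
        d2 = ((ab[:, None, :] - ab_centres[None, :, :]) ** 2).sum(-1)  # (H*W, 313) squared distances
        return d2.argmin(axis=1).reshape(image_lab.shape[:2])          # (H, W) labels in [0, 313)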

Secondly, is the regression loss computed with respect to these quantized labels?

Apologies if these questions are too trivial.
Thanks! I found your work really fascinating!

Layers fc1 and fc2

Are fc1 and fc2 convolutional or fully connected layers? If they are in fact convolutional layers, please tell me the number of filters.

Thanks

Disjoint sub-networks

Hi!
I understand the split-brain encoder is formed by concatenating two sub-networks, so the group of each convolution layer should be 2. However, some conv layers in the provided deploy.prototxt have only 1 group, such as conv1 and conv3.
Thanks.
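A minimal sketch of the mechanism in question, assuming PyTorch: with groups=2, the first half of the output channels only sees the first half of the input channels, which is what keeps the two sub-networks disjoint; a group-1 convolution mixes the halves. This only illustrates the grouping behaviour, not the layer choices in the provided deploy.prototxt.

    import torch
    import torch.nn as nn

    x = torch.randn(1, 8, 16, 16)      # 8 input channels: first 4 = sub-network A, last 4 = sub-network B

    # groups=2 splits input and output channels in half: output channels 0-7 only see
    # input channels 0-3, and output channels 8-15 only see input channels 4-7.
    grouped = nn.Conv2d(8, 16, kernel_size=3, padding=1, groups=2)

    # An ordinary (group-1) convolution mixes the two halves, so the sub-networks
    # are no longer disjoint after such a layer.
    mixed = nn.Conv2d(8, 16, kernel_size=3, padding=1, groups=1)

    y_disjoint = grouped(x)            # (1, 16, 16, 16), halves computed independently
    y_mixed = mixed(x)                 # (1, 16, 16, 16), every output sees every input channel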
