Comments (10)

drcdr commented on July 28, 2024

@vikasverma1077 or @alexmlamb - any thoughts?

vikasverma1077 commented on July 28, 2024

Hi @drcdr, thanks for your interest. Unfortunately, I do not have time to go through the details of your experiments at the moment. I would recommend using the same packages as in the README and reproducing the results first. Several people have reproduced the results, so I am pretty sure it will work for you as well.

Answers to your questions:

Q: Are the results in the arXiv paper the "Best" value, or the "End" value?
A: The Best value.

Q: I assume the results in the paper use {mixup_hidden, alpha=2} for mixup?
A: Yes; {mixup_hidden, alpha=2} is Manifold Mixup.

Q: Is the current GitHub software different from that used in the paper, in any substantial way?
A: No.

Q: Curious, are my run-times in the same ballpark as yours?
A: Yes.

alexmlamb commented on July 28, 2024

Thanks for taking the time to look into it. It's good that you got similar results for Manifold Mixup on the PreActResNet architectures.

Also, if you fixed the data loader for a newer PyTorch version, could you open a pull request for that? I think other users would benefit from that change.

"WRN28-10: about the same NoMixup and Input Mixup, much worse Manifold Mixup"

I'd have to check, but I wonder if the choice of layers to mix in could be set incorrectly for WRN?

The paper says:

"When using Manifold Mixup, we selected the layer to perform mixing uniformly at random from a set of eligible layers. In all our experiments, for the PreActResNets architectures, the eligible layers for mixing in Manifold Mixup were : the input layer, the output from the first resblock, and the output from the second resblock. For Wide-ResNet-20-10 architecture, the eligible layers for mixing in Manifold Mixup were: the input layer and the output from the first resblock."

So maybe the code is mixing in too many layers for WRN? I haven't investigated closely.
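
For reference, here is a minimal sketch of the layer-selection rule the quoted passage describes. The dictionary keys and layer indices are illustrative assumptions based only on the quote, not the repo's actual code:

import random

# Eligible mixing locations, per the quoted paper text:
# 0 = input layer, 1 = output of resblock 1, 2 = output of resblock 2
ELIGIBLE_LAYERS = {
    "preactresnet": [0, 1, 2],  # input, resblock 1, resblock 2
    "wrn28_10": [0, 1],         # input and resblock 1 only
}

def pick_mixup_layer(arch):
    # Chosen uniformly at random from the eligible set, as in the paper
    return random.choice(ELIGIBLE_LAYERS[arch])

If the WRN path were accidentally drawing from the PreActResNet set, mixing would also happen after the second resblock, which could explain a gap.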

drcdr commented on July 28, 2024

Thanks guys, I'm trying to figure out where to go next. Trying a model or two with torchvision 0.2.1 seems like a good idea, given what you both have said; I just need some time. I could also try to diff these three models between the two torchvision versions, but I suppose that's not 100% conclusive either.

I'm trying to think through the relatively high variability between Best and End, and what that means. Since the Best results look like outliers to me (each is the minimum value over hundreds of iterations; TODO: histogram the test error), I'm not exactly sure what comparing these results really means.
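
To make the Best/End distinction concrete, here is a small sketch of how the two values relate and how the test error could be histogrammed; the log-file name and format are hypothetical:

import numpy as np

# Hypothetical log: one test-error value (%) per epoch
test_err = np.loadtxt("test_error_log.txt")

best_err = test_err.min()  # "Best": minimum over all epochs
end_err = test_err[-1]     # "End": error at the final epoch

# Histogram the late-training errors to see whether Best is an outlier
counts, edges = np.histogram(test_err[len(test_err) // 2:], bins=20)
print("Best = %.2f, End = %.2f" % (best_err, end_err))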

But since the primary goal here was reproducibility, I suppose I should focus on that first. I'll try and repost in a few days. Thanks!

alexmlamb commented on July 28, 2024

Can you clarify which of the results in the table you posted are from your experiments and which are taken from the paper?

drcdr commented on July 28, 2024

Yes. The first three columns (Header, Err μ, Err σ) are taken from the first two columns of Table 1(a) in the paper. The rest of the columns refer to my experiments.

alexmlamb commented on July 28, 2024

What is the difference between "Manifold Mixup (α = 2)" and "Manifold Mixup (α = 2), but not mixup_hidden" for the WRN results?

drcdr commented on July 28, 2024

"Manifold Mixup (α = 2)": I ran the command line as given on README.md, for "Manifold mixup WRN-28-10"

"Manifold Mixup (α = 2) , but not mixup_hidden": I accidentally used '--train mixup' instead of '--train mixup_hidden', but otherwise the same as "Manifold Mixup (α = 2)"

drcdr commented on July 28, 2024

I would recommend using the same packages as in the README and reproducing the results first...

I've run the first two experiments on WRN28_10, using the same packages as in the README. Results for Best Error:

  • Input Mixup (α = 1): paper = 2.92 ± 0.088; result from above = 2.76; new result = 2.67
  • Manifold Mixup (α = 2): paper = 2.55 ± 0.024; result from above = 2.82; new result = 2.77

Also, I compared the printouts of the WRN model under both torchvision 0.2.1 and 0.3; they are identical.
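
For anyone repeating this check, the comparison amounts to dumping the printed architecture and diffing the dumps; the constructor name below is a placeholder for however the repo builds the model:

# Dump the printed model structure so it can be diffed across environments
model = build_wrn28_10()  # placeholder for the repo's WRN-28-10 constructor
with open("wrn_arch.txt", "w") as f:
    f.write(str(model))
# Then compare the two dumps, e.g.: diff wrn_arch_tv021.txt wrn_arch_tv030.txt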

drcdr commented on July 28, 2024

Update

Here is a table of Test Error results, with updates from using the packages listed in the README (columns K-O).

  • You can focus on columns B, I, and N for the Best Errors from each of the runs (paper, current PyTorch, and old PyTorch, respectively).
  • Colored highlights in the 'Best z' columns (J, O) give a relative indication of results compared to the paper: the greener, the better the paper; the redder, the better these results (see the sketch after the table image).
  • Blackout indicates a trial that was not run.

[image: Test Error results table]
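
'Best z' is roughly the standardized gap from the paper's numbers; a sketch, assuming a plain z-score (the exact computation may differ):

# z-score of a reproduced Best error relative to the paper's mean/std
def z_score(result, paper_mean, paper_std):
    return (result - paper_mean) / paper_std

# Example: WRN28-10 Manifold Mixup, paper = 2.55 +/- 0.024, new result = 2.77
print(z_score(2.77, 2.55, 0.024))  # a large positive z means worse than the paper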

Here's the plot of TestError vs. Epoch:
[image: ManifoldMixupTestErr-2, Test Error vs. Epoch plot]

Summary

  1. On Manifold Mixup repeatability (rows 9, 14, 18): I get roughly repeatable results for ResNet, but worse results for WRN28.
  2. I don't think there is a significant difference between PyTorch versions (columns I and N). Where one is better or worse, the difference doesn't look statistically meaningful. A notable exception might be row 19, but that's not a 'README case'.
  3. The test-error divergence anomaly for (PreActResNet18, Vanilla) was repeated: the first time it blew up at epochs (701, 919); the second time at (698, 889). Strange; a crude way to flag these blow-ups is sketched after this list.
  4. Regarding plain Input Mixup, I am getting somewhat better results for WRN28-10 (row 17: 2.76, 2.67 vs 2.92) and substantially better for PreActResNet18 (row 8: 3.15, 3.06 vs 3.82).
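
For (3), here is a crude sketch for flagging blow-ups automatically (hypothetical per-epoch log, arbitrary threshold):

import numpy as np

err = np.loadtxt("test_error_log.txt")  # hypothetical: test error (%) per epoch
window = 20
for t in range(window, len(err)):
    baseline = np.median(err[t - window:t])
    if err[t] > 3 * baseline:  # crude blow-up criterion
        print("possible divergence at epoch %d: %.2f vs recent median %.2f"
              % (t, err[t], baseline))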

I think this issue could be kept open to track (1) [WRN28 MM worse], and possibly (3) [PARN18-Vanilla test-error divergence] and (4) [e.g., what results do you get for row 8?]. If there is anything you can think of that I can do for (1) or (3), please let me know.

Possible PR

@alexmlamb - re the pull request: would you want me to first test with the latest pytorch/torchvision (torchvision is now 0.5.0!)? For anyone who wants to run CIFAR10 with torchvision 0.3.0, the change is one line in load_data.py:

#train_sampler, valid_sampler, unlabelled_sampler = get_sampler(train_data.train_labels, labels_per_class, valid_labels_per_class)  # older torchvision
train_sampler, valid_sampler, unlabelled_sampler = get_sampler(train_data.targets, labels_per_class, valid_labels_per_class)  # newer torchvision 
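
To support both torchvision APIs at once, a version-tolerant variant might look like this (untested sketch, same get_sampler call as above):

# Newer torchvision exposes `targets`; older versions used `train_labels`
labels = getattr(train_data, "targets", None)
if labels is None:
    labels = train_data.train_labels  # pre-0.3 torchvision
train_sampler, valid_sampler, unlabelled_sampler = get_sampler(
    labels, labels_per_class, valid_labels_per_class)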

Other changes may be needed for other datasets, but I don't have the time/GPU cards to test all of these. Also, in torchvision, there is actually a warning for MNIST (but not for CIFAR10) - see:
https://github.com/pytorch/vision/blob/master/torchvision/datasets/mnist.py#L45
