meliketoy / fine-tuning.pytorch
PyTorch implementation of fine-tuning pretrained ImageNet weights
License: MIT License
Hi,
Just a minor typo: in the requirements section, I guess you meant Python 2.7, not PyTorch 2.7; the link is redirecting to the PyTorch website.
I really appreciate your GitHub repo; thanks a lot for your work on fine-tuning models.
When I tried to run your code locally, I hit an issue that looks like a multiprocessing problem. This is the log:
C:\Users\dk12a7\Desktop\New folder\fine-tuning.pytorch>python main.py --finetune
[Phase 1] : Data Preperation
C:\Users\dk12a7\Anaconda3\lib\site-packages\torchvision-0.2.1-py3.7.egg\torchvision\transforms\transforms.py:199: UserWarning: The use of the transforms.Scale transform is deprecated, please use transforms.Resize instead.
C:\Users\dk12a7\Anaconda3\lib\site-packages\torchvision-0.2.1-py3.7.egg\torchvision\transforms\transforms.py:604: UserWarning: The use of the transforms.RandomSizedCrop transform is deprecated, please use transforms.RandomResizedCrop instead.
| Preparing model trained on hymenoptera_data dataset...
[Phase 2] : Model setup
| Downloading ImageNet fine-tuned ResNet-50...
[Phase 3] : Training Model
| Training Epochs = 50
| Initial Learning Rate = 0.001000
| Optimizer = SGD
=> Training Epoch #1, LR=0.001000
[Phase 1] : Data Preperation
C:\Users\dk12a7\Anaconda3\lib\site-packages\torchvision-0.2.1-py3.7.egg\torchvision\transforms\transforms.py:199: UserWarning: The use of the transforms.Scale transform is deprecated, please use transforms.Resize instead.
C:\Users\dk12a7\Anaconda3\lib\site-packages\torchvision-0.2.1-py3.7.egg\torchvision\transforms\transforms.py:604: UserWarning: The use of the transforms.RandomSizedCrop transform is deprecated, please use transforms.RandomResizedCrop instead.
| Preparing model trained on hymenoptera_data dataset...
[Phase 2] : Model setup
| Downloading ImageNet fine-tuned ResNet-50...
[Phase 1] : Data Preperation
C:\Users\dk12a7\Anaconda3\lib\site-packages\torchvision-0.2.1-py3.7.egg\torchvision\transforms\transforms.py:199: UserWarning: The use of the transforms.Scale transform is deprecated, please use transforms.Resize instead.
C:\Users\dk12a7\Anaconda3\lib\site-packages\torchvision-0.2.1-py3.7.egg\torchvision\transforms\transforms.py:604: UserWarning: The use of the transforms.RandomSizedCrop transform is deprecated, please use transforms.RandomResizedCrop instead.
| Preparing model trained on hymenoptera_data dataset...
[Phase 2] : Model setup
| Downloading ImageNet fine-tuned ResNet-50...
[Phase 1] : Data Preperation
C:\Users\dk12a7\Anaconda3\lib\site-packages\torchvision-0.2.1-py3.7.egg\torchvision\transforms\transforms.py:199: UserWarning: The use of the transforms.Scale transform is deprecated, please use transforms.Resize instead.
C:\Users\dk12a7\Anaconda3\lib\site-packages\torchvision-0.2.1-py3.7.egg\torchvision\transforms\transforms.py:604: UserWarning: The use of the transforms.RandomSizedCrop transform is deprecated, please use transforms.RandomResizedCrop instead.
| Preparing model trained on hymenoptera_data dataset...
[Phase 2] : Model setup
| Downloading ImageNet fine-tuned ResNet-50...
[Phase 1] : Data Preperation
C:\Users\dk12a7\Anaconda3\lib\site-packages\torchvision-0.2.1-py3.7.egg\torchvision\transforms\transforms.py:199: UserWarning: The use of the transforms.Scale transform is deprecated, please use transforms.Resize instead.
C:\Users\dk12a7\Anaconda3\lib\site-packages\torchvision-0.2.1-py3.7.egg\torchvision\transforms\transforms.py:604: UserWarning: The use of the transforms.RandomSizedCrop transform is deprecated, please use transforms.RandomResizedCrop instead.
| Preparing model trained on hymenoptera_data dataset...
After entering the training phase for the first time, the code keeps repeating Phase 1 and Phase 2. Do you have any idea about this issue? Thank you.
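For reference, this symptom on Windows usually comes from PyTorch's DataLoader workers: Windows spawns subprocesses that re-import the main module, so any top-level code (Phase 1, Phase 2, etc.) runs again in every worker. A common fix, sketched here with a hypothetical `train()` stand-in rather than the repo's actual code, is to guard the script's entry point:

```python
# Sketch of the usual Windows fix: wrap the script's top-level logic in a
# main guard so spawned DataLoader worker processes don't re-execute it
# when they re-import the module. train() is a hypothetical stand-in for
# the data preparation / model setup / training code in main.py.

def train():
    # data preparation, model setup, training loop ...
    print("training")

if __name__ == '__main__':
    # Only the original process runs this; spawned workers skip it on import.
    train()
```

With the guard in place, each worker imports the module without re-running the phases, so the log should show Phase 1 and Phase 2 only once.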
Thanks for your great work. I think I am too unfamiliar with ML to understand some parts of the README; I would be very happy if you could help me out.
Can you explain a little bit more what this means?
data_base = [:dir to your original dataset]
aug_base = [:dir to your actually trained dataset]
What is the difference? Where should I put my data? Do I need to preprocess it myself?
Are there any cases where you would want to omit --resetClassifier? I thought fine-tuning was all about resetting the classifier for a new task.
What are mean and std in config.py? Are those the values for my own dataset? Do I have to compute them myself?
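On the mean/std question: those are the per-channel means and standard deviations passed to transforms.Normalize. You can either reuse the standard ImageNet values or compute them from your own images; here is a minimal NumPy sketch (the random array is a placeholder for your actual dataset, not anything from this repo):

```python
import numpy as np

# Hypothetical batch of 8 RGB images, shape (N, H, W, C), pixel values in [0, 1].
# In practice, load and stack your own training images here instead.
images = np.random.rand(8, 224, 224, 3)

# Per-channel mean and std, averaged over all images and all pixels.
mean = images.mean(axis=(0, 1, 2))
std = images.std(axis=(0, 1, 2))

print(mean.shape, std.shape)  # one value per channel
```

If your dataset is visually similar to natural photos, reusing the ImageNet statistics is usually fine; computing your own matters more for unusual domains (medical, satellite, grayscale).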
Hi,
As per the PyTorch documentation, to fine-tune a CNN we need to set the requires_grad attribute of all the model's layers except the last one to False (so that no backprop happens through them).
https://pytorch.org/docs/0.3.0/notes/autograd.html (under requires_grad section)
I think the main.py code is missing that part, without which the whole model is trained rather than just the classifier, which doesn't really qualify as fine-tuning. Can someone please confirm?
Thanks!
Hey,
could you please add a license (e.g. MIT) to the repo? This would allow other people to build upon your work.
Kind Regards
Johannes