
MULTI-AUGMENTATION FOR EFFICIENT VISUAL REPRESENTATION LEARNING FOR SELF-SUPERVISED PRE-TRAINING

This repo is the official TensorFlow implementation of MASSRL.

MASSRL paper: https://arxiv.org/abs/2205.11772

[Blog Post] (coming soon)

This repo contains the source code for the MASSRL multi-augmentation strategies, implemented in TensorFlow to make building and training these models straightforward and less error-prone.

Table of Contents

  • Installation
  • Visualizing the MASSRL Multi-Augmentation Strategies
  • Configuring Self-Supervised Pre-training
  • Dataset
  • Contribution Guidelines
  • See Also
  • Citation for Our Paper

Installation

Install the following dependencies on your local machine with pip or conda (a one-line pip example follows the list):
  • tensorflow==2.7.0, tensorflow-addons==0.15.0, tensorflow-datasets==4.4.0, tensorflow-estimator==2.7.0
  • tqdm
  • wandb
  • imgaug
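
For example (assuming a pip environment compatible with TensorFlow 2.7), all of the above can be installed in one command:

pip install tensorflow==2.7.0 tensorflow-addons==0.15.0 tensorflow-datasets==4.4.0 tensorflow-estimator==2.7.0 tqdm wandb imgaug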

Visualizing the MASSRL Multi-Augmentation Strategies

Open in Colab: visualize the multi-augmentation strategies in this Google Colab notebook: https://colab.research.google.com/drive/1fquGOr_psJfDXxOmdFVkfrbedGfi1t-X?usp=sharing

Note: the augmentation visualization does not require any training --- we only display images after applying the different augmentation transformations. However, you need to make sure that the dataset is passed correctly to the constructor of all submodules. If you want to see this happen, please upvote [this repo issue].
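
For a quick local preview outside the notebook, here is a minimal sketch of showing one image under different augmentation pipelines with imgaug; the specific transforms and the matplotlib dependency are illustrative assumptions, not the exact MASSRL pipelines.

```python
# Minimal sketch: display one image after different augmentation pipelines.
# The pipelines below are illustrative stand-ins, not the repo's exact transforms.
import numpy as np
import imgaug.augmenters as iaa
import matplotlib.pyplot as plt

image = np.random.randint(0, 255, (224, 224, 3), dtype=np.uint8)  # stand-in image

pipelines = {
    "SimCLR-style": iaa.Sequential([
        iaa.Fliplr(0.5),
        iaa.MultiplyBrightness((0.6, 1.4)),
        iaa.GaussianBlur(sigma=(0.0, 2.0)),
    ]),
    "RandAugment": iaa.RandAugment(n=2, m=9),
}

fig, axes = plt.subplots(1, len(pipelines) + 1, figsize=(12, 4))
axes[0].imshow(image)
axes[0].set_title("original")
axes[0].axis("off")
for ax, (name, aug) in zip(axes[1:], pipelines.items()):
    ax.imshow(aug(image=image))  # apply the pipeline to a single image
    ax.set_title(name)
    ax.axis("off")
plt.show()
```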

Configuring Self-Supervised Pre-training

This implementation supports single-GPU and multi-GPU training.

To run self-supervised pre-training of a ResNet-50 model on ImageNet with 1-8 GPUs, follow the steps below:

**1. Configure the training hyperparameters**: 

- You can change the training hyperparameter settings (dataset paths and all other training hyperparameters) using config/non_contrast_config_v1.py as the reference configuration (a hypothetical sketch is shown after this list).
- Make sure your GPU memory is >= 12 GB for ResNet-50; training on 4-8 GPUs is recommended.
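
A hypothetical sketch of the kind of settings that configuration exposes is below; every name and value here is an illustrative assumption, and the real options live in config/non_contrast_config_v1.py:

```python
# Hypothetical pre-training configuration -- illustrative only.
# The actual option names and defaults live in config/non_contrast_config_v1.py.
pretrain_config = {
    "train_path": "/datasets/imagenet/train",  # assumed dataset location
    "val_path": "/datasets/imagenet/val",      # assumed dataset location
    "img_size": 224,                           # input resolution for ResNet-50
    "train_batch_size": 128,                   # global batch size across GPUs
    "base_lr": 0.2,                            # see the LR note after step 2
    "train_epochs": 100,                       # pre-training length (assumption)
    "augmentation_strategies": ["SimCLR", "RandAugment", "AutoAugment"],
}
```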

**2. Execute MASSRL with the 3 augmentation strategies (SimCLR's augmentation pipeline, RandAugment, AutoAugment)**: 

- Navigate to the directory containing
self_supervised_learning_frameworks/none_contrastive_framework/run_MASSRL.py
- Execute the file (a full example invocation is shown below):
python run_MASSRL.py 
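
For example, a hypothetical single-GPU run using the 1-GPU recipe from the note below (the --lr and --batch-size flags are the ones mentioned there; treat the exact invocation as an assumption, not the repo's documented CLI):

cd self_supervised_learning_frameworks/none_contrastive_framework
python run_MASSRL.py --lr 0.3 --batch-size 256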

Note: for 8-GPU training, we recommend following the linear lr scaling recipe: --lr 0.2 --batch-size 128; other hyperparameters can be left at their defaults. For 1-GPU training, we recommend --lr 0.3 --batch-size 256; other hyperparameters can likewise be left at their defaults.
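
The linear lr scaling rule referenced above scales the learning rate proportionally to the global batch size relative to a reference batch. A minimal sketch (an illustration, not code from the repo):

```python
# Linear learning-rate scaling: lr = base_lr * batch_size / reference_batch.
def linear_scaled_lr(base_lr: float, reference_batch: int, batch_size: int) -> float:
    return base_lr * batch_size / reference_batch

# Using the 1-GPU recipe above (lr 0.3 at batch size 256) as the reference point:
print(linear_scaled_lr(0.3, 256, 256))  # 0.3 -- the 1-GPU recipe itself
print(linear_scaled_lr(0.3, 256, 512))  # 0.6 -- e.g. a larger multi-GPU global batch
```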

Dataset

Note: the public ImageNet dataset is used in this work; if you have your own dataset, you can change the path accordingly.

Download the ImageNet-1K dataset from https://www.image-net.org/download.php.
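
If you prefer to prepare ImageNet with tensorflow_datasets (already in the dependency list), here is a minimal sketch; this is an assumption about one possible workflow, not the repo's own data pipeline, and /path/to/imagenet_tars is a placeholder for wherever you stored the downloaded archives:

```python
# Minimal sketch: build ImageNet-1K with tensorflow_datasets from the
# manually downloaded archives (imagenet2012 cannot be auto-downloaded).
import tensorflow_datasets as tfds

builder = tfds.builder("imagenet2012")
builder.download_and_prepare(
    download_config=tfds.download.DownloadConfig(manual_dir="/path/to/imagenet_tars")
)
train_ds = builder.as_dataset(split="train", shuffle_files=True)
```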

Using your own dataset

Update Soon

Changing the dataset path (your path) in the pre-training flags:

Update Soon

Hyperparameter Setting

Update Soon

Number of Augmentation Strategies Implemented

Update Soon

Training Single or Multiple GPUs

Update Soon

Contribution Guidelines

Awesome! Thank you for being a part of this project. Before you start contributing to this repository, please quickly go through the guidelines. Update Soon

See Also

Citation for Our Paper

@article{TranMASSRL,
  author  = {Van-Nhiem Tran and Chi-En Huang and Shen-Hsuan Liu and Kai-Lin Yang and Timothy Ko and Yung-Hui Li},
  title   = {Multi-Augmentation for Efficient Visual Representation Learning for Self-Supervised Pre-training},
  journal = {arXiv preprint arXiv:2205.11772},
  year    = {2022},
}

