antonyalexos / lwta-sbp-adversarial-robustness-dnn

A repository about Robust Deep Neural Networks with Uncertainty, Local Competition and Error-Correcting Output Codes in TensorFlow.

Local Competition and Uncertainty for Adversarial Robustness

This repository is the official implementation of Local Competition and Uncertainty for Adversarial Robustness.

Requirements

We provide the conda environment adver_environment.yml, in which we conducted the experiments; the dependencies are also listed in requirements.txt. Note that pip install -r requirements.txt may fail, so we recommend creating the conda environment instead with conda env create -f adver_environment.yml.

If you want to install CleverHans separately, follow its installation instructions; you will most likely need

pip install git+https://github.com/tensorflow/cleverhans.git#egg=cleverhans

Files

We give a brief description of the important files provided in this repository:

  • Model.py : Abstract base class implementing a baseline or ensemble model. See "defineModel" in this file to inspect or modify the neural network architecture used by all ensemble models.

  • Model_Implementations.py : Implements the model-specific methods of Model.py. See "defineModelBaseline" in this file to inspect or modify the neural network architecture used by all baseline models.

  • Attack_Model.ipynb/Attack_Model.py : Code that runs the attacks. We mostly used the notebook version.

  • Train_Model.ipynb/Train_Model.py : Code for training the models. We mostly used the notebook version.

  • automatic_plot.ipynb : Code for the probability distribution figures in the paper.

  • distributions.py : Functions for probabilities, distributions, sampling, etc.

  • lwta_conv2d_activation.py : Main code for the LWTA activation for convolutional layers.

  • lwta_dense_activation.py : Main code for the LWTA activation for dense layers.

  • sbp_lwta_con2d_layer.py : The code for our convolutional layer with IBP and LWTA.

  • sbp_lwta_dense_layer.py : The code for our dense layer with IBP and LWTA.
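As a rough illustration of what the LWTA activation files implement: in an LWTA layer, units are grouped into blocks of competing units, and within each block only the winner passes its activation through while the rest are zeroed. Below is a minimal NumPy sketch with deterministic argmax winners; the actual implementation samples winners stochastically and couples the layers with IBP, and the function and argument names here are illustrative only.

```python
import numpy as np

def lwta(x, units_per_block=2):
    """Hard LWTA: keep the max unit in each block of competing units, zero the rest.

    x: array of shape (batch, features), with features divisible by units_per_block.
    """
    batch, features = x.shape
    blocks = x.reshape(batch, features // units_per_block, units_per_block)
    winners = blocks.argmax(axis=-1)                    # index of the winner per block
    mask = np.zeros_like(blocks)
    np.put_along_axis(mask, winners[..., None], 1.0, axis=-1)  # one-hot winner mask
    return (blocks * mask).reshape(batch, features)
```

With units_per_block=2 (the setting used for most of the provided pretrained models), an input row [1, 3, 2, 0] yields [0, 3, 2, 0]: unit 3 wins the first block and unit 2 the second.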

Training

To train the model(s) in the paper, run either Train_Model.ipynb or Train_Model.py. For the latter, run:

python Train_Model.py

It is important to mention that the code runs eagerly. For training, comment out line 9 in Model_Implementations.py and line 11 in Model.py. For the MNIST dataset, uncomment the lines for the MNIST dataset parameters in Train_Model.py or Train_Model.ipynb (whichever you use) and comment out the lines for the CIFAR10 dataset parameters; for CIFAR10, do the opposite. For MNIST you also need to uncomment line 49 in Model_Implementations.py and comment out line 47, and uncomment lines 114-115 in Model.py while commenting out lines 111-112; for CIFAR10, again do the opposite. There are also helpful comments in the code to guide you through this process.

If you want to run a pretrained model, make sure to uncomment line 221 in Model.py; otherwise the model will start training from scratch.

Adversarial Attacks

To run the attacks, use either Attack_Model.ipynb or Attack_Model.py; we mostly used the notebook. Before running the attacks you have to disable eager execution in Model_Implementations.py and Model.py. There are instructions inside the attack files, and they work just like the training files. In this case you have to take the model and its parameters from Train_Model.py or Train_Model.ipynb and put them inside the attack file. After the attacks finish, you can run the cells that plot the figures and the probabilities from the LWTA activations as presented in the paper.
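The notebook runs its attacks through CleverHans, but the core idea of the simplest one, FGSM, is just a step along the sign of the input gradient of the loss. Here is a self-contained NumPy sketch on a binary logistic model, where that gradient is analytic; the model, names, and shapes are illustrative only, not the code used in the paper.

```python
import numpy as np

def fgsm_linear(x, y, w, b, eps):
    """FGSM on a logistic model p = sigmoid(x @ w + b).

    For binary cross-entropy, the gradient of the loss w.r.t. x is (p - y) * w,
    so the adversarial example is x + eps * sign((p - y) * w).
    """
    p = 1.0 / (1.0 + np.exp(-(x @ w + b)))  # model confidence, shape (batch,)
    grad_x = np.outer(p - y, w)             # analytic input gradient
    return x + eps * np.sign(grad_x)
```

Here eps is the L-infinity budget of the attack: every input coordinate moves by at most eps, in whichever direction increases the loss.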

Pre-trained Models

Due to space constraints on the supplementary material (100 MB), we have not included all of the pretrained models. We uploaded all models with 2 competing units, but only a few models with 4 competing units, since the latter do not produce good results.

References

We have used code from here and here.
