
bayes_attack's People

Contributors

satyanshukla

bayes_attack's Issues

Bug report: when I run this code on ImageNet, it emits a warning and then exhausts GPU memory (OOM error)!

The Bayesian optimization also does not seem to work: even after many iterations and a long run time, the attack never succeeds.
It reports:

warnings.py:110] /home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/distributions/multivariate_normal.py:230: NumericalWarning: Negative variance values detected. This is likely due to numerical instabilities. Rounding negative variances up to 1e-06.
  NumericalWarning,

And then

    adv_images, query, distortion_with_max_queries, success = self.bayes_opt(batch_index, images, true_labels[0].item(), args)
  File "bayes_attack/attack.py", line 193, in bayes_opt
    new_x, new_obj = self.optimize_acqf_and_get_observation(qEI, x0, y0)
  File "bayes_attack/attack.py", line 113, in optimize_acqf_and_get_observation
    raw_samples=200,
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/botorch/optim/optimize.py", line 376, in joint_optimize
    sequential=False,
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/botorch/optim/optimize.py", line 150, in optimize_acqf
    options=options,
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/botorch/optim/initializers.py", line 104, in gen_batch_initial_conditions
    X_rnd[start_idx:end_idx].to(device=device)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/botorch/utils/transforms.py", line 171, in decorated
    return method(cls, X)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/botorch/acquisition/analytic.py", line 137, in forward
    posterior = self._get_posterior(X=X)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/botorch/acquisition/analytic.py", line 69, in _get_posterior
    posterior = self.model.posterior(X)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/botorch/models/gpytorch.py", line 276, in posterior
    mvn = self(X)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/models/exact_gp.py", line 319, in __call__
    predictive_mean, predictive_covar = self.prediction_strategy.exact_prediction(full_mean, full_covar)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/models/exact_prediction_strategies.py", line 317, in exact_prediction
    self.exact_predictive_mean(test_mean, test_train_covar),
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/models/exact_prediction_strategies.py", line 335, in exact_predictive_mean
    res = (test_train_covar @ self.mean_cache.unsqueeze(-1)).squeeze(-1)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/lazy/lazy_tensor.py", line 1889, in __matmul__
    return self.matmul(other)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/lazy/lazy_tensor.py", line 1117, in matmul
    return func.apply(self.representation_tree(), other, *self.representation())
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/lazy/lazy_evaluated_kernel_tensor.py", line 316, in representation_tree
    return self.evaluate_kernel().representation_tree()
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/utils/memoize.py", line 59, in g
    return _add_to_cache(self, cache_name, method(self, *args, **kwargs), *args, kwargs_pkl=kwargs_pkl)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/lazy/lazy_evaluated_kernel_tensor.py", line 274, in evaluate_kernel
    res = self.kernel(x1, x2, diag=False, last_dim_is_batch=self.last_dim_is_batch, **self.params)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/kernels/kernel.py", line 396, in __call__
    res = lazify(super(Kernel, self).__call__(x1_, x2_, last_dim_is_batch=last_dim_is_batch, **params))
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/module.py", line 28, in __call__
    outputs = self.forward(*inputs, **kwargs)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/kernels/scale_kernel.py", line 92, in forward
    orig_output = self.base_kernel.forward(x1, x2, diag=diag, last_dim_is_batch=last_dim_is_batch, **params)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/kernels/matern_kernel.py", line 102, in forward
    distance = self.covar_dist(x1_, x2_, diag=diag, **params)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/kernels/kernel.py", line 329, in covar_dist
    res = self.distance_module._dist(x1, x2, postprocess, x1_eq_x2)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/kernels/kernel.py", line 54, in _dist
    res = self._sq_dist(x1, x2, postprocess=False, x1_eq_x2=x1_eq_x2)
  File "/home/yiyangzhao/anaconda3/lib/python3.7/site-packages/gpytorch/kernels/kernel.py", line 43, in _sq_dist
    res = x1_.matmul(x2_.transpose(-2, -1))
RuntimeError: CUDA out of memory. Tried to allocate 1.55 GiB (GPU 0; 10.76 GiB total capacity; 6.55 GiB already allocated; 1.39 GiB free; 8.11 GiB reserved in total by PyTorch)

I guess this is because new points keep getting appended to train_x, so GPU memory is eventually exhausted.
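
A possible mitigation under that guess, sketched below: cap the number of stored observations before refitting the GP, so the kernel matrices BoTorch builds on the GPU stay bounded. The function name and the max_points cap are illustrative, not part of the repository; reducing raw_samples=200 in optimize_acqf_and_get_observation may also lower the peak memory.

```python
import torch

def cap_history(train_x, train_obj, max_points=500):
    """Keep only the best-performing observations so the GP (and the GPU
    tensors built from it) stay bounded in size.

    train_x:    (n, d) tensor of queried latent perturbations
    train_obj:  (n,)   tensor of objective values (higher is better)
    max_points: hypothetical cap, not a parameter of the original code
    """
    if train_x.shape[0] <= max_points:
        return train_x, train_obj
    # Keep the points with the highest objective values.
    top = torch.topk(train_obj, max_points).indices
    return train_x[top], train_obj[top]

# Hypothetical usage inside the BayesOpt loop, after appending the newest query:
# train_x, train_obj = cap_history(train_x, train_obj)
```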

RuntimeError

Your code uses the function torch.symeig(), and it sometimes raises RuntimeError: symeig_cpu: the algorithm failed to converge; 2 off-diagonal elements of an intermediate tridiagonal form did not converge to zero. Have you come across this problem? How can I fix it?

  • Dataset: ImageNet ILSVRC2012_val
  • Model: inception_v3, resnet18, resnet101 and densenet121.
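
A generic workaround (an assumption, not code from this repository) is to retry the eigendecomposition with growing diagonal jitter; on recent PyTorch versions torch.linalg.eigh supersedes the deprecated torch.symeig. GPyTorch also exposes gpytorch.settings.cholesky_jitter, which may avoid the failure further upstream.

```python
import torch

def safe_eigh(mat, jitter=1e-6, max_tries=5):
    """Retry a symmetric eigendecomposition, adding growing diagonal jitter
    until it converges. Generic workaround sketch, not repository code.

    mat: (..., n, n) symmetric tensor
    """
    eye = torch.eye(mat.shape[-1], dtype=mat.dtype, device=mat.device)
    for i in range(max_tries):
        try:
            # torch.linalg.eigh replaces the deprecated torch.symeig
            return torch.linalg.eigh(mat + (jitter * 10 ** i) * eye)
        except RuntimeError:
            continue
    raise RuntimeError("eigendecomposition did not converge even with jitter")
```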

Code for targeted attack

Hello,
Nicely written paper and code! One question: I couldn't find the code for the targeted attack. Could someone please point me towards it?

Regards.
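
For reference, the objective used for targeted attacks in most score-based methods is a margin toward the target class rather than away from the true class. The sketch below shows that standard formulation; it is not the authors' code, and the function name is illustrative.

```python
import torch

def targeted_margin(logits, target_label):
    """Margin that becomes positive once `target_label` has the highest
    logit; maximizing it drives the input toward the target class.
    Standard targeted formulation, not necessarily the authors' objective.

    logits: (..., num_classes) tensor of model outputs
    target_label: integer index of the desired class
    """
    target = logits[..., target_label]
    others = logits.clone()
    others[..., target_label] = float('-inf')  # exclude the target itself
    return target - others.max(dim=-1).values
```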

Reproduce Issues

Hi, your article intrigues me and I am amazed at the huge reduction in query cost. However, I am having trouble reproducing it and need your help. According to the experiments described in the article, the CIFAR-10 attack uses the L∞ norm, so its low-resolution subspace should be mapped to image space with nearest-neighbour interpolation (NNI), but I did not find this function in your code. In addition, the experimental section only gives the eps parameter for CIFAR-10; the rest of the parameter settings are unknown. I would appreciate a reply when you have time!
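
For context, the nearest-neighbour upsampling the question refers to is usually a one-liner with torch.nn.functional.interpolate. The sketch below shows the standard construction; the names and the eps value of 8/255 are illustrative assumptions, not taken from the paper or repository.

```python
import torch
import torch.nn.functional as F

def upsample_delta(delta_lowres, image_size=32, eps=8 / 255):
    """Map a low-resolution perturbation to image resolution with
    nearest-neighbour interpolation and project onto the L-inf ball.

    delta_lowres: (B, C, h, w) tensor with h, w <= image_size
    """
    delta = F.interpolate(delta_lowres, size=(image_size, image_size),
                          mode='nearest')
    return delta.clamp(-eps, eps)
```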
