adatime's People

Contributors

emadeldeen24, mohamedr002

adatime's Issues

Sweep doesn't work

Hi,

I'm using sweep.py to sweep the hyperparameters for my algorithm. However, with trainer.train() I can only run the regular training process, and if I change the code to trainer.sweep(), I get KeyError('batch_size') from wandb.
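
A minimal sketch of where this KeyError typically comes from; the parameter names and the train() body are illustrative assumptions, not AdaTime's actual sweep code. wandb raises KeyError('batch_size') when the training function reads a key that the sweep configuration's parameters dict never defines:

import wandb

sweep_config = {
    "method": "random",
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {
        # If batch_size is missing here, run.config["batch_size"]
        # below raises KeyError('batch_size') in every sweep run.
        "batch_size": {"values": [16, 32, 64]},
        "learning_rate": {"min": 1e-4, "max": 1e-2},
    },
}

def train():
    with wandb.init() as run:
        batch_size = run.config["batch_size"]
        # ... build the data loaders and model with batch_size ...

sweep_id = wandb.sweep(sweep_config, project="adatime-sweep")
wandb.agent(sweep_id, function=train, count=5)

So the first thing to check is that sweep_params.py actually defines batch_size among the swept (or defaulted) hyperparameters.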

Loss of CDAN

Hi,

First, thank you for this huge piece of work, it's a very useful one. I have a little question about the CDAN loss. I saw that you added a conditional entropy loss computed on the target features only; this doesn't seem to be implemented (or at least not this way) in the original CDAN code. What is the use of this loss, and where does it come from?

Best regards
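
For reference, a minimal sketch of what such a conditional-entropy term usually looks like: the entropy-minimization regularizer of Grandvalet & Bengio (2005), computed on the classifier's target-domain predictions. The function below illustrates that idea and is not necessarily AdaTime's exact implementation:

import torch
import torch.nn.functional as F

def conditional_entropy(trg_logits: torch.Tensor) -> torch.Tensor:
    """Mean Shannon entropy of the softmax predictions on target data.

    Minimizing it pushes target predictions toward confident,
    low-entropy outputs, i.e. away from the decision boundary.
    """
    probs = F.softmax(trg_logits, dim=1)
    log_probs = F.log_softmax(trg_logits, dim=1)
    return -(probs * log_probs).sum(dim=1).mean()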

Dataset not found: HAR

Hello! I'm trying to run the project, but I'm having a little trouble.

  1. Clone the project and put the HAR folder under data:
har # tree -L 2 .
.
├── algorithms
│   ├── algorithms.py
│   └── __pycache__
├── configs
│   ├── data_model_configs.py
│   ├── hparams.py
│   ├── __pycache__
│   └── sweep_params.py
├── data
│   ├── HAR -> /home/xxx/research/dataset/HAR
│   └── README.md
...

13 directories, 20 files
  2. Run:
python main.py --experiment_description exp1  \ 
                --run_description run_1 \
                --da_method DANN \
                --backbone CNN \
                --num_runs 5 \
                --is_sweep False
  3. Observe the error:
Traceback (most recent call last):
  File "main.py", line 45, in <module>
    trainer = cross_domain_trainer(args)
  File "/home/xxx/research/code/AdaTime/trainer.py", line 59, in __init__
    self.dataset_configs, self.hparams_class = self.get_configs()
  File "/home/xxx/research/code/AdaTime/trainer.py", line 203, in get_configs
    dataset_class = get_dataset_class(self.dataset)
  File "/home/xxx/research/code/AdaTime/configs/data_model_configs.py", line 4, in get_dataset_class
    raise NotImplementedError("Dataset not found: {}".format(dataset_name))
NotImplementedError: Dataset not found: HAR
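
Judging from the traceback, the dataset is resolved by class name inside configs/data_model_configs.py, so this error usually means the requested name and the config class name don't match. A plausible sketch of that lookup (the globals() pattern and the HAR attributes below are assumptions inferred from the error message, not the verified source):

def get_dataset_class(dataset_name):
    """Return the config class whose name matches the requested dataset."""
    if dataset_name not in globals():
        raise NotImplementedError("Dataset not found: {}".format(dataset_name))
    return globals()[dataset_name]

class HAR:
    def __init__(self):
        # The class name must match the dataset argument exactly
        # (case-sensitive) for the lookup above to succeed.
        self.num_classes = 6     # placeholder values
        self.sequence_len = 128

Under that assumption, the first things to check are that the installed data_model_configs.py really defines a class named HAR and that main.py is launched with --dataset HAR.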

Validation set for the datasets

First of all, thanks a lot for all your effort!

I just have a small concern regarding the usage of the preprocessed versions of the data that you provide. The preprocessed datasets always include only train and test splits; however, the preprocessing scripts seem to create validation sets as well (in some cases the lines for the val set are commented out, in some cases they are not).

To create a validation set (e.g., for early stopping), what would you suggest I do? Should I, for instance, further split the train split into train and val? In that case, is there anything I should take into account to prevent information leakage across the splits?

Best wishes,

PS: To make the issue more concrete: in the WISDM preprocessing script you can see that the dataset is first split into train and test, and the train set is further split into train and val. However, the val set is not saved.
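
One common leakage-safe recipe, sketched below under the assumption that each window carries a subject (or recording) id: split the provided train set by group, so that no subject contributes windows to both train and val. The helper name and array layout are hypothetical:

from sklearn.model_selection import GroupShuffleSplit

def train_val_split(X, y, groups, val_frac=0.2, seed=0):
    """Carve a validation set out of the train split, subject-wise.

    Splitting by group id rather than by individual window prevents
    windows from the same subject (or overlapping windows from the
    same recording) from leaking across the two splits.
    """
    splitter = GroupShuffleSplit(n_splits=1, test_size=val_frac, random_state=seed)
    train_idx, val_idx = next(splitter.split(X, y, groups=groups))
    return (X[train_idx], y[train_idx]), (X[val_idx], y[val_idx])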

I found this error in the WISDM config

I can run the HAR dataset, but I get this error when running CoTMix on WISDM:
[screenshot of the error]
because the WISDM class doesn't have step_size and lr_decay in the config.

One more error:
[screenshot of the error]
because the size of the target batch isn't equal to the size of the source batch in the last batch.
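
Two fixes suggest themselves, sketched below; the attribute names follow the error messages, while the concrete values and tensor shapes are placeholders:

import torch
from torch.utils.data import DataLoader, TensorDataset

# 1) Add the missing scheduler fields to the WISDM config class
#    (placeholder values):
class WISDM:
    def __init__(self):
        # ... existing WISDM fields ...
        self.step_size = 50
        self.lr_decay = 0.5

# 2) Avoid the ragged last batch. Dummy datasets of unequal length
#    mimic the source/target mismatch:
src_dataset = TensorDataset(torch.randn(100, 3, 128), torch.zeros(100).long())
trg_dataset = TensorDataset(torch.randn(90, 3, 128), torch.zeros(90).long())

# drop_last=True guarantees every batch has exactly batch_size samples,
# so zipped source/target batches always line up.
src_loader = DataLoader(src_dataset, batch_size=32, shuffle=True, drop_last=True)
trg_loader = DataLoader(trg_dataset, batch_size=32, shuffle=True, drop_last=True)

for (src_x, src_y), (trg_x, trg_y) in zip(src_loader, trg_loader):
    assert src_x.size(0) == trg_x.size(0) == 32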

Number of sweeps provided as argument ignored

Hi emadeldeen24,
I'm trying to reproduce the results listed in your paper with the following setup:
python main.py --experiment_description domain-adapt-test --run_description domain-adapt-run --da_method Deep_Coral --dataset HAR --sweep_project_wandb domain-adapt-sweep --num_runs 1 --device cpu --is_sweep True --num_sweeps 1
Somehow the parameter is ignored and wandb nevertheless runs an unbounded number of sweeps (I stopped the run at 50).
Can you help?
Thanks a lot,
Nicole
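
A plausible explanation, sketched below with assumed names (sweep_config, train_fn): with wandb's random or Bayesian search, an agent started without count= keeps launching runs indefinitely, so --num_sweeps has to be forwarded to wandb.agent explicitly:

import argparse
import wandb

parser = argparse.ArgumentParser()
parser.add_argument("--num_sweeps", type=int, default=1)
args = parser.parse_args()

sweep_config = {
    "method": "random",
    "parameters": {"learning_rate": {"min": 1e-4, "max": 1e-2}},
}

def train_fn():
    with wandb.init():
        pass  # ... one full training run per sweep trial ...

sweep_id = wandb.sweep(sweep_config, project="domain-adapt-sweep")
# Without count=, a random-search agent never stops on its own.
wandb.agent(sweep_id, function=train_fn, count=args.num_sweeps)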
