Comments (6)

zheng-da commented on May 20, 2024

What are your arguments for training?

jiag19 commented on May 20, 2024

My training command is:

dglke_train --model_name RotatE --data_path ./Mydata/ --data_files train.txt valid.txt test.txt --format raw_udd_hrt --batch_size 1024 --log_interval 1000 --neg_sample_size 128 --regularization_coef 1e-07 --hidden_dim 200 --gamma 12.0 --lr 0.001 --batch_size_eval 256 --test -adv -de --max_step 250000 --num_thread 8 --neg_deg_sample --mix_cpu_gpu --num_proc 8 --gpu 0 1 2 3 4 5 6 7 --async_update --rel_part --force_sync_interval 1000 --save_path ./ckpts/Mydata --dataset Mydata --neg_sample_size_eval 10000

classicsong commented on May 20, 2024

Did you add --neg_sample_size_eval 10000 when running dglke_eval?
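
For reference, a matching evaluation invocation for the training setup above would look roughly like the sketch below. The checkpoint directory name (RotatE_Mydata_0) and the exact flag set are assumptions based on the training command; please check them against the DGL-KE documentation (model-specific flags from training, such as -de for RotatE, may also need to be repeated).

dglke_eval --model_name RotatE --data_path ./Mydata/ --data_files train.txt valid.txt test.txt --format raw_udd_hrt --dataset Mydata --hidden_dim 200 --gamma 12.0 --batch_size_eval 256 --neg_sample_size_eval 10000 --gpu 0 1 2 3 4 5 6 7 --model_path ./ckpts/Mydata/RotatE_Mydata_0/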

jiag19 commented on May 20, 2024

Did you add --neg_sample_size_eval 10000 when running dglke_eval?

Thanks for your kind suggestion. It works after adding --neg_sample_size_eval 10000, but the results are slightly different from those reported during training, as shown below:

-------------- Test result --------------
Test average MRR: 0.2569992566366979
Test average MR: 846.05375
Test average HITS@1: 0.15541666666666668
Test average HITS@3: 0.2941666666666667
Test average HITS@10: 0.4608333333333333

In fact, the test results change each time I run the dglke_eval command; please see the following results. Is this due to the choice of random seed? Can we keep it the same as in the training process? Thanks.

-------------- Test result --------------
Test average MRR: 0.25908662648355973
Test average MR: 846.9891666666666
Test average HITS@1: 0.16083333333333333
Test average HITS@3: 0.2941666666666667
Test average HITS@10: 0.45666666666666667

classicsong commented on May 20, 2024

Because when using --neg_sample_size_eval 10000, evaluation randomly samples 10000 negative edges, so each run scores against a different sample.
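
To make this point concrete, here is a minimal, self-contained simulation (plain NumPy, not DGL-KE code) showing that metrics computed against randomly sampled negatives naturally drift between runs: each run draws a different negative sample, so the ranks, and hence the MRR, shift slightly. All sizes and score distributions below are invented for illustration.

# Minimal sketch: why sampled-negative evaluation gives slightly different MRR per run.
import numpy as np

num_test, num_entities, neg_sample_size = 500, 20000, 10000

# Fixed "model scores": one score for the true triple, one per candidate negative.
score_rng = np.random.default_rng(0)
true_scores = score_rng.normal(loc=1.0, size=num_test)
neg_scores = score_rng.normal(loc=0.0, size=(num_test, num_entities))

def sampled_mrr(seed):
    rng = np.random.default_rng(seed)
    # Each evaluation run samples its own subset of negatives per test triple.
    cols = rng.integers(0, num_entities, size=(num_test, neg_sample_size))
    sampled = np.take_along_axis(neg_scores, cols, axis=1)
    # Rank of the true triple among the sampled negatives.
    ranks = 1 + (sampled > true_scores[:, None]).sum(axis=1)
    return (1.0 / ranks).mean()

for seed in (1, 2, 3):
    print(f"run with seed {seed}: MRR = {sampled_mrr(seed):.4f}")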

jiag19 commented on May 20, 2024

Because when using --neg_sample_size_eval 10000, evaluation randomly samples 10000 negative edges, so each run scores against a different sample.

Got it. But it would be better if there were an argument that lets the user specify the random seed. Thanks.
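
As a stop-gap until such a flag exists, one could pin the usual RNGs by patching the evaluation entry point locally. The sketch below is a generic seed-pinning helper and is not part of the DGL-KE API; which RNG the negative sampler actually draws from is an assumption here, so it pins all the common ones.

# Generic seed-pinning sketch (not a DGL-KE feature).
import random
import numpy as np
import torch

def set_seed(seed: int = 42):
    random.seed(seed)                 # Python's built-in RNG
    np.random.seed(seed)              # NumPy RNG
    torch.manual_seed(seed)           # PyTorch CPU RNG
    torch.cuda.manual_seed_all(seed)  # PyTorch GPU RNGs
    try:
        import dgl
        dgl.random.seed(seed)         # DGL's RNG, if this version exposes it
    except (ImportError, AttributeError):
        pass

set_seed(42)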
