
Comments (13)

berlino avatar berlino commented on July 4, 2024

Probably because of your PyTorch version. Note that I only tested the system with PyTorch 0.3.1.

from overlapping-ner-em18.

csJd avatar csJd commented on July 4, 2024

Thanks, I will try with a lower version of PyTorch.

Probably because of your PyTorch version. Note that I only tested the system with PyTorch 0.3.1.

from overlapping-ner-em18.

csJd avatar csJd commented on July 4, 2024

Probably because of your PyTorch version. Note that I only tested the system with PyTorch 0.3.1.

Thanks, I can now run your code successfully with PyTorch 0.3.1.

How can I set the parameters in config.py to achieve the results in your paper? I didn't find the parameter settings in your paper.

from overlapping-ner-em18.

berlino avatar berlino commented on July 4, 2024

All the hyperparameters for reproducing the results are listed in the supplementary material.

from overlapping-ner-em18.

berlino avatar berlino commented on July 4, 2024

I have updated the readme file accordingly.

from overlapping-ner-em18.

csJd avatar csJd commented on July 4, 2024

All the hyperparameters for reproducing the results are listed in the supplementary material.

Thank you. The parameters shown in your supplementary material are not exactly the same as those in config.py.

I made the following changes in config.py. Is that right for GENIA?

self.h_hidden_size = 256
self.f_hidden_size = 256
self.input_dropout = 0.6
self.f_lstm_dropout = 0.6
# the following are unchanged from the default best values
self.token_embed = 100
self.pos_embed = 32
self.beta = 3
self.l2 = 1e-4

from overlapping-ner-em18.

berlino avatar berlino commented on July 4, 2024

Seems good to me. Do you still have trouble reproducing the results?

from overlapping-ner-em18.

csJd avatar csJd commented on July 4, 2024

Seems good to me. Do you still have trouble reproducing the results?

Yes, it performs worse when using the above config.

But it doesn't matter.

from overlapping-ner-em18.

berlino avatar berlino commented on July 4, 2024

What's the performance gap then?

from overlapping-ner-em18.

csJd avatar csJd commented on July 4, 2024

What's the performance gap then?

About 2% in terms of F1.

from overlapping-ner-em18.

berlino avatar berlino commented on July 4, 2024

I just noticed that you set self.f_lstm_dropout to 0.6. It should not be equal to the input dropout; you could simply set it to 0 or 0.1.

from overlapping-ner-em18.
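To illustrate why the two rates should differ, here is a minimal pure-Python sketch of standard inverted dropout (the function name and numbers are illustrative, not from the repo): at p = 0.6 the downstream LSTM sees only a random 40% of its input features on every step, which is already a lot of noise before any dropout is applied on the recurrent side.

```python
import random

def inverted_dropout(x, p, rng):
    """Zero each element with probability p and scale survivors by
    1/(1-p), so the expected value of the output matches the input."""
    keep = 1.0 - p
    return [0.0 if rng.random() < p else v / keep for v in x]

rng = random.Random(0)
x = [1.0] * 10_000

# input_dropout = 0.6 zeroes well over half of the features...
heavy = inverted_dropout(x, 0.6, rng)
# ...while a recurrent-side rate like f_lstm_dropout = 0.1 is far milder.
light = inverted_dropout(x, 0.1, rng)

dropped_heavy = heavy.count(0.0) / len(heavy)  # ~0.6
dropped_light = light.count(0.0) / len(light)  # ~0.1
```

Both settings preserve the expected activation scale, but stacking a 0.6 rate inside the recurrent path on top of a 0.6 input rate compounds the noise, which is consistent with the author's suggestion of 0 or 0.1 there.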

csJd avatar csJd commented on July 4, 2024

OK, I will try it later. Thank you for your reply.

from overlapping-ner-em18.

csJd avatar csJd commented on July 4, 2024

I just noticed that you set self.f_lstm_dropout to 0.6. It should not be equal to the input dropout; you could simply set it to 0 or 0.1.

I set self.f_lstm_dropout = 0.1 and ran again, but I still cannot get a comparable result. It doesn't matter, though.

The log is as follows:

python train.py
{'root_path': '.', 'if_margin': True, 'beta': 3, 'data_set': 'genia', 'lowercase': True, 'batch_size': 32,
 'if_shuffle': True, 'if_backward': False, 'if_interactions': True, 'voc_size': 15989, 'pos_size': 47, 
'label_size': 5, 'token_feat_size': None, 'span_feat_size': None, 't_null_id': None, 's_null_id': None, 
'h_hidden_size': 256, 'token_embed': 100, 'if_pos': True, 'pos_embed': 32, 'input_dropout': 0.6, 
'f_hidden_size': 256, 'f_layers': 1, 'f_lstm_dropout': 0.1, 'semi_hidden_size': 256, 'embed_path': 
'./data/word_vec_genia_100.pkl', 'epoch': 500, 'print_per_epoch': 50, 'if_gpu': True, 'opt': 'Adam', 'lr': 
0.005, 'l2': 0.0001, 'check_every': 1, 'clip_norm': 3, 'lr_patience': 3, 'decay_patience': 2, 'pre_trained': 
True, 'data_path': './data/genia', 'model_path': './dumps/genia_model.pt', 'if_C': False, 'C': 6, 
'char_size': 83}
Loading from ./data/word_vec_genia_100.pkl with size torch.Size([15989, 100])
549 batches expected for training
Epoch:  1
batch 0 with 32 instance and sentece length 16 loss 528.5094604492188
batch 50 with 32 instance and sentece length 12 loss 9.585504531860352
batch 100 with 32 instance and sentece length 17 loss 6.937288284301758
batch 150 with 1 instance and sentece length 60 loss 42.42529296875
batch 200 with 32 instance and sentece length 28 loss 9.936548233032227
batch 250 with 32 instance and sentece length 35 loss 12.68535041809082
batch 300 with 32 instance and sentece length 36 loss 13.2315092086792
batch 350 with 21 instance and sentece length 65 loss 19.49949073791504
batch 400 with 32 instance and sentece length 25 loss 6.697878837585449
batch 450 with 1 instance and sentece length 198 loss 1.75439453125
batch 500 with 32 instance and sentece length 23 loss 5.706954002380371
4929 3687 5014 3690
Precision 0.7480219111381619, Recall 0.7359393697646589, F1 0.7419314519018433
Dev step took 427.3614709377289 seconds
Epoch:  2
batch 0 with 32 instance and sentece length 16 loss 5.398239612579346
batch 50 with 32 instance and sentece length 12 loss 4.660111427307129
batch 100 with 32 instance and sentece length 17 loss 3.5176782608032227
batch 150 with 1 instance and sentece length 60 loss 20.19671630859375
batch 200 with 32 instance and sentece length 28 loss 7.34967565536499
batch 250 with 32 instance and sentece length 35 loss 9.869304656982422
batch 300 with 32 instance and sentece length 36 loss 9.423975944519043
batch 350 with 21 instance and sentece length 65 loss 14.547648429870605
batch 400 with 32 instance and sentece length 25 loss 5.2583513259887695
batch 450 with 1 instance and sentece length 198 loss 0.0927734375
batch 500 with 32 instance and sentece length 23 loss 5.622788906097412
5134 3789 5014 3794
Precision 0.7380210362290611, Recall 0.7566812923813323, F1 0.7472346845376192
Dev step took 426.46704745292664 seconds
Epoch:  3
batch 0 with 32 instance and sentece length 16 loss 4.528696060180664
batch 50 with 32 instance and sentece length 12 loss 3.843074321746826
batch 100 with 32 instance and sentece length 17 loss 3.2306599617004395
batch 150 with 1 instance and sentece length 60 loss 13.32366943359375
batch 200 with 32 instance and sentece length 28 loss 5.909056186676025
batch 250 with 32 instance and sentece length 35 loss 8.96019172668457
batch 300 with 32 instance and sentece length 36 loss 8.551804542541504
batch 350 with 21 instance and sentece length 65 loss 12.303472518920898
batch 400 with 32 instance and sentece length 25 loss 5.211538314819336
batch 450 with 1 instance and sentece length 198 loss 0.58740234375
batch 500 with 32 instance and sentece length 23 loss 4.716223239898682
5301 3829 5014 3834
Precision 0.7223165440482928, Recall 0.7646589549262066, F1 0.7428848882564526
Dev step took 426.854487657547 seconds
Epoch:  4
batch 0 with 32 instance and sentece length 16 loss 3.677178382873535
batch 50 with 32 instance and sentece length 12 loss 3.7535338401794434
batch 100 with 32 instance and sentece length 17 loss 3.3191256523132324
batch 150 with 1 instance and sentece length 60 loss 10.8851318359375
batch 200 with 32 instance and sentece length 28 loss 5.363184928894043
batch 250 with 32 instance and sentece length 35 loss 8.57153034210205
batch 300 with 32 instance and sentece length 36 loss 8.27442741394043
batch 350 with 21 instance and sentece length 65 loss 12.913166999816895
batch 400 with 32 instance and sentece length 25 loss 4.1260809898376465
batch 450 with 1 instance and sentece length 198 loss 0.1513671875
batch 500 with 32 instance and sentece length 23 loss 4.851208686828613
5228 3825 5014 3831
Precision 0.731637337413925, Recall 0.7640606302353411, F1 0.7474975526065257
Dev step took 424.5438988208771 seconds
Epoch:  5
batch 0 with 32 instance and sentece length 16 loss 4.496860980987549
batch 50 with 32 instance and sentece length 12 loss 3.1827728748321533
batch 100 with 32 instance and sentece length 17 loss 3.1428613662719727
batch 150 with 1 instance and sentece length 60 loss 12.5185546875
batch 200 with 32 instance and sentece length 28 loss 4.713732719421387
batch 250 with 32 instance and sentece length 35 loss 8.726579666137695
batch 300 with 32 instance and sentece length 36 loss 7.672467231750488
batch 350 with 21 instance and sentece length 65 loss 12.156211853027344
batch 400 with 32 instance and sentece length 25 loss 4.42149019241333
batch 450 with 1 instance and sentece length 198 loss 1.801025390625
batch 500 with 32 instance and sentece length 23 loss 4.838799476623535
5042 3789 5014 3795
Precision 0.7514875049583498, Recall 0.7568807339449541, F1 0.7541744776022177
Dev step took 424.40398478507996 seconds
Epoch:  6
batch 0 with 32 instance and sentece length 16 loss 3.4913690090179443
batch 50 with 32 instance and sentece length 12 loss 3.2640769481658936
batch 100 with 32 instance and sentece length 17 loss 2.7167744636535645
batch 150 with 1 instance and sentece length 60 loss 10.60699462890625
batch 200 with 32 instance and sentece length 28 loss 5.046211242675781
batch 250 with 32 instance and sentece length 35 loss 8.499637603759766
batch 300 with 32 instance and sentece length 36 loss 7.528899192810059
batch 350 with 21 instance and sentece length 65 loss 12.976995468139648
batch 400 with 32 instance and sentece length 25 loss 3.8782448768615723
batch 450 with 1 instance and sentece length 198 loss 1.131591796875
batch 500 with 32 instance and sentece length 23 loss 4.683186054229736
5250 3824 5014 3829
Precision 0.7283809523809524, Recall 0.7636617471080973, F1 0.7456042254634955
Dev step took 427.1732294559479 seconds
Epoch:  7
batch 0 with 32 instance and sentece length 16 loss 3.919600009918213
batch 50 with 32 instance and sentece length 12 loss 2.88189697265625
batch 100 with 32 instance and sentece length 17 loss 2.4995851516723633
batch 150 with 1 instance and sentece length 60 loss 9.7147216796875
batch 200 with 32 instance and sentece length 28 loss 4.696895599365234
batch 250 with 32 instance and sentece length 35 loss 8.022673606872559
batch 300 with 32 instance and sentece length 36 loss 7.496114730834961
batch 350 with 21 instance and sentece length 65 loss 10.091758728027344
batch 400 with 32 instance and sentece length 25 loss 3.57883358001709
batch 450 with 1 instance and sentece length 198 loss 209.575439453125
batch 500 with 32 instance and sentece length 23 loss 4.680888652801514
5103 3821 5014 3827
Precision 0.7487752302567118, Recall 0.7632628639808536, F1 0.7559496402924187
Dev step took 427.3118300437927 seconds
Epoch:  8
batch 0 with 32 instance and sentece length 16 loss 3.52482533454895
batch 50 with 32 instance and sentece length 12 loss 3.16167950630188
batch 100 with 32 instance and sentece length 17 loss 2.5021820068359375
batch 150 with 1 instance and sentece length 60 loss 12.20062255859375
batch 200 with 32 instance and sentece length 28 loss 4.267910003662109
batch 250 with 32 instance and sentece length 35 loss 8.567618370056152
batch 300 with 32 instance and sentece length 36 loss 7.1137285232543945
batch 350 with 21 instance and sentece length 65 loss 10.1345853805542
batch 400 with 32 instance and sentece length 25 loss 3.1251139640808105
batch 450 with 1 instance and sentece length 198 loss 0.87939453125
batch 500 with 32 instance and sentece length 23 loss 4.156781196594238
4981 3799 5014 3805
Precision 0.7626982533627785, Recall 0.7588751495811727, F1 0.7607818985086429
Dev step took 428.0654981136322 seconds
Epoch:  9
batch 0 with 32 instance and sentece length 16 loss 3.9437317848205566
batch 50 with 32 instance and sentece length 12 loss 3.0538012981414795
batch 100 with 32 instance and sentece length 17 loss 2.239175796508789
batch 150 with 1 instance and sentece length 60 loss 6.4278564453125
batch 200 with 32 instance and sentece length 28 loss 4.2736711502075195
batch 250 with 32 instance and sentece length 35 loss 8.175654411315918
batch 300 with 32 instance and sentece length 36 loss 7.292889595031738
batch 350 with 21 instance and sentece length 65 loss 9.688374519348145
batch 400 with 32 instance and sentece length 25 loss 3.645817756652832
batch 450 with 1 instance and sentece length 198 loss 75.4111328125
batch 500 with 32 instance and sentece length 23 loss 4.671082496643066
5101 3798 5014 3804
Precision 0.7445598902176044, Recall 0.7586757080175509, F1 0.7515515233081186
Dev step took 427.0255992412567 seconds
Epoch:  10
batch 0 with 32 instance and sentece length 16 loss 3.6652379035949707
batch 50 with 32 instance and sentece length 12 loss 2.816722869873047
batch 100 with 32 instance and sentece length 17 loss 2.4745192527770996
batch 150 with 1 instance and sentece length 60 loss 6.76202392578125
batch 200 with 32 instance and sentece length 28 loss 4.64898681640625
batch 250 with 32 instance and sentece length 35 loss 7.4596662521362305
batch 300 with 32 instance and sentece length 36 loss 6.8010149002075195
batch 350 with 21 instance and sentece length 65 loss 10.608006477355957
batch 400 with 32 instance and sentece length 25 loss 3.8895435333251953
batch 450 with 1 instance and sentece length 198 loss 0.501220703125
batch 500 with 32 instance and sentece length 23 loss 4.801102161407471
5089 3808 5014 3814
Precision 0.7482806052269602, Recall 0.7606701236537694, F1 0.7544245012266394
Dev step took 426.9521062374115 seconds
Epoch:  11
batch 0 with 32 instance and sentece length 16 loss 3.7751235961914062
batch 50 with 32 instance and sentece length 12 loss 2.915480852127075
batch 100 with 32 instance and sentece length 17 loss 2.8308067321777344
batch 150 with 1 instance and sentece length 60 loss 6.95458984375
batch 200 with 32 instance and sentece length 28 loss 3.9693431854248047
batch 250 with 32 instance and sentece length 35 loss 8.378277778625488
batch 300 with 32 instance and sentece length 36 loss 6.827816009521484
batch 350 with 21 instance and sentece length 65 loss 10.069838523864746
batch 400 with 32 instance and sentece length 25 loss 3.8416061401367188
batch 450 with 1 instance and sentece length 198 loss 0.140380859375
batch 500 with 32 instance and sentece length 23 loss 4.7320170402526855
4954 3767 5014 3773
Precision 0.76039563988696, Recall 0.7524930195452733, F1 0.7564236899261773
Dev step took 424.7168560028076 seconds
Epoch:  12
batch 0 with 32 instance and sentece length 16 loss 4.5410614013671875
batch 50 with 32 instance and sentece length 12 loss 3.0821478366851807
batch 100 with 32 instance and sentece length 17 loss 2.5394949913024902
batch 150 with 1 instance and sentece length 60 loss 12.85906982421875
batch 200 with 32 instance and sentece length 28 loss 4.479831218719482
batch 250 with 32 instance and sentece length 35 loss 6.553382873535156
batch 300 with 32 instance and sentece length 36 loss 6.564929008483887
batch 350 with 21 instance and sentece length 65 loss 8.699430465698242
batch 400 with 32 instance and sentece length 25 loss 3.7041549682617188
batch 450 with 1 instance and sentece length 198 loss 0.066162109375
batch 500 with 32 instance and sentece length 23 loss 4.36357307434082
5048 3783 5014 3789
Precision 0.749405705229794, Recall 0.755684084563223, F1 0.7525317999810682
Dev step took 428.0272765159607 seconds
Adjust lr to  0.0005
Epoch:  13
batch 0 with 32 instance and sentece length 16 loss 3.20072603225708
batch 50 with 32 instance and sentece length 12 loss 2.1751410961151123
batch 100 with 32 instance and sentece length 17 loss 2.1681222915649414
batch 150 with 1 instance and sentece length 60 loss 8.37548828125
batch 200 with 32 instance and sentece length 28 loss 3.409791946411133
batch 250 with 32 instance and sentece length 35 loss 6.692264556884766
batch 300 with 32 instance and sentece length 36 loss 5.674688339233398
batch 350 with 21 instance and sentece length 65 loss 7.839314937591553
batch 400 with 32 instance and sentece length 25 loss 2.6043624877929688
batch 450 with 1 instance and sentece length 198 loss 0.007080078125
batch 500 with 32 instance and sentece length 23 loss 3.6698646545410156
5081 3816 5014 3822
Precision 0.7510332611690612, Recall 0.7622656561627443, F1 0.7566077726857409
Dev step took 425.0413930416107 seconds
Epoch:  14
batch 0 with 32 instance and sentece length 16 loss 3.667945623397827
batch 50 with 32 instance and sentece length 12 loss 2.6571710109710693
batch 100 with 32 instance and sentece length 17 loss 2.223240375518799
batch 150 with 1 instance and sentece length 60 loss 4.613037109375
batch 200 with 32 instance and sentece length 28 loss 3.3543291091918945
batch 250 with 32 instance and sentece length 35 loss 5.49796199798584
batch 300 with 32 instance and sentece length 36 loss 6.792667388916016
batch 350 with 21 instance and sentece length 65 loss 6.796872138977051
batch 400 with 32 instance and sentece length 25 loss 1.8226842880249023
batch 450 with 1 instance and sentece length 198 loss 0.01513671875
batch 500 with 32 instance and sentece length 23 loss 3.5431742668151855
5123 3838 5014 3844
Precision 0.7491704079640835, Recall 0.7666533705624252, F1 0.7578110681831443
Dev step took 427.2519931793213 seconds
Epoch:  15
batch 0 with 32 instance and sentece length 16 loss 2.9205527305603027
batch 50 with 32 instance and sentece length 12 loss 2.441267251968384
batch 100 with 32 instance and sentece length 17 loss 2.1479287147521973
batch 150 with 1 instance and sentece length 60 loss 4.1927490234375
batch 200 with 32 instance and sentece length 28 loss 2.88955020904541
batch 250 with 32 instance and sentece length 35 loss 5.7675275802612305
batch 300 with 32 instance and sentece length 36 loss 5.899411201477051
batch 350 with 21 instance and sentece length 65 loss 6.8359198570251465
batch 400 with 32 instance and sentece length 25 loss 1.9376587867736816
batch 450 with 1 instance and sentece length 198 loss 0.009765625
batch 500 with 32 instance and sentece length 23 loss 2.9136266708374023
5149 3854 5014 3860
Precision 0.7484948533695863, Recall 0.769844435580375, F1 0.7590195447364397
Dev step took 427.0658984184265 seconds
Epoch:  16
batch 0 with 32 instance and sentece length 16 loss 3.551365375518799
batch 50 with 32 instance and sentece length 12 loss 2.4267735481262207
batch 100 with 32 instance and sentece length 17 loss 2.1857242584228516
batch 150 with 1 instance and sentece length 60 loss 7.87274169921875
batch 200 with 32 instance and sentece length 28 loss 2.940352439880371
batch 250 with 32 instance and sentece length 35 loss 5.838342666625977
batch 300 with 32 instance and sentece length 36 loss 5.825372695922852
batch 350 with 21 instance and sentece length 65 loss 6.667899131774902
batch 400 with 32 instance and sentece length 25 loss 2.163764476776123
batch 450 with 1 instance and sentece length 198 loss 0.030029296875
batch 500 with 32 instance and sentece length 23 loss 3.000728130340576
5080 3829 5014 3835
Precision 0.753740157480315, Recall 0.7648583964898285, F1 0.7592585765516525
Dev step took 424.8317244052887 seconds
Adjust lr to  5e-05
Epoch:  17
batch 0 with 32 instance and sentece length 16 loss 3.064420223236084
batch 50 with 32 instance and sentece length 12 loss 2.0797672271728516
batch 100 with 32 instance and sentece length 17 loss 1.7247443199157715
batch 150 with 1 instance and sentece length 60 loss 3.28594970703125
batch 200 with 32 instance and sentece length 28 loss 3.516556739807129
batch 250 with 32 instance and sentece length 35 loss 5.786201477050781
batch 300 with 32 instance and sentece length 36 loss 5.0926313400268555
batch 350 with 21 instance and sentece length 65 loss 6.1456298828125
batch 400 with 32 instance and sentece length 25 loss 1.8682680130004883
batch 450 with 1 instance and sentece length 198 loss 0.03515625
batch 500 with 32 instance and sentece length 23 loss 2.6819100379943848
5019 3801 5014 3807
Precision 0.7573221757322176, Recall 0.7592740327084164, F1 0.7582968482018613
Dev step took 428.0084297657013 seconds
Epoch:  18
batch 0 with 32 instance and sentece length 16 loss 3.071497917175293
batch 50 with 32 instance and sentece length 12 loss 2.2785584926605225
batch 100 with 32 instance and sentece length 17 loss 1.6287803649902344
batch 150 with 1 instance and sentece length 60 loss 5.76495361328125
batch 200 with 32 instance and sentece length 28 loss 3.016219139099121
batch 250 with 32 instance and sentece length 35 loss 5.666339874267578
batch 300 with 32 instance and sentece length 36 loss 5.34001350402832
batch 350 with 21 instance and sentece length 65 loss 5.791806221008301
batch 400 with 32 instance and sentece length 25 loss 1.9682135581970215
batch 450 with 1 instance and sentece length 198 loss 0.015869140625
batch 500 with 32 instance and sentece length 23 loss 2.8456382751464844
5015 3802 5014 3808
Precision 0.7581256231306082, Recall 0.7594734742720383, F1 0.7587989501563246
Dev step took 426.6219871044159 seconds
Epoch:  19
batch 0 with 32 instance and sentece length 16 loss 3.1922507286071777
batch 50 with 32 instance and sentece length 12 loss 2.1628100872039795
batch 100 with 32 instance and sentece length 17 loss 1.9283547401428223
batch 150 with 1 instance and sentece length 60 loss 2.94537353515625
batch 200 with 32 instance and sentece length 28 loss 2.916536331176758
batch 250 with 32 instance and sentece length 35 loss 6.153505325317383
batch 300 with 32 instance and sentece length 36 loss 5.0463056564331055
batch 350 with 21 instance and sentece length 65 loss 5.711056709289551
batch 400 with 32 instance and sentece length 25 loss 2.223219871520996
batch 450 with 1 instance and sentece length 198 loss 0.390380859375
batch 500 with 32 instance and sentece length 23 loss 3.2013611793518066
5031 3810 5014 3816
Precision 0.7573047107930829, Recall 0.7610690067810132, F1 0.7591821926353225
Dev step took 424.9007396697998 seconds
Epoch:  20
batch 0 with 32 instance and sentece length 16 loss 3.5252015590667725
batch 50 with 32 instance and sentece length 12 loss 1.9531855583190918
batch 100 with 32 instance and sentece length 17 loss 1.7794303894042969
batch 150 with 1 instance and sentece length 60 loss 2.90838623046875
batch 200 with 32 instance and sentece length 28 loss 2.6259355545043945
batch 250 with 32 instance and sentece length 35 loss 5.442547798156738
batch 300 with 32 instance and sentece length 36 loss 4.854525566101074
batch 350 with 21 instance and sentece length 65 loss 5.657668113708496
batch 400 with 32 instance and sentece length 25 loss 2.172929286956787
batch 450 with 1 instance and sentece length 198 loss 0.009765625
batch 500 with 32 instance and sentece length 23 loss 2.479541301727295
5011 3797 5014 3803
Precision 0.7577329874276592, Recall 0.758476266453929, F1 0.7581044447549548
Dev step took 427.28153252601624 seconds

Training step took 23890.82668852806 seconds
Best dev acc 0.7607818985086429

Metric on test dataset of best model:
/home/cike/deng/src/sh_wang/model/SemiMention.py:69: UserWarning: RNN module weights are not part of single contiguous chunk of memory. This means they need to be compacted at every call, possibly greatly increasing memory usage. To compact weights again call flatten_parameters().
  lstm_out, (hid_states, cell_states) = self.rnn(word_cat)
5132 3934 5600 3937
Precision 0.7665627435697584, Recall 0.7030357142857143, F1 0.7334261724210138
Test step took 460.8337290287018 seconds
Dumping model to ./dumps/genia_model.pt_212

from overlapping-ner-em18.
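For readers of the log above: the four integers printed before each Precision/Recall line appear to be (spans predicted, predictions counted correct, gold spans, gold spans matched) — an inference from the numbers themselves, not something documented in the repo. Under that reading, the epoch-1 dev line can be reproduced with a few lines of Python:

```python
def prf1(n_pred, n_pred_correct, n_gold, n_gold_correct):
    # Precision over predicted spans, recall over gold spans,
    # F1 as the harmonic mean of the two.
    p = n_pred_correct / n_pred
    r = n_gold_correct / n_gold
    f1 = 2 * p * r / (p + r)
    return p, r, f1

# Epoch-1 dev counts from the log: "4929 3687 5014 3690"
p, r, f1 = prf1(4929, 3687, 5014, 3690)
# p ≈ 0.74802, r ≈ 0.73594, f1 ≈ 0.74193 — matching the logged line.
```

Note that the second and fourth counts can differ slightly (3687 vs. 3690 here), presumably because a prediction matched for precision and a gold span matched for recall are counted separately.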
