cornell-zhang / graphzoom
GraphZoom: A Multi-level Spectral Approach for Accurate and Scalable Graph Embedding
License: BSD 3-Clause "New" or "Revised" License
Hi, I have directed graph data as input, and I was wondering whether GraphZoom can handle directed graphs. If so, which part of your code should I rewrite? Thank you.
Dear Zhang,
I am running the experiments from your code. I obtain the embedding representations and then use L2 distance as the kernel for node classification and link prediction, but the performance is poor.
Could you recommend a similarity measure for the embedding space, please?
Thanks
Wei
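Not an answer from the authors, but for other readers: raw L2 distance often underperforms on learned embeddings, and cosine similarity is a common alternative. A minimal sketch (the embedding array here is synthetic, not GraphZoom output):

```python
import numpy as np

# synthetic embeddings: 4 nodes, 8 dimensions (stand-in for real embeddings)
rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))

# L2-normalize each row; cosine similarity then reduces to a dot product
norm = emb / np.linalg.norm(emb, axis=1, keepdims=True)
cos_sim = norm @ norm.T

print(cos_sim.shape)  # (4, 4) pairwise similarity matrix
```

Each node has similarity 1.0 with itself, and off-diagonal entries lie in [-1, 1] regardless of embedding magnitude, which is why cosine is often more robust than raw L2 distance here.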
Hi, thanks for your great work. I have a question about the graph fusion coefficient beta: where can I find it in your code?
Thank you in advance for your help.
Hi, I have a question about the pseudocode of the spectral_coarsening algorithm in your paper (p. 15).
Which part of the code corresponds to lines 8 to 12 of the pseudocode? I searched your code but could not find the corresponding part. Also, could you explain what these lines do? Thank you in advance.
Hi, I am trying to run the experiment on my Mac, and I have downloaded the corresponding MATLAB Runtime for macOS.
However, the coarsening step does not seem to execute properly. I suspect the problem is the LD_LIBRARY_PATH setting in the run_coarsening.sh file.
My question is: how should I modify the files to run on a Mac?
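For readers hitting the same issue: macOS resolves dynamically loaded libraries via DYLD_LIBRARY_PATH rather than LD_LIBRARY_PATH, so a script written for Linux typically needs that variable swapped. A hypothetical adjustment (the runtime path and version directory below are illustrative, not taken from the repo; substitute your actual MATLAB Runtime install location):

```shell
# macOS uses DYLD_LIBRARY_PATH instead of LD_LIBRARY_PATH.
# Replace the path below with your actual MATLAB Runtime location.
MCR_ROOT=/Applications/MATLAB/MATLAB_Runtime/v96
export DYLD_LIBRARY_PATH="$MCR_ROOT/runtime/maci64:$MCR_ROOT/bin/maci64:$DYLD_LIBRARY_PATH"
```

This is only the environment-variable side; if the coarsening binary itself was compiled for Linux, it would also need to be recompiled for macOS.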
About the graph coarsening step, in the paper, it is said that
In this work, a similarity-aware spectral sparsification tool “GRASS” (Feng, 2018) has been adopted for achieving a desired graph sparsity at the coarsest level.
which refers to this paper:
[1] Zhuo Feng. Similarity-aware spectral sparsification by edge filtering. Design Automation Conference (DAC), pp. 1–6, 2018.
The README of this repo says that the LAMG algorithm is used, but I have not found much information about LAMG in paper [1] above.
Where can I find more information about the LAMG algorithm used in the program?
Dear Dr. Zhang,
I am interested in your algorithm and am running the code in my experiments. For a new dataset, how can I generate a new Mapping.mtx in the reduction_results directory? graphzoom.py only reads it via mapping_path = "{}Mapping.mtx".format(reduce_results), and I cannot find where it is generated in graphzoom.py or utils.py. Without Mapping.mtx, the code cannot run successfully.
I would appreciate it very much if you could help me.
Thanks
Wei
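Not from the authors, but in case it helps: Mapping.mtx is a Matrix Market file, and such files can be produced directly with scipy, so a mapping for a new dataset could in principle be written out by hand. A toy sketch (the shapes and cluster assignment are made up):

```python
import os
import tempfile

import numpy as np
from scipy.io import mmread, mmwrite
from scipy.sparse import csr_matrix

# toy mapping: 5 fine nodes assigned to 3 coarse nodes
fine_to_coarse = [0, 0, 1, 2, 2]          # made-up cluster assignment
M = csr_matrix((np.ones(5), (range(5), fine_to_coarse)), shape=(5, 3))

# write and read back in Matrix Market format, like reduction_results/Mapping.mtx
path = os.path.join(tempfile.mkdtemp(), "Mapping.mtx")
mmwrite(path, M)
M2 = mmread(path).tocsr()
print(M2.shape)  # (5, 3)
```

How GraphZoom itself derives the assignment is a separate question (that is the coarsening step), but the file format itself is just this.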
Hello,
Thank you for the code. I installed the dependencies from the requirements file, but somehow I am not able to run the code. Can you confirm that gensim==3.4.0 and numpy==1.16.4 are the intended versions?
When I used gensim==3.4.0, I got an error. However, if I update gensim to the latest version 4.2.0, numpy is also updated to 1.21.6 and I get an error loading the data (dataset/citeseer/citeseer-feats.npy):
File "graphzoom.py", line 105, in main
feature = np.load(feature_path)
File "/home/chen/anaconda3/envs/graphzoom/lib/python3.7/site-packages/numpy/lib/npyio.py", line 441, in load
pickle_kwargs=pickle_kwargs)
File "/home/chen/anaconda3/envs/graphzoom/lib/python3.7/site-packages/numpy/lib/format.py", line 787, in read_array
array.shape = shape
ValueError: cannot reshape array of size 4082800 into shape (3327,3703)
Could you clarify the intended versions of numpy and gensim? Thank you!
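For anyone else trying to reproduce the environment, a fresh environment pinned to exactly the versions named in this issue could be created like this (assuming conda and Python 3.7, which the traceback paths suggest; this is a sketch, not the authors' official setup):

```shell
# pin the exact versions mentioned in the issue
conda create -n graphzoom python=3.7 -y
conda activate graphzoom
pip install gensim==3.4.0 numpy==1.16.4
```

Installing numpy explicitly at the same time prevents pip from pulling in a newer numpy as a transitive dependency of gensim.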
Hi, something is unclear to me. In the paper, you say you train a two-layer GraphSAGE model for one epoch on the original graph and then send the embeddings for classification?
Hi, I am trying to run this program on my Windows PC, but it shows AttributeError: Can't pickle local object 'DeepWalk_Original.generate_walks.<locals>.rnd_walk_workers' in PyCharm.
What should I do to fix this error? Or should I run it on Linux?
Thank you!
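For context on why this happens only on Windows: Windows uses spawn-based multiprocessing, which must pickle the worker function to send it to child processes, and a function defined inside another function cannot be pickled by reference. A minimal reproduction (function names here are illustrative, not the repo's):

```python
import pickle

def top_level_worker(seed):
    # module-level functions pickle fine: they are stored by qualified name
    return seed * 2

def make_local_worker():
    def rnd_walk_worker(seed):  # nested function, like the one in the error
        return seed * 2
    return rnd_walk_worker

pickle.dumps(top_level_worker)  # succeeds

try:
    pickle.dumps(make_local_worker())
except (AttributeError, pickle.PicklingError) as e:
    print("cannot pickle:", e)  # Can't pickle local object ...
```

The usual fix is to move the worker to module level (or make it a method of a picklable class); on Linux, fork-based multiprocessing avoids the pickling step entirely, which is why the same code runs there.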
Hi, thanks for your great work.
I have a question about the projection stored in graphzoom/reduction_results/Projection_1.mtx.
It seems that the 2708 nodes are projected onto 1169 nodes, but there is no explanation of what the number 1169 indicates. Could you explain it, and how you created this projection?
Thank you in advance.
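Not an answer from the authors, but structurally a projection matrix of shape (2708, 1169) assigns each of the 2708 original nodes to one of 1169 coarse aggregates, so 1169 is simply the node count of the coarsened graph. A toy illustration with scipy (the cluster assignment here is made up):

```python
import numpy as np
from scipy.sparse import csr_matrix

# toy projection: 4 fine nodes collapsed into 2 coarse nodes
fine_to_coarse = [0, 0, 1, 1]             # made-up cluster assignment
P = csr_matrix((np.ones(4), (range(4), fine_to_coarse)), shape=(4, 2))
print(P.shape)  # (4, 2): rows = fine nodes, cols = coarse nodes

# a coarse adjacency follows as P.T @ A @ P for fine adjacency A (path graph)
A = csr_matrix(np.array([[0, 1, 0, 0],
                         [1, 0, 1, 0],
                         [0, 1, 0, 1],
                         [0, 0, 1, 0]]))
A_coarse = (P.T @ A @ P).toarray()
print(A_coarse)  # aggregated edge weights between the two clusters
```

In the same way, a (2708, 1169) projection aggregates Cora's 2708-node graph into a 1169-node graph for the next coarsening level.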
Hi, I read your scripts, and it seems the coarsening level is controlled by reduction_ratio in the bash script. What is the relationship between them, and how can I determine the coarsening level from that variable?
When I try reduction_ratio=3 or 4 on Cora, the final coarsening level is the same (level=2) and the size of the coarsened graph remains the same, so I am curious.
Hi, thanks for your great work;
I have been reading your code. Regarding the code in mat_coarsen, I wonder whether I can modify it and compile the binary myself?
Could you provide instructions for this?
thanks ;)