Comments (5)
Hi! If you add a sigmoid to the last layer, the output will be restricted to (0, 1), which can be interpreted as link probabilities. If you want to map them to link existence, you can set a threshold such as 0.5 and classify all links with probability > 0.5 as positive links. In your example, all links apparently have similar (near 0.5) probabilities, which might suggest the model isn't well trained. You may increase train_percent to something larger.
If you only want to evaluate ranking metrics such as AUC/Hits/MRR, you don't need the sigmoid. The repo supports directly outputting the ranking metric numbers.
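As a minimal sketch of the thresholding step described above (plain Python; the raw scores are made-up values standing in for the model's final-layer outputs):

```python
import math

def sigmoid(x):
    # squash a raw link score into a probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# hypothetical raw scores from the final layer (before any sigmoid)
raw_scores = [2.3, -0.7, 0.1]
probs = [sigmoid(s) for s in raw_scores]

# classify links with probability > 0.5 as positive (existing) links
predicted = [p > 0.5 for p in probs]
```

For ranking metrics such as AUC/Hits/MRR, the raw scores can be used directly, since sigmoid is monotonic and does not change the ranking.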
from seal_ogb.
- I reshape it into an nk * 1 vector and then use a 1D convolution with kernel size k. This is equivalent to applying an MLP to each node's final feature vector to get a new representation for the subsequent 1D convolutions.
- This is answered in 1.
- Given two sequences of node representations, they might be different due to using different node orders, but after sorting they become the same. E.g., [1,2,3] and [2,1,3] are different, but after sorting both become [1,2,3]. Sorting makes isomorphic graphs have the same representation no matter which initial node ordering you use.
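A toy check of the reshape-then-convolve equivalence described in the first answer (all numbers are made up; the stride-k setting follows the standard DGCNN setup, where the first 1D conv steps over one node's k channels at a time):

```python
# flatten an n x k matrix into an nk vector; convolving it with kernel
# size k and stride k equals applying one shared linear map per node row
n, k = 3, 4
matrix = [[float(i * k + j) for j in range(k)] for i in range(n)]  # n x k
weights = [0.5, -1.0, 2.0, 0.25]  # a single conv filter of length k

flat = [v for row in matrix for v in row]  # nk x 1 vector

# 1D convolution with kernel size k and stride k: one output per node
conv_out = [sum(w * flat[i * k + j] for j, w in enumerate(weights))
            for i in range(n)]

# equivalent view: dot each node's row with the same weights (shared MLP)
mlp_out = [sum(w * x for w, x in zip(weights, row)) for row in matrix]
```

The two outputs are identical, which is why the large (size-k) first filter is not really a "big" filter: it reads exactly one node's feature vector per step.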
from seal_ogb.
Thanks, Mr. M. Zhang, for the response. I was just running a first training pass and using that model to predict, to get data for understanding. After reading your paper and code, I noticed that:
- When using the DGCNN model, after SortPooling I get an n x k matrix, but you then reshape the matrix into an nk * 1 vector. What is the main point of increasing the size like that?
- I also noticed that the first convolution filter you use is really large (size k). Why use such a big filter?
- After reading your paper, I'm a little confused about the SortPooling layer "to achieve an isomorphism invariant node ordering". Can you describe this in more detail?
Sorry if those questions annoy you. I'm currently studying this aspect and want to understand it deeply.
While waiting for your reply, I wish you good health, and many thanks for your support.
from seal_ogb.
Thank you very much.
I also have another question. In #2 you said that using node embeddings would make the GNN not inductive. But as far as I know, a GNN already embeds the nodes into a matrix E, so why does adding a node embedding algorithm make it not inductive? Please correct me if I'm wrong.
Another thing: when you said [1,2,3] and [2,1,3] in your example, is that the feature of the node itself, or is this representation for the structure surrounding those nodes, so that any pair with the same surrounding structure is more likely to "have a link"? I didn't quite get your idea about this.
I really appreciate your help in answering those questions for me.
Again, thank you very much.
from seal_ogb.
A GNN learns node embeddings in an inductive way, but traditional network embedding methods such as node2vec/DeepWalk embed nodes in a transductive way (they do not generalize to unseen nodes).
[1,2,3] and [2,1,3] are node representations before SortPooling (i.e., the graph structure has already been absorbed into the node representations through graph convolution). The two graphs are isomorphic, so their node representations are the same up to a permutation of node ordering. After sorting, [1,2,3] and [2,1,3] become the same, meaning that SortPooling maps isomorphic graphs to the same representation.
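The invariance described above can be sketched as follows (made-up feature values; each tuple is one node's representation, with the last graph-convolution channel as the sort key, as in SortPooling):

```python
# two orderings of the same nodes' final representations
nodes_order1 = [(0.2, 1.0), (0.9, 3.0), (0.5, 2.0)]
nodes_order2 = [(0.5, 2.0), (0.2, 1.0), (0.9, 3.0)]  # same nodes, permuted

def sort_pool(node_reps):
    # SortPooling: sort node rows by the last channel so the output no
    # longer depends on the initial node ordering
    return sorted(node_reps, key=lambda r: r[-1])

canonical1 = sort_pool(nodes_order1)
canonical2 = sort_pool(nodes_order2)
# both permutations map to the same canonical sequence
```

Since both permutations sort to the same sequence, any downstream layer sees identical input for isomorphic graphs.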
from seal_ogb.
Related Issues (20)
- about SEALDynamicDataset HOT 1
- SEAL - Utilizing multiple edge features HOT 6
- About Planetoid edge_index utilization HOT 6
- Batch in the graph mean ? HOT 3
- Problem with z_score? HOT 6
- IndexError: too many indices for tensor of dimension 1 HOT 4
- Multi-relational link prediction in a heterogeneous graph HOT 4
- Can I learn the embedding of individual nodes? HOT 1
- Test with custom dataset HOT 1
- Issues with heuristic methods, i.e., CN and AA HOT 2
- How to reduce training time cost? HOT 5
- Requesting command line arguments needed for reproducing OGBL leaderboard results HOT 3
- Question regarding `use_edge_weight` parameter for training `ogbl-ddi` and `ogbl-ppa` HOT 2
- A clarification on SEAL's underlying GNN engine HOT 6
- Regarding the question of changing the output link score to link probability. HOT 4
- Tensor size mismatch error when dynamic options are used HOT 5
- can you teach me how to predict link in graphs? HOT 2
- Query regarding `--use_valedges_as_input` HOT 2
- Regarding SEAL's performance on Cora, Citeseer and Pubmed HOT 4