Paper Reading: Bilinear Graph Neural Network with Neighbor Interactions

venue: IJCAI 2020 If the aggregation function of previous GNN layers (e.g. GCN and GAT) is $\mathrm{AGG}(\{h_i\}) = \sum_{i \in \tilde{N}(v)} a_{vi}\, h_i W$, then the paper extends it with a bilinear aggregator: $\mathrm{BA}(\{h_i\}) = \frac{1}{b_v} \sum_{i \in \tilde{N}(v)} \sum_{j \in \tilde{N}(v),\, i<j} (h_i W) \odot (h_j W)$, where $b_v = \frac{1}{2}\,|\tilde{N}(v)|\,(|\tilde{N}(v)|-1)$ is the number of neighbor pairs. It sums up the elementwise product of every pair of neighbor nodes of a target node (self-interactions excluded). The experimental results show that BGAT (BGCN) outperforms vanilla GAT (GCN) by 1.5% (1.6%). A linear combination of AGG output and BA …
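The bilinear aggregator above can be sketched in numpy. Note that the pairwise sum $\sum_{i<j} z_i \odot z_j$ equals $\big((\sum_i z_i)^2 - \sum_i z_i^2\big)/2$ elementwise, which avoids an explicit double loop. The function and argument names here are illustrative, not from the paper:

```python
import numpy as np

def bilinear_agg(H, W):
    """Sketch of a bilinear aggregator: normalized sum of elementwise
    products of every pair of transformed neighbor embeddings,
    self-interactions excluded.
    H: (n, d) stacked neighbor embeddings; W: (d, d_out) weight matrix.
    """
    Z = H @ W                           # transform each neighbor: (n, d_out)
    n = Z.shape[0]
    s = Z.sum(axis=0)
    # sum_{i<j} z_i * z_j = ((sum z)^2 - sum z^2) / 2, applied elementwise
    pair_sum = (s * s - (Z * Z).sum(axis=0)) / 2.0
    b = n * (n - 1) / 2.0               # number of distinct neighbor pairs
    return pair_sum / b
```

A brute-force double loop over pairs gives the same result; the closed-form version is O(n·d) instead of O(n²·d).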

Paper Reading: Out-of-Vocabulary Embedding Imputation with Grounded Language Information by Graph Convolutional Networks

venue: ACL 2019 The paper proposes a GCN-based method to produce word embeddings for out-of-vocabulary (OOV) words. 1. Graph Construction To construct a knowledge graph, a vocabulary is built from the Wikipedia English dataset (3B tokens). Note that this vocabulary includes OOV words that are not in the vocabulary of pre-trained embeddings such as GloVe. For each node/word, they define the concatenation of the Wikipedia page summary and …
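Once such a graph is built, embeddings propagate over it via GCN layers. As a minimal sketch (the standard GCN propagation rule, not the paper's exact architecture), one layer applies a symmetrically normalized adjacency with self-loops to the node features:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One standard GCN propagation step: ReLU(D^-1/2 (A+I) D^-1/2 H W).
    A: (n, n) binary adjacency; H: (n, d) node features; W: (d, d_out).
    Illustrative sketch; the paper's exact layer may differ."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)
```

Stacking such layers lets an OOV node aggregate information from in-vocabulary neighbors that do have pre-trained embeddings.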

Paper Reading: Strategies for Pre-training Graph Neural Networks

venue: ICLR 2020 paper link: here This paper proposes strategies to pre-train a GNN at the node level and the graph level. 1. Node-Level Pre-training Node-level pre-training uses unlabeled data to capture domain knowledge in the graph. Two methods are proposed for node-level pre-training. 1.1 Context Prediction In this task, subgraphs are used to predict their surrounding graph structures. The goal is to let a pre-trained GNN …
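The context-prediction objective can be hedged as a binary classification with negative sampling: a center-subgraph embedding should score high against its own context embedding and low against a randomly sampled one. This is an assumption-level sketch of that idea (dot-product scores with a sigmoid cross-entropy), not the paper's exact training pipeline:

```python
import numpy as np

def context_prediction_loss(h_center, h_ctx_pos, h_ctx_neg):
    """Sketch of a negative-sampling context-prediction loss.
    h_center: embedding of a node's center subgraph;
    h_ctx_pos: embedding of its true surrounding context;
    h_ctx_neg: embedding of a randomly sampled (negative) context.
    All names are illustrative, not from the paper."""
    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))
    pos = sigmoid(h_center @ h_ctx_pos)   # want near 1
    neg = sigmoid(h_center @ h_ctx_neg)   # want near 0
    return -(np.log(pos) + np.log(1.0 - neg))
```

Minimizing this pushes center and true-context embeddings together while pushing randomly paired ones apart.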

Paper Reading: Neural IR Meets Graph Embedding: A Ranking Model for Product Search

This paper claims to be the first study of how to use click-graph features in neural models for retrieval. The graph embedding techniques proposed in this paper can be plugged into other scenarios where graph-structure information is available (ensemble). 1. Baseline First, we describe the basic IR model for product search without the proposed graph embedding techniques (the baseline). As shown in Figure 1, a CNN is used to …