Graph Representation Ensemble Learning
HHSE: heterogeneous graph neural network via higher-order semantic enhancement
- Regular Article
- Published: 22 January 2024
- Volume 106, pages 865–887 (2024)
- Hui Du 1,
- Cuntao Ma 1,
- Depeng Lu 1 &
- Jingrui Liu 1
Heterogeneous graph representation learning has strong expressive power when dealing with large-scale relational graph data; its purpose is to effectively represent the semantic information and the heterogeneous structure information of the nodes in a graph. Current methods typically use shallow models to embed semantic information over low-order neighbor nodes, which prevents higher-order semantic feature information from being fully retained. To address this issue, this paper proposes a heterogeneous graph network with higher-order semantic enhancement, called HHSE. Specifically, at the node feature level, the model uses the identity-mapping mechanism of residual attention to enhance the information representation of nodes in the hidden layers, and then applies two aggregation strategies to improve the retention of higher-order semantic information. At the semantic feature level, the model learns the semantic information of nodes across the various meta-path subgraphs. Extensive experiments on node classification and node clustering on three real-world datasets show that the proposed approach achieves practical improvements over state-of-the-art methods. Moreover, the method is applicable to large-scale heterogeneous graph representation learning.
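The two-level design sketched in the abstract (node-level residual attention inside each meta-path subgraph, then fusion across meta-paths at the semantic level) can be illustrated roughly as follows. This is a minimal sketch, not the authors' implementation: the function names are invented, plain dot-product attention stands in for the paper's attention, and uniform averaging stands in for HHSE's learned semantic-level weights.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def metapath_attention_layer(H, adj, W, res_weight=0.5):
    # Project node features, attend over neighbors within one
    # meta-path subgraph, and add an identity-mapped residual copy
    # of the input so deeper layers keep the raw feature signal.
    Z = H @ W
    scores = Z @ Z.T                          # dot-product attention logits
    scores = np.where(adj > 0, scores, -1e9)  # mask non-neighbors
    alpha = softmax(scores, axis=1)
    return alpha @ Z + res_weight * H         # aggregate + residual identity map

def semantic_fusion(Z_per_metapath):
    # HHSE learns semantic-level attention weights over meta-paths;
    # uniform averaging is a simplified stand-in here.
    return np.mean(np.stack(Z_per_metapath), axis=0)
```

Stacking such layers is where the residual term matters: without it, repeated neighborhood averaging washes out the original node features, which is exactly the low-order limitation the abstract describes.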
Availability of data and materials
The three datasets used in this paper are available from DBLP (https://dblp.uni-trier.de), ACM (http://dl.acm.org), and IMDB (https://www.imdb.com), and are referenced in the text where relevant.
Acknowledgements
We thank the High Performance Computing Research Department of the Gansu Provincial Computing Center, China, for providing computing services to support this work.
This research was supported by the National Natural Science Foundation of China [No. 61962054].
Author information
Authors and Affiliations
School of Computer Science and Engineering, Northwest Normal University, No. 967 Anning East Road, Lanzhou, 730070, Gansu, China
Hui Du, Cuntao Ma, Depeng Lu & Jingrui Liu
Corresponding author
Correspondence to Cuntao Ma .
Ethics declarations
Conflict of interest
The authors declare that there is no conflict of interest in the publication of this article.
Ethics approval
The work meets the applicable ethical requirements.
Code availability
The code is available but has not yet been uploaded to an online platform.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Du, H., Ma, C., Lu, D. et al. HHSE: heterogeneous graph neural network via higher-order semantic enhancement. Computing 106, 865–887 (2024). https://doi.org/10.1007/s00607-023-01246-x
Received: 19 June 2023
Accepted: 20 December 2023
Published: 22 January 2024
Issue Date: March 2024
DOI: https://doi.org/10.1007/s00607-023-01246-x
Keywords
- Graph neural networks
- Deep heterogeneous information networks
- Graph representation learning
- Higher-order semantic information embedding
Graph Representation Learning via Aggregation Enhancement. Graph neural networks (GNNs) have become a powerful tool for processing graph-structured data but still face challenges in effectively aggregating and propagating information between layers, which limits their performance. We tackle this problem with the kernel regression (KR) approach ...
Graph Representation Learning via Aggregation Enhancement. [Figure 1: (a) a schematic representation of the GIRL algorithm, where L_SSL is the sum of KR losses between G_in and G_out of each GNN layer g_l; (b) a schematic representation of KR.]
Especially in the Subgraph-Agg stage, to obtain meaningful graph representations from the view of the subgraphs, we introduce an auto-regressive method, a universal self-supervised framework for the graph generation. The overall pipeline of proposed method can be seen in Fig. 2. 3.1. Preliminary. Unsupervised Learning on Graphs.
This work highlights the potential of KR to advance the field of graph representation learning and enhance the performance of GNNs, using KR loss as the primary loss in self-supervised settings or as a regularization term in supervised settings. Graph neural networks (GNNs) have become a powerful tool for processing graph-structured data but still face challenges in effectively aggregating and ...
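The KR idea described in these snippets (scoring how well each GNN layer's output representations can be predicted from its inputs) can be sketched with a linear kernel and a ridge smoother. This is an assumption-laden stand-in, not the paper's exact loss; the kernel choice and normalization are guesses for illustration only.

```python
import numpy as np

def kr_loss(X, Y, lam=1e-2):
    # How poorly layer output Y is predicted from layer input X by
    # kernel ridge regression with a linear kernel. A small value
    # means the layer preserved the information needed to recover Y.
    K = X @ X.T                                 # linear kernel on inputs
    n = K.shape[0]
    P = K @ np.linalg.inv(K + lam * np.eye(n))  # ridge-regression smoother
    R = Y - P @ Y                               # prediction residual
    return float((R ** 2).mean())
```

Used self-supervised, one would sum this loss over layers (the L_SSL of the figure caption above uses it between each layer's inputs and outputs); used supervised, it becomes a regularization term added to the task loss.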
In the era of statistical learning, such symbolic representations are widely used in graph-based NLP, e.g. TextRank, where word and sentence graphs can be built for keyword extraction and extractive document summarization, respectively. Although convenient and straightforward, the adjacency-matrix representation suffers from the scalability problem.
Recent graph neural networks for graph representation learning depend on a neighborhood aggregation process. Several works focus on simplifying the neighborhood aggregation process and model structures. However, as the depth of the models increases, the simplified models will encounter oversmoothing, resulting in a decrease in model performance. Several works leverage sophisticated learnable ...
And GRACE [33] introduces a framework for unsupervised graph representation learning by generating two graph views and maximizing the agreement of node representations in these two views. There are also pre-train methods [34] of using contrastive learning, which can achieve excellent results on various graph-level downstream tasks with few labels.
We study the graph representation learning problem that has emerged with the advent of numerous graph analysis tasks in the recent past. The task of representation learning from graphs of heterogeneous object attributes and complex topological structures is important yet challenging in practice. We propose an Attribute-interactive Neighborhood-aggregative Graph learning scheme (ANGraph), which ...
Graph representation learning (GRL) is a powerful tool for graph analysis, which has gained massive attention from both academia and industry due to its superior performance in various real-world applications. However, the majority of existing works for GRL are dedicated to node-based tasks and thus focus on producing node representations.
While Graph Neural Network (GNN) has shown superiority in learning node representations of homogeneous graphs, leveraging GNN on heterogeneous graphs remains a challenging problem. The dominating reason is that GNN learns node representations by aggregating neighbors' information regardless of node types. Some works have been proposed to alleviate this issue by exploiting relations or meta-paths to ...
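The relation-exploiting fix this snippet alludes to can be sketched as one R-GCN-style message-passing step, where each relation type gets its own projection matrix instead of collapsing all neighbors into a single homogeneous aggregation. Names and shapes below are illustrative only.

```python
import numpy as np

def relation_aware_aggregate(H, typed_edges, W_by_rel):
    # One message-passing step: each edge carries a relation label,
    # and messages are projected with the relation's own matrix
    # before being mean-aggregated at the destination node.
    d_out = next(iter(W_by_rel.values())).shape[1]
    out = np.zeros((H.shape[0], d_out))
    counts = np.zeros((H.shape[0], 1))
    for src, dst, rel in typed_edges:
        out[dst] += H[src] @ W_by_rel[rel]
        counts[dst] += 1
    return out / np.maximum(counts, 1)   # mean over incoming messages
```

Meta-path approaches take this further by composing several such typed hops (e.g. author-paper-author) into one logical neighborhood before aggregating.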
In recent years, graph neural networks (GNNs) have been widely used in many domains due to their powerful capability in representation learning on graph-structured data. While a majority of extant studies focus on mitigating the over-smoothing problem, recent works also reveal the limitation of GNN from a new over-correlation perspective which ...
In this paper, we propose a novel end-to-end MVC method called Multi-view contrAstive clustering with Integrated Graph Aggregation and confidence enhance (MAGA). MAGA simultaneously performs multi-view information aggregation and contrast, as illustrated in Fig. 1. Here's a detailed overview of our approach: we begin by learning a latent representation for each view using view-specific ...
Unsupervised learning on graphs. In the unsupervised case, given a set of graphs G = {G_1, G_2, ...} without labels (|G| is the number of graphs in a batch), we aim to learn a d-dimensional representation for every graph G_i. We denote the number of nodes in G_i as |G_i| and the matrix of representations of all graphs as X_G ∈ R^{n×d}. Semi-supervised learning on ...
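A minimal way to obtain the per-graph matrix X_G described in this setting is a mean-pooling readout over each graph's node features. This sketch assumes the node features (or learned node embeddings) are already computed; the function name is illustrative.

```python
import numpy as np

def graph_representations(node_feature_matrices):
    # Mean-pool each graph's node-feature matrix (n_i x d) into a
    # single d-dimensional vector, then stack them into the
    # |G| x d matrix of graph-level representations.
    return np.stack([F.mean(axis=0) for F in node_feature_matrices])
```

Real methods replace the plain mean with learned readouts (sum, attention, set pooling), but the contract is the same: graphs of varying size in, fixed-width vectors out.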
We propose a representation learning approach that leads to a robust model able to deal with the sparsity of the data. From learned continuous projections of the users, our approach is able to ...
Representation learning on graphs has been gaining attention due to its wide applicability in predicting missing links and classifying and recommending nodes. Most embedding methods aim to preserve specific properties of the original graph in the low dimensional space. However, real-world graphs have a combination of several features that are difficult to characterize and capture by a single ...
Abstract: Knowledge graph (KG) helps to improve the accuracy, diversity, and interpretability of a recommender systems. KG has been applied in recommendation systems, exploiting graph neural networks (GNNs), but most existing recommendation models based on GNNs ignore the influence of node types and the loss of information during aggregation.
1. Introduction. Graphs are powerful data representations that can model numerous complex systems across various areas such as social media [1], knowledge graphs [2], e-commerce transactions [3], and many other research areas [4].Because of the expressive power of graphs, a massive amount of graph-structured data has been accumulated, and graph machine learning (GML) has attracted considerable ...
Bootstrapped Graph Latents (BGRL) is introduced - a graph representation learning method that learns by predicting alternative augmentations of the input and is thus scalable by design, achieving state-of-the-art performance and improving over supervised baselines where representations are shaped only through label information. Expand.
Graph Neural Aggregation-diffusion with Metastability. Kaiyuan Cui, Xinyan Wang, Zicheng Zhang, ... challenge in graph representation learning. Specifically, over-smoothing refers to the phenomenon that ... [28] Q. Li, T. Lin, and Z. Shen, "Deep learning via dynamical systems: An approximation perspective," Journal of the European ...
Due to the success observed in deep neural networks with contrastive learning, there has been a notable surge in research interest in graph contrastive learning, primarily attributed to its superior performance in graphs with limited labeled data. Within contrastive learning, the selection of a "view" dictates the information captured by the representation, thereby influencing the model's ...
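A GRACE-style pair of "views", as described in the snippets above, can be generated by random edge dropping plus feature masking. The sketch below is a generic illustration under those assumptions, not any specific library's API; drop probabilities and masking granularity are choices the practitioner tunes.

```python
import numpy as np

def make_view(adj, feats, p_edge=0.2, p_feat=0.2, seed=None):
    # One stochastic augmentation: drop edges at random (keeping the
    # adjacency symmetric) and mask whole feature columns. Two such
    # views are fed to a shared encoder, and matching node embeddings
    # across views are pulled together by the contrastive loss.
    rng = np.random.default_rng(seed)
    keep = rng.random(adj.shape) > p_edge
    adj_v = adj * (keep & keep.T)              # symmetric edge dropping
    col_keep = rng.random(feats.shape[1]) > p_feat
    return adj_v, feats * col_keep             # column-wise feature masking
```

The view choice is exactly the lever the last snippet discusses: what the augmentation destroys is what the learned representation becomes invariant to.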