
Graph Clustering Based on Auto-Encoder and Contrastive Loss

Journal of Nanjing Normal University (Natural Science Edition) [ISSN:1001-4616/CN:32-1239/N]

Issue:
2025, No. 1
Page:
75-84
Research Field:
Computer Science and Technology
Publishing date:

Info

Title:
Graph Clustering Based on Auto-Encoder and Contrastive Loss
Author(s):
Wang Jinghong(1,2,3), Wang Hui(1), Yuan Chuo(4)
(1.College of Computer and Cyber Security,Hebei Normal University,Shijiazhuang 050024,China)
(2.Hebei Provincial Key Laboratory of Network and Information Security,Shijiazhuang 050024,China)
(3.Hebei Provincial Engineering Research Center for Supply Chain Big Data Analytics and Data Security,Shijiazhuang 050024,China)
(4.Hebei Institute of Engineering Technology,Shijiazhuang 050020,China)
Keywords:
graph clustering; Auto-Encoder; influence contrastive loss; graph embedding; self-supervised clustering
CLC Number:
TP391
DOI:
10.3969/j.issn.1001-4616.2025.01.010
Abstract:
Graph clustering, which discovers communities or groups from the intrinsic relationships in graph data, is an important research problem in data analysis. In recent years, Auto-Encoder based methods have extracted node attribute representations effectively, but they fail to incorporate structural information. With the widespread adoption of graph neural networks, methods that fuse structural information through semi-supervised graph convolutional networks and graph Auto-Encoder models have outperformed traditional clustering methods; however, labeling data and performing convolution operations are expensive. This paper proposes a graph clustering method based on an Auto-Encoder and a contrastive loss (GC-AECL). First, the model builds the Auto-Encoder from a simple multilayer perceptron and pretrains it to learn node attribute representations. It then incorporates an influence contrastive loss to enrich the embeddings with structural information, and iteratively optimizes the embedded representation and a self-supervised clustering task simultaneously. Finally, the method is compared with benchmark models on multiple citation network datasets. The experimental results show improved clustering performance, and parameter sensitivity analysis and variant studies verify the effectiveness of the influence contrastive loss and the self-supervised clustering.
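
To make the pipeline in the abstract concrete, below is a minimal PyTorch sketch of the three ingredients it describes: an MLP-based Auto-Encoder, a contrastive loss over graph neighbors, and a DEC-style self-supervised clustering objective optimized jointly. The exact form of the paper's influence contrastive loss is not given here, so the neighbor-as-positive InfoNCE term, the loss weights `lam` and `gamma`, and all module names are assumptions made for illustration.

```python
# Minimal sketch of a GC-AECL-style pipeline; names and loss forms are
# illustrative assumptions, not the paper's exact formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPAutoEncoder(nn.Module):
    """Plain multilayer-perceptron Auto-Encoder over node attributes."""
    def __init__(self, in_dim, hid_dim=256, emb_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hid_dim), nn.ReLU(),
            nn.Linear(hid_dim, emb_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(emb_dim, hid_dim), nn.ReLU(),
            nn.Linear(hid_dim, in_dim),
        )

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)

def influence_contrastive_loss(z, adj, tau=0.5):
    """InfoNCE-style loss pulling graph neighbors together.

    `adj` is a dense {0,1} adjacency matrix; neighbors act as positives
    and all other nodes as negatives (an assumption of this sketch).
    """
    z = F.normalize(z, dim=1)
    sim = torch.exp(z @ z.t() / tau)            # pairwise similarities
    sim = sim - torch.diag(torch.diag(sim))     # drop self-similarity
    pos = (sim * adj).sum(dim=1)                # positive (neighbor) mass
    return -torch.log(pos / sim.sum(dim=1) + 1e-12).mean()

def soft_assign(z, centers, alpha=1.0):
    """Student's t-kernel soft cluster assignments, as in DEC."""
    d2 = torch.cdist(z, centers) ** 2
    q = (1.0 + d2 / alpha) ** (-(alpha + 1) / 2)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    """Sharpened DEC-style target distribution P from soft assignments Q."""
    w = q ** 2 / q.sum(dim=0)
    return (w.t() / w.sum(dim=1)).t()

def train_step(model, centers, x, adj, opt, lam=1.0, gamma=0.1):
    """One joint step: reconstruction + contrastive + clustering terms.

    `centers` is a learnable (n_clusters, emb_dim) tensor, initialized
    e.g. by k-means on pretrained embeddings and registered with `opt`.
    """
    opt.zero_grad()
    z, x_hat = model(x)
    q = soft_assign(z, centers)
    p = target_distribution(q).detach()         # fixed target this step
    loss = (F.mse_loss(x_hat, x)
            + lam * influence_contrastive_loss(z, adj)
            + gamma * F.kl_div(q.log(), p, reduction='batchmean'))
    loss.backward()
    opt.step()
    return loss.item()
```

Per the abstract, the Auto-Encoder would first be pretrained with the reconstruction loss alone before the joint step above; treating adjacency as the positive set is only one plausible reading of the "influence" signal.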

Last Update: 2025-02-15