 Wang Jinghong,Wang Hui,Yuan Chuo.Graph Clustering Based on Auto-Encoder and Contrastive Loss[J].Journal of Nanjing Normal University(Natural Science Edition),2025,48(01):75-84.[doi:10.3969/j.issn.1001-4616.2025.01.010]

Graph Clustering Based on Auto-Encoder and Contrastive Loss

Journal of Nanjing Normal University (Natural Science Edition) [ISSN:1001-4616/CN:32-1239/N]

Volume: 48
Issue: 2025, No. 01
Pages: 75-84
Section: Computer Science and Technology
Publication date: 2025-02-15

Article Info

Title:
Graph Clustering Based on Auto-Encoder and Contrastive Loss
Article ID:
1001-4616(2025)01-0075-10
Author(s):
Wang Jinghong1,2,3, Wang Hui1, Yuan Chuo4
(1.College of Computer and Cyber Security,Hebei Normal University,Shijiazhuang 050024,China)
(2.Hebei Provincial Key Laboratory of Network and Information Security,Shijiazhuang 050024,China)
(3.Hebei Provincial Engineering Research Center for Supply Chain Big Data Analytics and Data Security,Shijiazhuang 050024,China)
(4.Hebei Institute of Engineering Technology,Shijiazhuang 050020,China)
Keywords:
graph clustering; Auto-Encoder; influence contrastive loss; graph embedding; self-supervised clustering
CLC number:
TP391
DOI:
10.3969/j.issn.1001-4616.2025.01.010
Document code:
A
Abstract:
Graph clustering, which finds communities or groups from the intrinsic relationships in graph data, is an important research problem in data analysis. In recent years, Auto-Encoder-based methods have learned effective node attribute representations, but they do not incorporate structural information. With the widespread adoption of graph neural networks, models based on semi-supervised graph convolutional networks and graph Auto-Encoders can fuse structural information and outperform traditional clustering methods; however, labeled data and convolution operations are expensive. This paper therefore proposes a graph clustering method based on an Auto-Encoder and a contrastive loss (GC-AECL). First, the method builds the Auto-Encoder from a simple multilayer perceptron and pretrains it to learn node attribute representations. Second, it learns the graph embedding with an influence contrastive loss, which injects rich structural information, and then iteratively optimizes the embedding and the self-supervised clustering task at the same time. Finally, the method is compared with benchmark models on multiple citation network datasets. The experimental results show that clustering performance is effectively improved, and parameter sensitivity analysis and variant experiments verify the effectiveness of the influence contrastive loss and the self-supervised clustering.
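
The abstract outlines a three-stage pipeline: pretrain an MLP Auto-Encoder on node attributes, inject structural information through an influence contrastive loss, and then jointly optimize the embedding with a self-supervised clustering objective. The sketch below only illustrates how such a pipeline could be wired together; it is not the authors' GC-AECL implementation. PyTorch, the neighbor-as-positive (InfoNCE-style) form of the contrastive term, the DEC-style Student's-t/KL clustering loss, and all layer sizes, temperatures, and loss weights are assumptions of this sketch rather than details reported in the paper.

```python
# Minimal sketch of the pipeline described in the abstract; NOT the authors' code.
# Loss forms, layer sizes, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPAutoEncoder(nn.Module):
    """Simple multilayer-perceptron auto-encoder over node attributes."""
    def __init__(self, in_dim, hid_dim=256, emb_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                     nn.Linear(hid_dim, emb_dim))
        self.decoder = nn.Sequential(nn.Linear(emb_dim, hid_dim), nn.ReLU(),
                                     nn.Linear(hid_dim, in_dim))
    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)

def influence_contrastive_loss(z, adj, tau=1.0):
    # Treat linked (influencing) nodes as positives: pull their embeddings
    # together and push all other pairs apart (InfoNCE-style, dense adjacency).
    zn = F.normalize(z, dim=1)
    sim = torch.exp(zn @ zn.t() / tau)
    sim = sim - torch.diag_embed(torch.diagonal(sim))   # drop self-similarity
    pos = (sim * adj).sum(dim=1)                         # similarity mass on neighbors
    return -torch.log((pos + 1e-8) / (sim.sum(dim=1) + 1e-8)).mean()

def soft_assign(z, centers, alpha=1.0):
    # Student's-t soft cluster assignments (DEC-style self-supervised clustering).
    q = (1.0 + torch.cdist(z, centers).pow(2) / alpha).pow(-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    # Sharpened target distribution used as the self-supervision signal.
    p = q.pow(2) / q.sum(dim=0)
    return p / p.sum(dim=1, keepdim=True)

# Toy usage on random data; replace x/adj with a citation-network dataset.
n, d, k = 64, 100, 3
x = torch.rand(n, d)
adj = (torch.rand(n, n) < 0.05).float()
adj = ((adj + adj.t()) > 0).float().fill_diagonal_(0)   # symmetric, no self-loops

model = MLPAutoEncoder(d)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(50):                                      # stage 1: pretrain by reconstruction
    _, x_hat = model(x)
    loss = F.mse_loss(x_hat, x)
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():                                    # init cluster centers from embeddings
    z0, _ = model(x)
centers = nn.Parameter(z0[torch.randperm(n)[:k]].clone())
opt = torch.optim.Adam(list(model.parameters()) + [centers], lr=1e-3)

for _ in range(100):                                     # stage 2: joint optimization
    z, x_hat = model(x)
    q = soft_assign(z, centers)
    p = target_distribution(q).detach()
    loss = (F.mse_loss(x_hat, x)
            + 0.5 * influence_contrastive_loss(z, adj)
            + 0.1 * F.kl_div(q.log(), p, reduction='batchmean'))
    opt.zero_grad(); loss.backward(); opt.step()

labels = soft_assign(model(x)[0], centers).argmax(dim=1)  # final cluster assignments
```

The two training loops mirror the pretrain-then-jointly-optimize schedule described in the abstract; on a real citation network the dense similarity matrix would be replaced by sparse or mini-batch computation.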

Memo:

Received: 2023-06-12.
Funding: Natural Science Foundation of Hebei Province (F2021205014, F2019205303), Science and Technology Research Project of Higher Education Institutions of Hebei Province (ZD2022139), Central Government Guided Local Science and Technology Development Fund (226Z1808G), Science and Technology Research Fund of Hebei Normal University (L2023J05), and Postgraduate Innovation Funding Project of Hebei Normal University (XCXZZSS202315).
Corresponding author: Wang Jinghong, Ph.D., professor. Research interests: artificial intelligence and data mining. E-mail: wangjinghong@126.com
Last Update: 2025-02-15