The main idea is to use signals from the graph data itself, such as node features, edge features, and graph-level features, as self-supervision signals to train the model. In fact, the classic graph embedding model DeepWalk already follows this self-supervised paradigm: it generates node sequences via random walks over the graph, and then trains node representations with the pretext task of predicting a node's context from the center node.
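The random-walk-plus-context-prediction recipe above can be sketched in a few lines. This is a minimal illustration, not DeepWalk's reference implementation: the graph representation (a neighbor dict) and all parameter names are assumptions for the example.

```python
import random

def random_walks(adj, num_walks, walk_length, seed=0):
    """Generate DeepWalk-style uniform random walks.

    adj: dict mapping each node to a list of its neighbors.
    Returns a list of walks, each a list of node ids.
    """
    rng = random.Random(seed)
    walks = []
    nodes = list(adj)
    for _ in range(num_walks):
        rng.shuffle(nodes)  # one pass over all nodes per round, in random order
        for start in nodes:
            walk = [start]
            while len(walk) < walk_length:
                nbrs = adj[walk[-1]]
                if not nbrs:  # dead end: stop this walk early
                    break
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks

def skipgram_pairs(walks, window):
    """Turn walks into (center, context) pairs, the pretext task's training data."""
    pairs = []
    for walk in walks:
        for i, center in enumerate(walk):
            for j in range(max(0, i - window), min(len(walk), i + window + 1)):
                if j != i:
                    pairs.append((center, walk[j]))
    return pairs
```

In practice the walks are usually fed directly to a word2vec-style skip-gram trainer (e.g. gensim's `Word2Vec`), which optimizes exactly these center-context predictions to produce the node embeddings.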
From the model perspective, existing approaches fall into four main families: BERT-based models, random-walk-based models, autoencoder-based models, and GNN-based models.
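To make the autoencoder family concrete, here is a minimal sketch of a GAE-style link-reconstruction objective: the graph itself supplies the labels, since observed edges should score high and non-edges low. The one-layer encoder, function name, and dimensions are illustrative assumptions, not any paper's exact architecture.

```python
import numpy as np

def gae_reconstruction_loss(X, W, A):
    """Self-supervised edge-reconstruction loss (minimal sketch).

    X: (n, d) node feature matrix.
    W: (d, k) encoder weights.
    A: (n, n) binary adjacency matrix, used as the supervision signal.
    """
    Z = np.tanh(X @ W)                      # toy one-layer encoder (illustrative)
    logits = Z @ Z.T                        # inner-product decoder over embedding pairs
    probs = 1.0 / (1.0 + np.exp(-logits))   # predicted edge probabilities
    eps = 1e-9
    # binary cross-entropy between predicted and observed edges
    loss = -(A * np.log(probs + eps) + (1 - A) * np.log(1 - probs + eps))
    return loss.mean()
```

Minimizing this loss (e.g. by gradient descent on `W`) trains node embeddings without any external labels; contrastive and BERT-style families swap in different pretext losses over the same kind of encoder.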
1 Related Papers
Below is a compilation of recently published papers and surveys on self-supervised learning and pre-training on graphs.
Strategies for Pre-training Graph Neural Networks. ICLR 2020.
GPT-GNN: Generative Pre-Training of Graph Neural Networks. KDD 2020.
Pre-Training Graph Neural Networks for Generic Structural Feature Extraction. 2020.
Self-supervised Learning: Generative or Contrastive. 2020.
Gaining insight into SARS-CoV-2 infection and COVID-19 severity using self-supervised edge features and Graph Neural Networks. ICML 2020.
When Does Self-Supervision Help Graph Convolutional Networks? ICML 2020.
Multi-Stage Self-Supervised Learning for Graph Convolutional Networks on Graphs with Few Labeled Nodes. AAAI 2020.
GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training. KDD 2020.
Self-Supervised Graph Representation Learning via Global Context Prediction. 2020.
Contrastive Multi-View Representation Learning on Graphs. 2020.
Self-supervised Training of Graph Convolutional Networks. 2020.
Self-supervised Learning on Graphs: Deep Insights and New Directions. 2020.
GRAPH-BERT: Only Attention is Needed for Learning Graph Representations. 2020.
Graph Neural Distance Metric Learning with GRAPH-BERT. 2020.
Segmented GRAPH-BERT for Graph Instance Modeling. 2020.
2 References
[Pre-trained Graph Neural Network Models at KDD 2020]
https://zhuanlan.zhihu.com/p/149222809
[A Survey of Self-Supervised Learning on Graphs]
https://zhuanlan.zhihu.com/p/150112070
[Pre-training Strategies for Graph Neural Networks]
https://zhuanlan.zhihu.com/p/124663407
[GNN Tutorial: Pre-training Tasks on Graphs, Part 2]
https://archwalker.github.io/blog/2019/08/08/GNN-Pretraining-1.html
