Learning the Graph Neural Network Framework DGL: 101
DGL is a popular open-source graph neural network library. It supports several backends, including TensorFlow and PyTorch, with PyTorch being the most commonly used. For details, see the official documentation: https://docs.dgl.ai/index.html
The key steps in a graph neural network workflow are: 1. building the graph; 2. attaching features to nodes or edges; 3. building the GNN model; 4. training the model; 5. visualizing the results.
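To make the first two steps concrete, here is a minimal sketch of my own (not part of the official tutorial, assuming DGL is installed with the PyTorch backend): build a tiny graph and attach a feature vector to each node.

import dgl
import torch

# Step 1: a toy graph with 3 nodes and 2 directed edges, 0 -> 1 and 1 -> 2
g = dgl.graph((torch.tensor([0, 1]), torch.tensor([1, 2])))

# Step 2: attach a 4-dimensional feature vector to every node
g.ndata['x'] = torch.randn(g.num_nodes(), 4)

print(g.num_nodes(), g.num_edges())  # 3 2
print(g.ndata['x'].shape)            # torch.Size([3, 4])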
DGL helps us build a graph neural network more quickly, mainly through its graph construction utilities, easy assignment of features to nodes/edges, built-in implementations of various GNN layers, and visualization support. The rest of this post walks through the introductory tutorial code from the official documentation.
We take the "Zachary's karate club" problem as an example. The karate club has 34 members; the figure below shows the social ties among them, and the club eventually split into two factions. We know that member 0 (the instructor) and member 33 (the club president) belong to the two different factions (yellow/red in the figure). Given the social network of the 34 members, the task is to predict which faction each of the remaining members belongs to, so this is a node-level classification problem. How do we build the graph? First, for each tie (edge) we collect its source node src and destination node dst into two arrays that describe the relationships in the graph. Then we construct the graph with dgl.DGLGraph(); the code is as follows:
import dgl
import numpy as np

def build_karate_club_graph():
    # All 78 edges are stored in two numpy arrays. One for source endpoints
    # while the other for destination endpoints.
    src = np.array([1, 2, 2, 3, 3, 3, 4, 5, 6, 6, 6, 7, 7, 7, 7, 8, 8, 9, 10, 10,
                    10, 11, 12, 12, 13, 13, 13, 13, 16, 16, 17, 17, 19, 19, 21, 21,
                    25, 25, 27, 27, 27, 28, 29, 29, 30, 30, 31, 31, 31, 31, 32, 32,
                    32, 32, 32, 32, 32, 32, 32, 32, 32, 33, 33, 33, 33, 33, 33, 33,
                    33, 33, 33, 33, 33, 33, 33, 33, 33, 33])
    dst = np.array([0, 0, 1, 0, 1, 2, 0, 0, 0, 4, 5, 0, 1, 2, 3, 0, 2, 2, 0, 4,
                    5, 0, 0, 3, 0, 1, 2, 3, 5, 6, 0, 1, 0, 1, 0, 1, 23, 24, 2, 23,
                    24, 2, 23, 26, 1, 8, 0, 24, 25, 28, 2, 8, 14, 15, 18, 20, 22,
                    23, 29, 30, 31, 8, 9, 13, 14, 15, 18, 19, 20, 22, 23, 26, 27,
                    28, 29, 30, 31, 32])
    # Edges are directional in DGL; make them bi-directional.
    u = np.concatenate([src, dst])
    v = np.concatenate([dst, src])
    # Construct a DGLGraph
    return dgl.DGLGraph((u, v))

G = build_karate_club_graph()
print('We have %d nodes.' % G.number_of_nodes())
print('We have %d edges.' % G.number_of_edges())
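As a quick sanity check of the construction (my own addition, not part of the tutorial), we can inspect node degrees; the instructor (node 0) and the president (node 33) should be among the best connected members:

# Edges were added in both directions, so in-degree equals out-degree here.
print(G.in_degrees(0))    # degree of the instructor
print(G.in_degrees(33))   # degree of the club president
print(G.in_degrees())     # degrees of all 34 nodes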
In a graph neural network, features are attached to edges or nodes. For the karate club problem we attach them to the nodes, here as learnable 5-dimensional embedding vectors for the 34 nodes.

# In DGL, you can add features for all nodes at once, using a feature tensor that
# batches node features along the first dimension. The code below adds the learnable
# embeddings for all nodes:
import torch
import torch.nn as nn
import torch.nn.functional as F

embed = nn.Embedding(34, 5)  # 34 nodes with embedding dim equal to 5
G.ndata['feat'] = embed.weight

# print out node 2's input feature
print(G.ndata['feat'][2])

# print out node 10 and 11's input features
print(G.ndata['feat'][[10, 11]])
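Learnable embeddings are only one option. As a side note (my own sketch, not from the tutorial), fixed features such as a one-hot encoding of the node ID can be attached in exactly the same way:

# One-hot node-ID features: row i is the feature vector of node i.
G.ndata['one_hot'] = torch.eye(34)
print(G.ndata['one_hot'].shape)  # torch.Size([34, 34])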
Next, we build a simple two-layer graph convolutional network on top of DGL's GraphConv layer.

from dgl.nn.pytorch import GraphConv

class GCN(nn.Module):
    def __init__(self, in_feats, hidden_size, num_classes):
        super(GCN, self).__init__()
        self.conv1 = GraphConv(in_feats, hidden_size)
        self.conv2 = GraphConv(hidden_size, num_classes)

    def forward(self, g, inputs):
        h = self.conv1(g, inputs)
        h = torch.relu(h)
        h = self.conv2(g, h)
        return h

# The first layer transforms input features of size 5 to a hidden size of 5.
# The second layer transforms the hidden layer and produces output features of
# size 2, corresponding to the two groups of the karate club.
net = GCN(5, 5, 2)
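Before training, a quick forward pass with the untrained network confirms the output shape (a small check of my own, not part of the tutorial):

with torch.no_grad():
    out = net(G, embed.weight)
print(out.shape)  # torch.Size([34, 2]): one 2-class score vector per node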
The model's input is the learnable embedding created above. As for labels, only node 0 and node 33 are known while the labels of all other nodes are not, so this is a semi-supervised learning problem and only those two nodes are labeled.

inputs = embed.weight
labeled_nodes = torch.tensor([0, 33])  # only the instructor and the president nodes are labeled
labels = torch.tensor([0, 1])          # their labels are different

Model training:
import itertools

optimizer = torch.optim.Adam(itertools.chain(net.parameters(), embed.parameters()), lr=0.01)
all_logits = []  # record every node's class scores during training, for visualization later
for epoch in range(50):
    logits = net(G, inputs)  # output of the graph convolutional network
    # we save the logits for visualization later
    all_logits.append(logits.detach())
    logp = F.log_softmax(logits, 1)  # log-probabilities over the two classes
    # we only compute loss for labeled nodes
    loss = F.nll_loss(logp[labeled_nodes], labels)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    print('Epoch %d | Loss: %.4f' % (epoch, loss.item()))
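After training, each node's predicted faction can be read off from the logits of the last epoch. This small sketch is my own addition, not part of the tutorial:

# argmax over the two class scores gives the predicted faction (0 or 1) of each node
final_pred = all_logits[-1].argmax(dim=1)
print(final_pred)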
We use networkx to visualize the graph itself; the result is shown in the figure below:
import networkx as nx

# Since the actual graph is undirected, we convert it for visualization purposes.
nx_G = G.to_networkx().to_undirected()
# The Kamada-Kawai layout usually looks pretty for arbitrary graphs.
pos = nx.kamada_kawai_layout(nx_G)
nx.draw(nx_G, pos, with_labels=True, node_color=[[.7, .7, .7]])
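Since the plot cannot be embedded here directly, one option (my own suggestion, not from the tutorial) is to save the current figure to a file:

import matplotlib.pyplot as plt

plt.savefig('karate_club_graph.png', dpi=150)  # hypothetical output filename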
Visualizing the training process:

import matplotlib.animation as animation
import matplotlib.pyplot as plt

def draw(i):
    cls1color = '#00FFFF'
    cls2color = '#FF00FF'
    pos = {}
    colors = []
    for v in range(34):
        pos[v] = all_logits[i][v].numpy()
        cls = pos[v].argmax()
        colors.append(cls1color if cls else cls2color)
    ax.cla()
    ax.axis('off')
    ax.set_title('Epoch: %d' % i)
    nx.draw_networkx(nx_G.to_undirected(), pos, node_color=colors,
                     with_labels=True, node_size=300, ax=ax)

fig = plt.figure(dpi=150)
fig.clf()
ax = fig.subplots()
draw(0)  # draw the prediction of the first epoch
plt.close()
An animated figure:

ani = animation.FuncAnimation(fig, draw, frames=len(all_logits), interval=200)

I don't know how to embed the animation in this post, so the result is omitted. Also note that displaying animations in PyCharm or Jupyter Notebook requires some extra configuration, otherwise nothing shows up. See: https://blog.csdn.net/qq_42182596/article/details/106528274 and https://www.jianshu.com/p/c6b362fde21c You should be able to find the Python Scientific setting in PyCharm, although the Community edition apparently does not have it.
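To actually view the animation, two standard options are worth mentioning (my own additions, not from the tutorial): save it to a GIF file, or render it inline in a Jupyter notebook.

# Option 1: save the animation as a GIF (requires the pillow package);
# the filename is just an example.
ani.save('karate_training.gif', writer='pillow')

# Option 2: render the animation inline in a Jupyter notebook
from IPython.display import HTML
HTML(ani.to_jshtml())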