TensorBoard Visualization

    Tech  2022-07-10

    During network training, TensorBoard can be used to visualize the training parameters and make the process easier to follow.

    The main functions for adding data to a summary are:

    - Add a scalar: tf.summary.scalar(tags, values, collections=None, name=None)
      e.g. tf.summary.scalar('loss', loss)
    - Add a histogram: tf.summary.histogram(tags, values, collections=None, name=None)
      e.g. tf.summary.histogram('weight', w)
    - Add an image: tf.summary.image(tags, image)
      e.g. tf.summary.image('image', img)
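    A minimal sketch of the three summary ops, assuming the TF 1.x graph API (reached via tf.compat.v1 on TensorFlow 2 installs); the tensors here are illustrative stand-ins, not part of a real model:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style API on a TF 2 install
tf.disable_eager_execution()       # the summaries below are graph-mode ops

loss = tf.constant(0.5, name='loss')           # a scalar tensor
w = tf.random.normal([10, 10], name='weight')  # a weight-like tensor
img = tf.zeros([1, 28, 28, 1], name='image')   # [batch, height, width, channels]

tf.summary.scalar('loss', loss)      # one curve point per logged step
tf.summary.histogram('weight', w)    # value distribution per logged step
tf.summary.image('image', img)       # rendered under the IMAGES tab
```

    Each call registers an op in the default graph; nothing is written until the merged summary is evaluated in a session and passed to a writer.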

    Other functions:

    - Merge all summaries: tf.summary.merge_all()
      e.g. merged = tf.summary.merge_all()
    - Create the event-file writer: tf.summary.FileWriter(path, session.graph)
      e.g. writer = tf.summary.FileWriter("./log", sess.graph)
    - Write data: writer.add_summary(result, step)
      e.g. writer.add_summary(result, i)
    - Close the file: writer.close()
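    Put together, the typical workflow is: merge all summaries, create a writer, evaluate the merged op each step, write the result, then close. A minimal sketch, again assuming the TF 1.x graph API via tf.compat.v1 and an illustrative ./log output directory:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style API on a TF 2 install
tf.disable_eager_execution()

loss = tf.constant(1.0, name='loss')  # stand-in for a real loss tensor
tf.summary.scalar('loss', loss)

merged = tf.summary.merge_all()  # one op that evaluates every registered summary
with tf.Session() as sess:
    writer = tf.summary.FileWriter('./log', sess.graph)  # event file plus the graph
    for step in range(3):
        result = sess.run(merged)         # serialized summary protobuf
        writer.add_summary(result, step)  # step becomes the x-axis in TensorBoard
    writer.close()  # flush events to disk
```

    After this runs, ./log contains an events.out.tfevents.* file that TensorBoard can read.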

    Opening the visualization UI:

    1. Locate the tensorboard.exe file (usually under the installation directory of the corresponding environment).
    2. Open a terminal window in the folder containing tensorboard.exe.
    3. In the terminal, run tensorboard --logdir= followed by the path to your generated TensorBoard log folder (e.g. tensorboard --logdir=D:\my_graph\log).
    4. Open the printed http://localhost:6006/ address in a browser to bring up the visualization window.

    Test code:

    import tensorflow as tf
    import numpy as np

    # Make up some real data
    x_data = np.linspace(-1, 1, 300)[:, np.newaxis]
    noise = np.random.normal(0, 0.05, x_data.shape)
    y_data = np.square(x_data) - 0.5 + noise

    def fc_layer(x, in_kernels, out_kernels, layer_name):
        '''Fully connected layer'''
        with tf.variable_scope(layer_name):
            w = tf.get_variable('weight', shape=[in_kernels, out_kernels],
                                initializer=tf.truncated_normal_initializer(stddev=0.1))
            tf.summary.histogram(layer_name + '/weight', w)  # log weights to TensorBoard
            b = tf.get_variable('bias', shape=[out_kernels],
                                initializer=tf.truncated_normal_initializer(stddev=0.1))
            tf.summary.histogram(layer_name + '/bias', b)  # log biases to TensorBoard
            y = tf.nn.relu(tf.nn.bias_add(tf.matmul(x, w), b))
            return y

    def out_layer(x, in_kernels, out_kernels, layer_name):
        '''Output layer'''
        with tf.variable_scope(layer_name):
            w = tf.get_variable('weight', shape=[in_kernels, out_kernels],
                                initializer=tf.truncated_normal_initializer(stddev=0.1))
            tf.summary.histogram(layer_name + '/weight', w)  # log weights to TensorBoard
            b = tf.get_variable('bias', shape=[out_kernels],
                                initializer=tf.truncated_normal_initializer(stddev=0.1))
            tf.summary.histogram(layer_name + '/bias', b)  # log biases to TensorBoard
            y = tf.nn.bias_add(tf.matmul(x, w), b)
            return y

    # define placeholders for inputs to the network
    with tf.name_scope('inputs'):
        xs = tf.placeholder(tf.float32, [None, 1], name='x_input')
        ys = tf.placeholder(tf.float32, [None, 1], name='y_input')

    # add hidden layer
    fc_1 = fc_layer(xs, 1, 10, 'fc_1')
    # add output layer
    prediction = out_layer(fc_1, 10, 1, 'out')

    # the error between prediction and real data
    with tf.name_scope('loss'):
        loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction),
                                            reduction_indices=[1]))
        tf.summary.scalar('loss', loss)

    with tf.name_scope('train'):
        train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

    sess = tf.Session()
    # merge all summaries
    merged = tf.summary.merge_all()
    writer = tf.summary.FileWriter("./log", sess.graph)
    init = tf.global_variables_initializer()
    sess.run(init)

    # saver = tf.train.Saver()
    for i in range(1000):
        sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
        if i % 50 == 0:
            # saver.save(sess, './model/model.ckpt', global_step=i)
            result = sess.run(merged, feed_dict={xs: x_data, ys: y_data})
            print(sess.run(loss, feed_dict={xs: x_data, ys: y_data}))
            writer.add_summary(result, i)  # i is the recorded step
    writer.close()
    sess.close()