Building a Basic Convolutional Neural Network Framework

    Technology  2025-10-06

    At its core, a convolutional neural network is matrix multiplication combined with weight sharing; here we build one with PyTorch. PyTorch has two obvious advantages. First, it is easy to install: a single pip or conda command is enough, with no dependency libraries to install by hand. Second, it is concise and easy to use. The following shows how to build a simple convolutional neural network with PyTorch.
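
    As a quick sanity check after installing (the exact install command depends on your platform and CUDA setup; a plain CPU-only pip install is the simplest case), a minimal sketch like the following just confirms that PyTorch is importable and reports its version:

        import torch

        # Report the installed PyTorch version and whether a CUDA device is visible.
        print(torch.__version__)
        print(torch.cuda.is_available())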

    The output length of a convolutional layer satisfies the following expression:

        L_out = floor((L_in + 2*padding - dilation*(kernel_size - 1) - 1) / stride + 1)

    With the defaults used below (padding=0, dilation=1) this reduces to L_out = floor((L_in - kernel_size) / stride) + 1; the max-pooling layers follow the same form.

    Network framework:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    import numpy as np
    from sklearn.model_selection import train_test_split
    from torch.utils.data import DataLoader, TensorDataset


    class CNN(nn.Module):
        def __init__(self):
            super(CNN, self).__init__()
            # Three 1-D convolution stages; the input is expected to have 3 channels.
            self.conv1 = nn.Conv1d(in_channels=3, out_channels=10, kernel_size=3, stride=2)
            self.max_pool1 = nn.MaxPool1d(kernel_size=3, stride=2)
            self.conv2 = nn.Conv1d(10, 20, 3, 2)
            self.max_pool2 = nn.MaxPool1d(3, 2)
            self.conv3 = nn.Conv1d(20, 40, 3, 2)
            # Fully connected head; 40 * 4 matches the flattened feature map
            # (40 channels x 4 positions, which holds for an input length of roughly 160).
            self.linear1 = nn.Linear(40 * 4, 120)
            self.linear2 = nn.Linear(120, 84)
            self.linear3 = nn.Linear(84, 4)

        def forward(self, x):
            x = F.relu(self.conv1(x))
            x = self.max_pool1(x)
            x = F.relu(self.conv2(x))
            x = self.max_pool2(x)
            x = F.relu(self.conv3(x))
            # print(x.shape)  # uncomment to check the size feeding the linear layers
            x = x.view(-1, 40 * 4)
            x = F.relu(self.linear1(x))
            x = F.relu(self.linear2(x))
            x = self.linear3(x)
            return x
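
    A quick way to confirm that the 40 * 4 flatten size is consistent with the output-length expression above is to push a dummy batch through the network. The input length of 160 below is an assumed example, not something fixed by the model itself:

        # Minimal shape check (assumed input: batch of 8 signals, 3 channels, 160 samples).
        net = CNN()
        dummy = torch.randn(8, 3, 160)
        out = net(dummy)
        print(out.shape)  # expected: torch.Size([8, 4]) -- one score per class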

    Network training:

    net = CNN()
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(net.parameters(), lr=0.001, momentum=0.9)

    # x_train / y_train are assumed to have been prepared beforehand as batched
    # float tensors of shape (N, 3, L) and integer class labels, respectively.
    for epoch in range(100):
        running_loss = 0.0
        for i, input_data in enumerate(x_train, 0):
            # print(input_data.shape)
            label = y_train[i]
            optimizer.zero_grad()
            outputs = net(input_data)
            loss = criterion(outputs, label)
            loss.backward()
            optimizer.step()
            running_loss += loss.item()
            if i % 100 == 99:
                # Report the average loss over the last 100 steps.
                print('[%d, %5d] loss: %.3f' % (epoch + 1, i + 1, running_loss / 100))
                running_loss = 0.0
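
    The loop above assumes that x_train and y_train already exist. One common way to produce them, in line with the train_test_split / TensorDataset / DataLoader imports at the top, is sketched below; the array names, the 160-sample length, and the four-class labels are placeholder assumptions rather than part of the original code:

        # Hypothetical raw data: 1000 signals, 3 channels, 160 samples each, 4 classes.
        features = np.random.randn(1000, 3, 160).astype(np.float32)
        labels = np.random.randint(0, 4, size=1000)

        x_tr, x_te, y_tr, y_te = train_test_split(features, labels, test_size=0.2)

        train_set = TensorDataset(torch.from_numpy(x_tr), torch.from_numpy(y_tr))
        train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

        # With a DataLoader, the inner loop becomes:
        #     for i, (input_data, label) in enumerate(train_loader, 0):
        # and each input_data already carries the batch dimension the network expects.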