Preface: Regression is one of the fundamental tasks a neural network can perform. Put simply, a regression task takes a set of inputs and predicts a continuous output, for example forecasting temperatures or stock prices.

As before, let's go straight to the code and learn the regression workflow from it.

Step 1: Prepare the input data

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import datetime  # used later to parse dates for plotting

import torch
import torch.optim as optim

import warnings

Read the data and take a look at it:

features = pd.read_csv('temps.csv')

# Take a look at what the data looks like
features.head()

The `actual` column is the label; all the other columns are inputs.

print('Data dimensions:', features.shape)

This prints the shape of the data: (348, 9).

Visualize the input data:

# Prepare for plotting: build datetime objects from the year/month/day columns for the x-axis
years = features['year']
months = features['month']
days = features['day']
dates = [str(int(year)) + '-' + str(int(month)) + '-' + str(int(day)) for year, month, day in zip(years, months, days)]
dates = [datetime.datetime.strptime(date, '%Y-%m-%d') for date in dates]

# Set the default plot style
plt.style.use('fivethirtyeight')

# Set up the layout
fig, ((ax1, ax2), (ax3, ax4)) = plt.subplots(nrows=2, ncols=2, figsize=(10, 10))
fig.autofmt_xdate(rotation=45)

# Label values
ax1.plot(dates, features['actual'])
ax1.set_xlabel(''); ax1.set_ylabel('Temperature'); ax1.set_title('Max Temp')

# Yesterday
ax2.plot(dates, features['temp_1'])
ax2.set_xlabel(''); ax2.set_ylabel('Temperature'); ax2.set_title('Previous Max Temp')

# The day before yesterday
ax3.plot(dates, features['temp_2'])
ax3.set_xlabel('Date'); ax3.set_ylabel('Temperature'); ax3.set_title('Two Days Prior Max Temp')

# A friend's estimate
ax4.plot(dates, features['friend'])
ax4.set_xlabel('Date'); ax4.set_ylabel('Temperature'); ax4.set_title('Friend Estimate')

plt.tight_layout(pad=2)

The result is a 2x2 figure with four panels: Max Temp, Previous Max Temp, Two Days Prior Max Temp, and Friend Estimate, each plotted against the date.

Next, convert the day-of-week column into one-hot encoding. One-hot encoding represents each category as a binary vector with exactly one 1 and 0s everywhere else; for example, 0000001 can stand for Sunday.

features = pd.get_dummies(features)
features.head(5)

After encoding, the day-of-week column is expanded into seven 0/1 indicator columns, one per weekday.
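As a quick illustration of what this encoding does, here is a minimal toy sketch of pd.get_dummies on a small frame (the column name 'week' and the values are made up for illustration):

import pandas as pd

# Toy frame: a categorical day-of-week column next to a numeric column
toy = pd.DataFrame({'week': ['Mon', 'Tues', 'Sun'], 'temp_1': [45, 44, 41]})

# get_dummies keeps numeric columns as-is and expands the categorical column
# into one indicator column per distinct value (week_Mon, week_Sun, week_Tues here);
# each row has a single 1 in the column of its weekday
print(pd.get_dummies(toy).astype(int))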

Split the data into features and labels:

# Labels
labels = np.array(features['actual'])

# Drop the label column from the features
features = features.drop('actual', axis=1)

# Save the column names for later use
feature_list = list(features.columns)

# Convert to a plain numpy array
features = np.array(features)

features.shape
labels.shape

The resulting features have shape (348, 14) and the labels have shape (348,).
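The code below refers to the feature matrix as input_features. A minimal sketch that defines it, here also standardizing each column with sklearn (the standardization step is my assumption; a plain input_features = features would also make the later code run, but scaling usually helps the hand-written gradient-descent loop converge):

from sklearn import preprocessing

# Standardize each feature column to zero mean and unit variance
# (assumption: alternatively use input_features = features to keep the raw values)
input_features = preprocessing.StandardScaler().fit_transform(features)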

Step 2: Build the network model

x = torch.tensor(input_features, dtype=float)
y = torch.tensor(labels, dtype=float).reshape(-1, 1)  # shape (n, 1) so it matches the predictions below

# Initialize the weight parameters
weights = torch.randn((14, 128), dtype=float, requires_grad=True)
biases = torch.randn(128, dtype=float, requires_grad=True)
weights2 = torch.randn((128, 1), dtype=float, requires_grad=True)
biases2 = torch.randn(1, dtype=float, requires_grad=True)

learning_rate = 0.001
losses = []

for i in range(1000):
    # Compute the hidden layer
    hidden = x.mm(weights) + biases
    # Apply the activation function
    hidden = torch.relu(hidden)
    # Compute the predictions
    predictions = hidden.mm(weights2) + biases2
    # Compute the loss
    loss = torch.mean((predictions - y) ** 2)
    losses.append(loss.data.numpy())

    # Print the loss value
    if i % 100 == 0:
        print('loss:', loss)

    # Backpropagate
    loss.backward()

    # Update the parameters
    weights.data.add_(- learning_rate * weights.grad.data)
    biases.data.add_(- learning_rate * biases.grad.data)
    weights2.data.add_(- learning_rate * weights2.grad.data)
    biases2.data.add_(- learning_rate * biases2.grad.data)

    # Remember to zero the gradients after every iteration
    weights.grad.data.zero_()
    biases.grad.data.zero_()
    weights2.grad.data.zero_()
    biases2.grad.data.zero_()

The network above was built by hand to make each step concrete and easier to understand. In practice you would normally use the built-in modules. The workflow is: convert the data to tensors ----> initialize the weights and biases ----> compute the forward pass ----> compute the loss ----> backpropagate ----> update the parameters along the gradients ----> zero the gradients.

A simpler way to write this:

input_size = input_features.shape[1]
hidden_size = 128
output_size = 1
batch_size = 16

my_nn = torch.nn.Sequential(
    torch.nn.Linear(input_size, hidden_size),
    torch.nn.Sigmoid(),
    torch.nn.Linear(hidden_size, output_size),
)

cost = torch.nn.MSELoss(reduction='mean')
optimizer = torch.optim.Adam(my_nn.parameters(), lr=0.001)

Next comes the most important part, training the network:

# Train the network
losses = []
for i in range(1000):
    batch_loss = []
    # Mini-batch training
    for start in range(0, len(input_features), batch_size):
        end = start + batch_size if start + batch_size < len(input_features) else len(input_features)
        xx = torch.tensor(input_features[start:end], dtype=torch.float, requires_grad=True)
        # Reshape the labels to (batch, 1) so they match the shape of the predictions
        yy = torch.tensor(labels[start:end], dtype=torch.float).reshape(-1, 1)
        prediction = my_nn(xx)
        loss = cost(prediction, yy)
        loss.backward(retain_graph=True)
        optimizer.step()
        optimizer.zero_grad()
        batch_loss.append(loss.data.numpy())

    # Print the loss
    if i % 100 == 0:
        losses.append(np.mean(batch_loss))
        print(i, np.mean(batch_loss))

The flow is: take a batch of data ----> feed it into the network ----> get the predictions ----> compute the loss ----> backpropagate ----> update the parameters along the gradients ----> zero the gradients.
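The plotting code below uses a variable named predict. A minimal sketch of how to obtain it, assuming the trained my_nn and the input_features array from above: run the whole feature matrix through the network once, without tracking gradients.

# Predictions for the full dataset, as a numpy array for plotting
x = torch.tensor(input_features, dtype=torch.float)
with torch.no_grad():
    predict = my_nn(x).numpy()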

Finally, visualize the results:

# Convert the dates for plotting
months = features[:, feature_list.index('month')]
days = features[:, feature_list.index('day')]
years = features[:, feature_list.index('year')]

dates = [str(int(year)) + '-' + str(int(month)) + '-' + str(int(day)) for year, month, day in zip(years, months, days)]
dates = [datetime.datetime.strptime(date, '%Y-%m-%d') for date in dates]
#print(dates)

# Create a table holding the dates and the corresponding true label values
true_data = pd.DataFrame(data={'date': dates, 'actual': labels})

# Likewise, create a table holding the dates and the corresponding model predictions
# (identical dates here, since we predict on the full dataset)
test_dates = [str(int(year)) + '-' + str(int(month)) + '-' + str(int(day)) for year, month, day in zip(years, months, days)]
test_dates = [datetime.datetime.strptime(date, '%Y-%m-%d') for date in test_dates]
predictions_data = pd.DataFrame(data={'date': test_dates, 'prediction': predict.reshape(-1)})

# True values
plt.plot(true_data['date'], true_data['actual'], 'b-', label='actual')

# Predicted values
plt.plot(predictions_data['date'], predictions_data['prediction'], 'ro', label='prediction')

plt.xticks(rotation='60');
plt.legend()

# Axis labels and title
plt.xlabel('Date'); plt.ylabel('Maximum Temperature (F)'); plt.title('Actual and Predicted Values');

Summary:

Link for the workflow on the left: 神经网络入门(手写体的识别torch+jupyter+Mnist数据集)_萌新小白一只的博客-CSDN博客

 
