The specific meaning of torch.nn.ReLU(inplace=True):

First, from ReLU(x) = max(0, x) in the official docs we can conclude: values greater than 0 pass through unchanged, and values less than 0 become 0.

Here is the corresponding documentation link: https://pytorch.org/docs/1.2.0/nn.html#torch.nn.ReLU  PS: grab it if you need it.

Parameter: when inplace is True, the input tensor itself is modified; otherwise the original input is left unchanged and a new output tensor is produced.

Benefit: it saves the cost of repeatedly allocating and freeing memory, since the result directly overwrites the original values.

Test code:

import torch
from torch import nn

m = nn.ReLU()
# Generate 5 random numbers, some positive and some negative.
input = torch.randn(5)
# Print the randomly generated numbers.
print(input)
output = m(input)
# The numbers after applying nn.ReLU().
print(output)
# Result:
# tensor([-0.7706, -0.1823,  0.2687,  0.2796, -1.7201])
# tensor([0.0000,   0.0000,  0.2687,  0.2796,  0.0000])
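To see what inplace=True actually does, here is a small sketch (the tensor values below are my own example inputs, not from the original post): the ReLU result overwrites the input tensor itself, and the returned output is the very same tensor object.

```python
import torch
from torch import nn

m = nn.ReLU(inplace=True)
# A tensor with both positive and negative values.
x = torch.tensor([-1.5, 0.5, -0.2, 2.0])
y = m(x)
# x has been modified in place: the negatives are now 0.
print(x)       # tensor([0.0000, 0.5000, 0.0000, 2.0000])
# The output is the same object as the input, not a copy.
print(y is x)  # True
```

Note that this only works on tensors you are allowed to overwrite; applying an in-place op to a leaf tensor that requires grad will raise an error.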

Task done, clocking off.
