# Specific meaning of torch.nn.ReLU(inplace=True)


```python
import torch
from torch import nn

m = nn.ReLU()
# Generate 5 random numbers, some positive and some negative.
input = torch.randn(5)
# Print the randomly generated numbers.
print(input)
output = m(input)
# The numbers after nn.ReLU() has been applied.
print(output)
# Result:
# tensor([-0.7706, -0.1823,  0.2687,  0.2796, -1.7201])
# tensor([ 0.0000,  0.0000,  0.2687,  0.2796,  0.0000])
```


First, the conclusion, which follows from the definition ReLU(x) = max(0, x): values greater than 0 are left unchanged, and values less than or equal to 0 become 0.
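As a quick sanity check (not from the original post), the output of `nn.ReLU()` matches an elementwise maximum with zero:

```python
import torch
from torch import nn

x = torch.tensor([-1.5, -0.2, 0.0, 0.3, 2.0])
# ReLU(x) = max(0, x), applied elementwise.
y = nn.ReLU()(x)
print(y)  # tensor([0.0000, 0.0000, 0.0000, 0.3000, 2.0000])
print(torch.equal(y, torch.maximum(torch.zeros_like(x), x)))  # True
```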

The corresponding documentation: https://pytorch.org/docs/1.2.0/nn.html#torch.nn.ReLU (go look it up yourself).

Parameter: if `inplace=True`, the input tensor itself is modified; otherwise the input is left unchanged and a new output tensor is allocated.

Advantage: it saves the cost of allocating a new tensor and freeing the old one, writing the result directly over the original values.
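The claim above can be checked directly (a minimal sketch, not from the original post): with the default `inplace=False` the input is untouched, while `inplace=True` overwrites the input and returns the very same tensor object:

```python
import torch
from torch import nn

x = torch.tensor([-1.0, 2.0])

# Default (inplace=False): a new tensor is allocated, x is untouched.
y = nn.ReLU()(x)
print(x)       # tensor([-1.,  2.])
print(y)       # tensor([0., 2.])

# inplace=True: x itself is overwritten and returned.
z = nn.ReLU(inplace=True)(x)
print(z is x)  # True
print(x)       # tensor([0., 2.])
```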

Test code (this time with `inplace=True`, so printing `input` afterwards shows that it has been overwritten):

```python
import torch
from torch import nn

m = nn.ReLU(inplace=True)
# Generate 5 random numbers, some positive and some negative.
input = torch.randn(5)
# Print the randomly generated numbers.
print(input)
output = m(input)
# The numbers after nn.ReLU(inplace=True) has been applied.
print(output)
# Because inplace=True, input now holds the same values as output.
print(input)
# Result:
# tensor([-0.7706, -0.1823,  0.2687,  0.2796, -1.7201])
# tensor([ 0.0000,  0.0000,  0.2687,  0.2796,  0.0000])
# tensor([ 0.0000,  0.0000,  0.2687,  0.2796,  0.0000])
```

That's it for today. Time to get off work.