Getting started with PyTorch's backward(): fitting y = w * x with autograd

A minimal example: given x = 2 and target y = 10, we learn the weight w by minimizing the MSE loss with autograd and the Adam optimizer. (Note: `Variable` has been deprecated since PyTorch 0.4; a plain tensor with `requires_grad=True` behaves the same way.)

```python
import torch
from torch.autograd import Variable

x = Variable(torch.Tensor([2]))
y = Variable(torch.Tensor([10]))
w = Variable(torch.randn(1), requires_grad=True)  # the parameter to learn
print(w)

loss = torch.nn.MSELoss()
optimizer = torch.optim.Adam([w], lr=0.05)

# The original snippet is cut off inside this loop; the standard
# zero_grad / forward / backward / step body is filled in below.
for i in range(1000):
    optimizer.zero_grad()   # clear gradients accumulated in the previous step
    l = loss(w * x, y)      # forward pass: MSE between the prediction w*x and y
    l.backward()            # autograd computes dl/dw
    optimizer.step()        # Adam updates w using w.grad

print(w)  # converges toward w = 5, since 5 * 2 = 10
```
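To see what `backward()` actually computes, here is a small check (not from the original post) comparing the gradient autograd stores in `w.grad` against the analytic derivative of the loss. For a single sample, MSE is (w*x - y)^2, so dL/dw = 2*x*(w*x - y); the starting value w = 3.0 is an arbitrary choice for illustration.

```python
import torch

x = torch.tensor([2.0])
y = torch.tensor([10.0])
w = torch.tensor([3.0], requires_grad=True)  # arbitrary starting weight

l = torch.nn.MSELoss()(w * x, y)  # (w*x - y)^2 = (6 - 10)^2 = 16
l.backward()                      # fills w.grad with dL/dw

# Analytic gradient: 2 * x * (w*x - y) = 2 * 2 * (6 - 10) = -16
with torch.no_grad():
    manual = 2 * x * (w * x - y)

print(w.grad)  # tensor([-16.])
print(manual)  # tensor([-16.])
```

Because the gradient is negative, a gradient-descent step increases w, moving w*x toward the target 10, which is exactly what the training loop above does a thousand times.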