Issue
I am trying to use the SGD, Adam, and LBFGS optimizers. The relevant part of the code is:
for batch_idx, (inputs, targets) in enumerate(trainloader):
    batch_size = inputs.size(0)
    total += batch_size
    one_hot_targets = torch.FloatTensor(batch_size, 10).zero_()
    one_hot_targets = one_hot_targets.scatter_(1, targets.view(batch_size, 1), 1.0)
    one_hot_targets = one_hot_targets.float()
    if use_cuda:
        inputs, one_hot_targets = inputs.cuda(), one_hot_targets.cuda()
    inputs, one_hot_targets = Variable(inputs), Variable(one_hot_targets)
    if optimizer_val == 'sgd' or optimizer_val == 'adam':
        outputs = F.softmax(net(inputs))
        loss = criterion(outputs, one_hot_targets)
        loss.backward()
        optimizer.step()
    else:
        def closure():
            optimizer.zero_grad()
            outputs = F.softmax(net(inputs))
            loss = criterion(outputs, one_hot_targets)
            loss.backward()
            return loss
        optimizer.step(closure())
In the optimizer.step(closure()) call for LBFGS (the else branch) I am getting this error:

TypeError: 'Tensor' object is not callable

I checked, and loss is a tensor. How can I make this work?
Solution
You need to pass the function itself to optimizer.step, not the result of calling it. LBFGS may re-evaluate the model several times within a single step, so it needs a callable closure that recomputes the loss, not an already computed loss tensor:

optimizer.step(closure)
Answered By - Ivan