Issue
num_epochs = 5
device = torch.device("mps")
d2l.train_ch3(net, train_iter, test_iter, loss, num_epochs, batch_size, None, None, optimizer).to(device)
Then the terminal told me that:
1 num_epochs = 5
2 device = torch.device("mps")
----> 3 d2l.train_ch3(net, train_iter, test_iter, loss, num_epochs, batch_size, None, None, optimizer).to(device)
AttributeError: 'NoneType' object has no attribute 'to'
Here is the function:
def train_ch3(net, train_iter, test_iter, loss, num_epochs, batch_size,
              params=None, lr=None, optimizer=None):
    for epoch in range(num_epochs):
        train_l_sum, train_acc_sum, n = 0.0, 0.0, 0
        for X, y in train_iter:
            y_hat = net(X)
            l = loss(y_hat, y).sum()
            # Gradient zeroing
            if optimizer is not None:
                optimizer.zero_grad()
            elif params is not None and params[0].grad is not None:
                for param in params:
                    param.grad.data.zero_()
            l.backward()
            if optimizer is None:
                sgd(params, lr, batch_size)
            else:
                optimizer.step()
            train_l_sum += l.item()
            train_acc_sum += (y_hat.argmax(dim=1) == y).sum().item()
            n += y.shape[0]
        test_acc = evaluate_accuracy(test_iter, net)
        print('epoch %d, loss %.4f, train acc %.3f, test acc %.3f'
              % (epoch + 1, train_l_sum / n, train_acc_sum / n, test_acc))
While searching, I noticed that similar problems are often caused by operator overloading of "=" and "-", but I couldn't find any such code in this function...
So what's the matter with this NoneType?
Solution
The function train_ch3 does not return anything explicitly, which means it returns None. And, of course, a NoneType object has no attribute named 'to'.
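A minimal sketch of the failure mode (using a hypothetical stand-in function, not the real d2l code): any Python function without a return statement implicitly returns None, so chaining a method call onto its result fails exactly like the traceback above.

```python
def train():          # stands in for d2l.train_ch3, which also returns nothing
    pass              # ... training would happen here ...

result = train()
print(result)         # None

try:
    result.to("mps")  # same failure mode as train_ch3(...).to(device)
except AttributeError as e:
    print(e)          # 'NoneType' object has no attribute 'to'
```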
I don't know exactly what d2l is, but if you want to move a tensor or a PyTorch module onto a specific device, you should call the to method on that tensor or module, e.g. net.to(device) or X.to(device), y.to(device).
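A minimal sketch of the usual pattern, assuming a toy nn.Linear model in place of the original net: move the model to the device once before training, and move each batch inside the loop (falling back to CPU when the MPS backend is unavailable).

```python
import torch
from torch import nn

# Pick the device: MPS if the backend exists and is available, else CPU.
use_mps = hasattr(torch.backends, "mps") and torch.backends.mps.is_available()
device = torch.device("mps" if use_mps else "cpu")

net = nn.Linear(784, 10).to(device)   # move the model once, up front

# Inside the training loop, move each batch before the forward pass:
X = torch.randn(32, 784)              # toy batch standing in for train_iter
y = torch.randint(0, 10, (32,))
X, y = X.to(device), y.to(device)
y_hat = net(X)                        # shape: (32, 10)
```

With this arrangement, train_ch3 can be called without any .to(device) on its (None) return value.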
Answered By - mjung