Issue
How can I jointly optimize the parameters of a model comprising two distinct neural networks with a single optimizer? Here is what I tried when initializing the optimizer:
optim_global = optim.Adam(zip(model1.parameters(), model2.parameters()))
but I get this error:
TypeError: optimizer can only optimize Tensors, but one of the params is tuple
Solution
`model.parameters()` returns a generator, so you can combine the two. Either use the unpacking operator `*` to collect both into a single list:
>>> optim.Adam([*model1.parameters(), *model2.parameters()])
Or chain them lazily with `itertools.chain` (after `from itertools import chain`):
>>> optim.Adam(chain(model1.parameters(), model2.parameters()))
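A minimal runnable sketch of both approaches, assuming two small linear layers as hypothetical stand-ins for `model1` and `model2`:

```python
import itertools

import torch.nn as nn
import torch.optim as optim

# Hypothetical example networks standing in for model1 and model2
model1 = nn.Linear(4, 8)
model2 = nn.Linear(8, 2)

# Option 1: unpack both parameter generators into a single list
opt_a = optim.Adam([*model1.parameters(), *model2.parameters()], lr=1e-3)

# Option 2: chain the generators lazily with itertools.chain
opt_b = optim.Adam(itertools.chain(model1.parameters(), model2.parameters()), lr=1e-3)

# Either way, the optimizer tracks all four tensors (two weights, two biases)
print(len(opt_a.param_groups[0]["params"]))  # 4
print(len(opt_b.param_groups[0]["params"]))  # 4
```

Both forms are equivalent; a single `optimizer.step()` then updates the parameters of both networks at once.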
Answered By - Ivan