Thanks a lot for sharing the code; it is really helpful.
I have a question about SeqMetaModel's ProtoMAML setting.
In this setting, init_weights and init_bias are initialized from ProtoVec.
ProtoVec is computed by self.learner (e.g., BERT),
so self.learner is already part of the computation graph.
proto_grads = torch.autograd.grad(loss, [p for p in self.learner.parameters() if p.requires_grad])
meta_grads = [mg + pg for (mg, pg) in zip(meta_grads, proto_grads)]
It seems proto_grads should not be added to meta_grads here: since ProtoVec stays in the graph, the gradient through the prototype path is already included when the loss is backpropagated, so adding proto_grads counts it twice.
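To make the concern concrete, here is a minimal toy sketch (a scalar parameter `w` stands in for self.learner's parameters and a made-up `proto = 3 * w` stands in for ProtoVec; this is not the repository's code). It shows that adding a separately computed prototype-path gradient is only correct when the prototype was detached from the graph first:

```python
import torch

# Toy stand-in: `w` plays the role of a learner parameter,
# `proto = 3 * w` plays the role of ProtoVec computed from the learner.
w = torch.tensor(2.0, requires_grad=True)

# Case 1: prototype kept attached to the graph (as in the issue).
proto = 3.0 * w
loss = w * proto                      # loss = 3 * w^2
(g_attached,) = torch.autograd.grad(loss, w)
# g_attached = 6 * w = 12.0: autograd already includes the prototype path.

# Case 2: prototype detached (graph cut before the inner loop).
proto_d = (3.0 * w).detach()
loss_d = w * proto_d                  # proto_d is treated as a constant
(g_partial,) = torch.autograd.grad(loss_d, w)
# g_partial = 6.0: the prototype path is missing.

# Separately computed prototype-path gradient (what proto_grads supplies):
proto2 = 3.0 * w
loss_p = w.detach() * proto2          # only the prototype path is live
(g_proto,) = torch.autograd.grad(loss_p, w)
# g_proto = 6.0

# Adding the prototype gradient is only correct in the detached case:
print((g_partial + g_proto).item())   # 12.0, matches g_attached
# In the attached case it double counts: g_attached + g_proto = 18, not 12.
```

So whether the `meta_grads = [mg + pg ...]` line is right depends on whether ProtoVec was detached before init_weights/init_bias were built from it.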
I would really appreciate your reply.