Accuracy does not change #75
Comments
Hey, did you ever solve this? I'm hitting the same problem: the accuracy barely changes when training with this strategy.
In my case the problem was the dataset, not the model. Some of the punctuation in the dataset was wrong at the time; once I cleaned it up, training worked fine.
I am using ALBERT together with a siamese network to train a subjective-question grading model, with the training strategy based on your code. The siamese network consists of a bidirectional LSTM and fully connected layers. During training I found that the accuracy never improves; it stays constant. It feels as if the weights are not being updated, possibly because the gradients are too small to change them noticeably. Or there may be a problem with the training strategy itself, but I am not sure of the exact cause. Below is my accuracy during training:
```python
import torch
import torch.nn as nn
import torch.optim as optim

class MetaTask(nn.Module):
    def __init__(self, args):
        super(MetaTask, self).__init__()
        self.device = 'cuda' if torch.cuda.is_available() else 'cpu'
        self.loss_fn = nn.CrossEntropyLoss()
        self.update_lr = args.update_lr
        self.meta_lr = args.meta_lr
        self.finetunning_lr = args.finetunning_lr
        self.n_way = args.n_way
        self.k_spt = args.k_spt
        self.k_qry = args.k_qry
        self.task_num = args.task_num
        self.update_step = args.update_step
        self.update_step_test = args.update_step_test
        self.net = SubjectiveGradingModel().to(self.device)
        self.meta_optim = optim.Adam(self.net.parameters(), lr=self.meta_lr)

class SubjectiveGradingModel(nn.Module):
    def __init__(self, hidden_size=384):
        super(SubjectiveGradingModel, self).__init__()
```
What could be causing this?
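One quick way to test the "weights are not updating" hypothesis is to print the total gradient norm after `backward()` and compare parameter snapshots before and after one optimizer step. The sketch below is a minimal, self-contained check using a tiny stand-in model (hypothetical, not the actual `SubjectiveGradingModel`); the same two checks can be applied to the real `self.net` and `self.meta_optim`:

```python
import torch
import torch.nn as nn

# Hypothetical tiny model standing in for SubjectiveGradingModel.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(4, 8)
y = torch.randint(0, 2, (4,))

# Snapshot parameters before the update.
before = [p.detach().clone() for p in model.parameters()]

opt.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()

# Check 1: are any gradients actually flowing (non-zero norm)?
grad_norm = sum(p.grad.norm().item() for p in model.parameters() if p.grad is not None)
print(f"total grad norm: {grad_norm:.6f}")

opt.step()

# Check 2: did the optimizer step actually move the weights?
moved = any((b - p.detach()).abs().max().item() > 0
            for b, p in zip(before, model.parameters()))
print("weights updated:", moved)
```

If the gradient norm is near zero, the problem is upstream of the optimizer (e.g. detached tensors or a saturated loss); if gradients are healthy but weights barely move, look at the learning rate or at which parameters were passed to the optimizer.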