Hello, while reproducing the code myself I found that the MMD loss takes a different value on every batch, with no downward trend, and the test accuracy on the target domain does not converge (sometimes high, sometimes low). Does the MMD regularization term actually participate in backpropagation? And is the non-converging target-domain test accuracy caused by each batch having a different data distribution? In other words, does each training step only align the distribution of a single batch, rather than the overall distribution of the test set? Thanks.
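Some batch-to-batch fluctuation is expected: the MMD computed on a mini-batch is a noisy estimate of the population MMD, so it varies even when both distributions are fixed. A minimal NumPy sketch (using a linear kernel for simplicity; `mmd2_linear` is a hypothetical helper, not this repo's code) illustrates this:

```python
import numpy as np

rng = np.random.default_rng(0)

def mmd2_linear(x, y):
    # Biased MMD^2 estimate with a linear kernel: ||mean(x) - mean(y)||^2.
    d = x.mean(axis=0) - y.mean(axis=0)
    return float(d @ d)

# Same fixed source/target distributions, but a fresh batch of 32 samples each time:
# the per-batch MMD^2 estimates fluctuate around the true value even though
# nothing about the distributions changes.
vals = [mmd2_linear(rng.normal(0.0, 1.0, (32, 8)),
                    rng.normal(0.5, 1.0, (32, 8)))
        for _ in range(5)]
print(vals)
```

So a non-constant per-batch MMD by itself does not mean the term is broken; only a sustained upward trend or no effect at any weight would suggest that.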
The MMD term does participate in backpropagation; its value is just too small for the effect to be visible. Try adjusting its weight.
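You can verify that a weighted MMD term produces gradients with a small PyTorch sketch (a Gaussian-kernel MMD on random features; `gaussian_mmd2` and `mmd_weight` are illustrative names, not this repo's code):

```python
import torch

def sq_dists(a, b):
    # Pairwise squared Euclidean distances (differentiable everywhere,
    # unlike backprop through a sqrt at distance zero).
    return (a.unsqueeze(1) - b.unsqueeze(0)).pow(2).sum(-1)

def gaussian_mmd2(x, y, sigma=1.0):
    # Biased MMD^2 estimate with an RBF kernel.
    k = lambda d2: torch.exp(-d2 / (2 * sigma ** 2))
    return k(sq_dists(x, x)).mean() + k(sq_dists(y, y)).mean() \
        - 2 * k(sq_dists(x, y)).mean()

torch.manual_seed(0)
src = torch.randn(32, 8, requires_grad=True)   # stand-in for source features
tgt = torch.randn(32, 8) + 1.0                 # stand-in for target features

mmd_weight = 10.0  # the knob to tune if the term seems to have no effect
loss = mmd_weight * gaussian_mmd2(src, tgt)
loss.backward()

print(src.grad.abs().sum().item() > 0)  # True: the MMD term does backpropagate
```

If raising the weight changes training behavior, the term is flowing through `backward()` as expected; if not, check that the MMD is added to the total loss before the backward call.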