
SAM, Generative loss explodes during validation phase #162

Open
patfl84 opened this issue Mar 7, 2024 · 0 comments
patfl84 commented Mar 7, 2024

Hi,

I'm using SAM with losstype="fgan". During the training epochs my loss stays close to -1, but during the test epochs it drops to around -1e22 and sometimes even NaN (with the generator and discriminator losses being negatives of each other). [Ref: https://github.com//issues/61]

[screenshot attached]

Do you have an idea of why this might happen?

Edit: I'm now seeing this happen randomly at any epoch, not just during the validation phase. It could be related to outlier samples, but that seems unlikely. I tried losstype="gan" instead, but that takes significantly longer per model (2 hrs vs. 5 minutes).

Also, can you explain the generator loss term for the fgan loss type? https://github.com/FenTechSolutions/CausalDiscoveryToolbox/blob/master/cdt/causality/graph/SAM.py#L328

gen_loss = -th.mean(th.exp(disc_vars_g - 1), [0, 2]).sum()

Can you also point me to where this is taken from?
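My current reading is that this term resembles the f-GAN generator objective with the Fenchel conjugate f*(t) = exp(t - 1) of the KL divergence, applied per variable, where disc_vars_g holds discriminator scores on generated samples. Here is a pure-Python sketch of the arithmetic (the [batch, variable, feature] shape interpretation is my assumption):

```python
import math

def fgan_gen_loss(disc_vars_g):
    """Pure-Python sketch of the quoted term:
        gen_loss = -th.mean(th.exp(disc_vars_g - 1), [0, 2]).sum()
    disc_vars_g: nested list of shape [B, V, F] (assumed layout)."""
    B = len(disc_vars_g)
    V = len(disc_vars_g[0])
    F = len(disc_vars_g[0][0])
    total = 0.0
    for v in range(V):
        s = 0.0
        for b in range(B):
            for f in range(F):
                s += math.exp(disc_vars_g[b][v][f] - 1)
        total += s / (B * F)  # mean over dims 0 and 2
    return -total  # sum over the variable dim, negated

# With discriminator scores at 1, exp(d - 1) = 1, so the loss is -V:
x = [[[1.0, 1.0], [1.0, 1.0]]]  # B=1, V=2, F=2
print(fgan_gen_loss(x))  # → -2.0
```

If that reading is right, the exponential would also explain the explosion: the term is unbounded as the discriminator scores grow, which is consistent with values like -1e22.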

Also, it seems like the backward pass is being run during the test epochs as well as the train epochs: https://github.com/FenTechSolutions/CausalDiscoveryToolbox/blob/master/cdt/causality/graph/SAM.py#L365

if epoch < train + test - 1:
        loss.backward()

Is this correct?
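To make my concern concrete, here is a pure-Python mock of that guard, counting which epochs would call loss.backward() (treating train and test as epoch counts, which is my assumption):

```python
def backward_epochs(train, test):
    """Return the epochs on which the quoted guard
        if epoch < train + test - 1: loss.backward()
    would run the backward pass, over train + test total epochs."""
    return [epoch for epoch in range(train + test)
            if epoch < train + test - 1]

# With train=3, test=2, backward runs on epochs 0..3 — i.e. it still
# runs during the first test epoch; only the very last epoch skips it.
print(backward_epochs(3, 2))  # → [0, 1, 2, 3]
```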

@patfl84 patfl84 changed the title [SAM] + Generative loss explodes during validation phase SAM, Generative loss explodes during validation phase Mar 7, 2024