
Loss function computation #24

Open
ThomasMrY opened this issue Dec 17, 2019 · 2 comments

ThomasMrY commented Dec 17, 2019

Hi, I have a question.
In the training process:

Forward prop

    outputs, _, _ = net(x_seq, grid_seq, hidden_states, cell_states, PedsList_seq, numPedsList_seq, dataloader, lookup_seq)

Compute loss

    loss = Gaussian2DLikelihood(outputs, x_seq, PedsList_seq, lookup_seq)

Why is the loss computed using outputs and x_seq rather than outputs and y_seq?
Thanks
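
For anyone else reading this thread: the repo's Gaussian2DLikelihood is not quoted here, but a bivariate-Gaussian negative log-likelihood of this kind is typically structured like the sketch below. This is a simplified illustration, not the repo's exact code: tensor shapes and the per-pedestrian masking via PedsList_seq/lookup_seq are omitted, and all names are illustrative.

    import math
    import torch

    def gaussian_2d_nll(outputs, targets):
        # outputs: (seq_len, num_peds, 5) -> mu_x, mu_y, sigma_x, sigma_y, rho
        # targets: (seq_len, num_peds, 2) -> ground-truth x, y positions
        mux, muy, sx, sy, rho = outputs.unbind(dim=-1)
        sx, sy = torch.exp(sx), torch.exp(sy)   # keep standard deviations positive
        rho = torch.tanh(rho)                   # keep the correlation in (-1, 1)

        dx = (targets[..., 0] - mux) / sx
        dy = (targets[..., 1] - muy) / sy
        one_minus_rho2 = 1.0 - rho ** 2

        z = dx ** 2 + dy ** 2 - 2.0 * rho * dx * dy
        log_prob = (-z / (2.0 * one_minus_rho2)
                    - torch.log(2.0 * math.pi * sx * sy * torch.sqrt(one_minus_rho2)))
        return -log_prob.mean()                 # average NLL over steps and pedestrians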

@quancore (Owner)

x_seq represents the observed part of a trajectory, and y_seq is the unknown part (to be predicted). During training, we use the known (observed) part of the trajectory to train the model and calculate the training loss.
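
To make that concrete, one common way to set up such teacher-forced training is sketched below: the observed positions serve both as inputs and, shifted by one step, as targets, which is why the loss is computed against x_seq rather than y_seq. The signature of net here is hypothetical and does not match the repo's actual interface.

    def training_step_sketch(net, x_seq, hidden, cell):
        # x_seq: (obs_len, num_peds, 2) observed positions
        inputs  = x_seq[:-1]    # positions at steps 0 .. obs_len - 2
        targets = x_seq[1:]     # positions at steps 1 .. obs_len - 1
        params, hidden, cell = net(inputs, hidden, cell)    # hypothetical signature
        # reuse gaussian_2d_nll from the sketch above: NLL of the observed next positions
        loss = gaussian_2d_nll(params, targets)
        return loss, hidden, cell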


xxAna commented Apr 23, 2020

@quancore Hello, I am also confused about the loss calculation. In that case, do you mean that the model is mainly learning the sequence relationships in the hidden states, so it doesn't matter whether its output is compared with the known (observed) part or the unknown part? But won't it affect the model's performance when predicting the unknown part, since it was always trained to predict the known part? Thank you, and I look forward to your reply.
