
RuntimeError: Error(s) in loading state_dict for DataParallel #152

Open

kongsa0419 opened this issue Mar 15, 2023 · 0 comments
First of all, thank you for your effort on this great work.

I was following the documentation, specifically this part: https://github.com/clovaai/stargan-v2#generating-interpolation-videos

I got an error: "RuntimeError: Error(s) in loading state_dict for DataParallel"
(The full error output is attached below.)

I now understand that the 'module.' prefix is the cause, because the networks are wrapped in torch.nn.DataParallel, but I'm not sure how to resolve the error.

Could anyone suggest the code to fix this?
(I have already seen #103.)


Namespace(batch_size=8, beta1=0.0, beta2=0.99, checkpoint_dir='expr/checkpoints/celeba_hq', ds_iter=100000, eval_dir='expr/eval', eval_every=50000, f_lr=1e-06, hidden_dim=512, img_size=256, inp_dir='assets/representative/custom/female', lambda_cyc=1, lambda_ds=1, lambda_reg=1, lambda_sty=1, latent_dim=16, lm_path='expr/checkpoints/celeba_lm_mean.npz', lr=0.0001, mode='sample', num_domains=2, num_outs_per_domain=10, num_workers=4, out_dir='assets/representative/celeba_hq/src/female', print_every=10, randcrop_prob=0.5, ref_dir='assets/representative/celeba_hq/ref', result_dir='expr/results/celeba_hq', resume_iter=100000, sample_dir='expr/samples', sample_every=5000, save_every=10000, seed=777, src_dir='assets/representative/celeba_hq/src', style_dim=64, total_iters=100000, train_img_dir='data/celeba_hq/train', val_batch_size=32, val_img_dir='data/celeba_hq/val', w_hpf=1.0, weight_decay=0.0001, wing_path='expr/checkpoints/wing.ckpt')
Number of parameters of generator: 43467395
Number of parameters of mapping_network: 2438272
Number of parameters of style_encoder: 20916928
Number of parameters of discriminator: 20852290
Number of parameters of fan: 6333603
Initializing generator...
Initializing mapping_network...
Initializing style_encoder...
Initializing discriminator...
Preparing DataLoader for the generation phase...
Preparing DataLoader for the generation phase...
Loading checkpoint from expr/checkpoints/celeba_hq\100000_nets_ema.ckpt...
Traceback (most recent call last):
File "main.py", line 182, in <module>
main(args)
File "main.py", line 73, in main
solver.sample(loaders)
File "C:\Users\Admin\anaconda3\envs\stargan-v2\lib\site-packages\torch\autograd\grad_mode.py", line 28, in decorate_context
return func(*args, **kwargs)
File "C:\stargan-v2\core\solver.py", line 178, in sample
self._load_checkpoint(args.resume_iter)
File "C:\stargan-v2\core\solver.py", line 73, in _load_checkpoint
ckptio.load(step)
File "C:\stargan-v2\core\checkpoint.py", line 50, in load
module.load_state_dict(module_dict[name])
File "C:\Users\Admin\anaconda3\envs\stargan-v2\lib\site-packages\torch\nn\modules\module.py", line 1483, in load_state_dict
self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for DataParallel:
Missing key(s) in state_dict: "module.from_rgb.weight", "module.from_rgb.bias", "module.encode.0.conv1.weight", "module.encode.0.conv1.bias", "module.encode.0.conv2.weight", "module.encode.0.conv2.bias", "module.encode.0.norm1.weight", "module.encode.0.norm1.bias", "module.encode.0.norm2.weight", "module.encode.0.norm2.bias", "module.encode.0.conv1x1.weight", "module.encode.1.conv1.weight", "module.encode.1.conv1.bias", "module.encode.1.conv2.weight", "module.encode.1.conv2.bias", "module.encode.1.norm1.weight", "module.encode.1.norm1.bias", "module.encode.1.norm2.weight", "module.encode.1.norm2.bias", "module.encode.1.conv1x1.weight", "module.encode.2.conv1.weight", "module.encode.2.conv1.bias", "module.encode.2.conv2.weight", "module.encode.2.conv2.bias", "module.encode.2.norm1.weight", "module.encode.2.norm1.bias", "module.encode.2.norm2.weight", "module.encode.2.norm2.bias", "module.encode.2.conv1x1.weight", "module.encode.3.conv1.weight", "module.encode.3.conv1.bias", "module.encode.3.conv2.weight", "module.encode.3.conv2.bias", "module.encode.3.norm1.weight", "module.encode.3.norm1.bias", "module.encode.3.norm2.weight", "module.encode.3.norm2.bias", "module.encode.4.conv1.weight", "module.encode.4.conv1.bias", "module.encode.4.conv2.weight", "module.encode.4.conv2.bias", "module.encode.4.norm1.weight", "module.encode.4.norm1.bias", "module.encode.4.norm2.weight", "module.encode.4.norm2.bias", "module.encode.5.conv1.weight", "module.encode.5.conv1.bias", "module.encode.5.conv2.weight", "module.encode.5.conv2.bias", "module.encode.5.norm1.weight", "module.encode.5.norm1.bias", "module.encode.5.norm2.weight", "module.encode.5.norm2.bias", "module.encode.6.conv1.weight", "module.encode.6.conv1.bias", "module.encode.6.conv2.weight", "module.encode.6.conv2.bias", "module.encode.6.norm1.weight", "module.encode.6.norm1.bias", "module.encode.6.norm2.weight", "module.encode.6.norm2.bias", "module.decode.0.conv1.weight", "module.decode.0.conv1.bias", 
"module.decode.0.conv2.weight", "module.decode.0.conv2.bias", "module.decode.0.norm1.fc.weight", "module.decode.0.norm1.fc.bias", "module.decode.0.norm2.fc.weight", "module.decode.0.norm2.fc.bias", "module.decode.1.conv1.weight", "module.decode.1.conv1.bias", "module.decode.1.conv2.weight", "module.decode.1.conv2.bias", "module.decode.1.norm1.fc.weight", "module.decode.1.norm1.fc.bias", "module.decode.1.norm2.fc.weight", "module.decode.1.norm2.fc.bias", "module.decode.2.conv1.weight", "module.decode.2.conv1.bias", "module.decode.2.conv2.weight", "module.decode.2.conv2.bias", "module.decode.2.norm1.fc.weight", "module.decode.2.norm1.fc.bias", "module.decode.2.norm2.fc.weight", "module.decode.2.norm2.fc.bias", "module.decode.3.conv1.weight", "module.decode.3.conv1.bias", "module.decode.3.conv2.weight", "module.decode.3.conv2.bias", "module.decode.3.norm1.fc.weight", "module.decode.3.norm1.fc.bias", "module.decode.3.norm2.fc.weight", "module.decode.3.norm2.fc.bias", "module.decode.4.conv1.weight", "module.decode.4.conv1.bias", "module.decode.4.conv2.weight", "module.decode.4.conv2.bias", "module.decode.4.norm1.fc.weight", "module.decode.4.norm1.fc.bias", "module.decode.4.norm2.fc.weight", "module.decode.4.norm2.fc.bias", "module.decode.4.conv1x1.weight", "module.decode.5.conv1.weight", "module.decode.5.conv1.bias", "module.decode.5.conv2.weight", "module.decode.5.conv2.bias", "module.decode.5.norm1.fc.weight", "module.decode.5.norm1.fc.bias", "module.decode.5.norm2.fc.weight", "module.decode.5.norm2.fc.bias", "module.decode.5.conv1x1.weight", "module.decode.6.conv1.weight", "module.decode.6.conv1.bias", "module.decode.6.conv2.weight", "module.decode.6.conv2.bias", "module.decode.6.norm1.fc.weight", "module.decode.6.norm1.fc.bias", "module.decode.6.norm2.fc.weight", "module.decode.6.norm2.fc.bias", "module.decode.6.conv1x1.weight", "module.to_rgb.0.weight", "module.to_rgb.0.bias", "module.to_rgb.2.weight", "module.to_rgb.2.bias", "module.hpf.filter".
Unexpected key(s) in state_dict: "from_rgb.weight", "from_rgb.bias", "encode.0.conv1.weight", "encode.0.conv1.bias", "encode.0.conv2.weight", "encode.0.conv2.bias", "encode.0.norm1.weight", "encode.0.norm1.bias", "encode.0.norm2.weight", "encode.0.norm2.bias", "encode.0.conv1x1.weight", "encode.1.conv1.weight", "encode.1.conv1.bias", "encode.1.conv2.weight", "encode.1.conv2.bias", "encode.1.norm1.weight", "encode.1.norm1.bias", "encode.1.norm2.weight", "encode.1.norm2.bias", "encode.1.conv1x1.weight", "encode.2.conv1.weight", "encode.2.conv1.bias", "encode.2.conv2.weight", "encode.2.conv2.bias", "encode.2.norm1.weight", "encode.2.norm1.bias", "encode.2.norm2.weight", "encode.2.norm2.bias", "encode.2.conv1x1.weight", "encode.3.conv1.weight", "encode.3.conv1.bias", "encode.3.conv2.weight", "encode.3.conv2.bias", "encode.3.norm1.weight", "encode.3.norm1.bias", "encode.3.norm2.weight", "encode.3.norm2.bias", "encode.4.conv1.weight", "encode.4.conv1.bias", "encode.4.conv2.weight", "encode.4.conv2.bias", "encode.4.norm1.weight", "encode.4.norm1.bias", "encode.4.norm2.weight", "encode.4.norm2.bias", "encode.5.conv1.weight", "encode.5.conv1.bias", "encode.5.conv2.weight", "encode.5.conv2.bias", "encode.5.norm1.weight", "encode.5.norm1.bias", "encode.5.norm2.weight", "encode.5.norm2.bias", "encode.6.conv1.weight", "encode.6.conv1.bias", "encode.6.conv2.weight", "encode.6.conv2.bias", "encode.6.norm1.weight", "encode.6.norm1.bias", "encode.6.norm2.weight", "encode.6.norm2.bias", "decode.0.conv1.weight", "decode.0.conv1.bias", "decode.0.conv2.weight", "decode.0.conv2.bias", "decode.0.norm1.fc.weight", "decode.0.norm1.fc.bias", "decode.0.norm2.fc.weight", "decode.0.norm2.fc.bias", "decode.1.conv1.weight", "decode.1.conv1.bias", "decode.1.conv2.weight", "decode.1.conv2.bias", "decode.1.norm1.fc.weight", "decode.1.norm1.fc.bias", "decode.1.norm2.fc.weight", "decode.1.norm2.fc.bias", "decode.2.conv1.weight", "decode.2.conv1.bias", "decode.2.conv2.weight", "decode.2.conv2.bias", 
"decode.2.norm1.fc.weight", "decode.2.norm1.fc.bias", "decode.2.norm2.fc.weight", "decode.2.norm2.fc.bias", "decode.3.conv1.weight", "decode.3.conv1.bias", "decode.3.conv2.weight", "decode.3.conv2.bias", "decode.3.norm1.fc.weight", "decode.3.norm1.fc.bias", "decode.3.norm2.fc.weight", "decode.3.norm2.fc.bias", "decode.4.conv1.weight", "decode.4.conv1.bias", "decode.4.conv2.weight", "decode.4.conv2.bias", "decode.4.norm1.fc.weight", "decode.4.norm1.fc.bias", "decode.4.norm2.fc.weight", "decode.4.norm2.fc.bias", "decode.4.conv1x1.weight", "decode.5.conv1.weight", "decode.5.conv1.bias", "decode.5.conv2.weight", "decode.5.conv2.bias", "decode.5.norm1.fc.weight", "decode.5.norm1.fc.bias", "decode.5.norm2.fc.weight", "decode.5.norm2.fc.bias", "decode.5.conv1x1.weight", "decode.6.conv1.weight", "decode.6.conv1.bias", "decode.6.conv2.weight", "decode.6.conv2.bias", "decode.6.norm1.fc.weight", "decode.6.norm1.fc.bias", "decode.6.norm2.fc.weight", "decode.6.norm2.fc.bias", "decode.6.conv1x1.weight", "to_rgb.0.weight", "to_rgb.0.bias", "to_rgb.2.weight", "to_rgb.2.bias".
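For context on the mismatch: the checkpoint was saved from a bare model (keys like `from_rgb.weight`), while the loading code wraps each network in `nn.DataParallel`, which expects keys prefixed with `module.` (e.g. `module.from_rgb.weight`). A minimal sketch of one common workaround, using a hypothetical helper `add_module_prefix` that remaps the keys before loading (not part of the repo's code):

```python
def add_module_prefix(state_dict):
    """Prepend 'module.' to checkpoint keys saved from a bare model so
    they match the keys expected by a torch.nn.DataParallel wrapper."""
    return {k if k.startswith("module.") else "module." + k: v
            for k, v in state_dict.items()}
```

With this helper, the failing call in `core/checkpoint.py` could become `module.load_state_dict(add_module_prefix(module_dict[name]))`. An alternative is to bypass the wrapper entirely and load into the underlying model with `module.module.load_state_dict(module_dict[name])`; both approaches are common fixes for this class of error, but verify against your local code before applying.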
