I get a MemoryError when trying to train a big model. Is there any way to train using all available GPUs?
I am using a Linux machine (gcloud) with 8 GPUs.
Traceback (most recent call last):
File "main.py", line 445, in <module>
train_model(parameters, args.dataset)
File "main.py", line 73, in train_model
dataset = build_dataset(params)
File "/mnt/sdc200/nmt-keras/data_engine/prepare_data.py", line 229, in build_dataset
saveDataset(ds, params['DATASET_STORE_PATH'])
File "/mnt/sdc200/nmt-keras/src/keras-wrapper/keras_wrapper/dataset.py", line 52, in saveDataset
pk.dump(dataset, open(store_path, 'wb'), protocol=-1)
MemoryError
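The failure happens while pickling the whole dataset object to disk in one call, before training even starts. A minimal sketch of that save path (the `dataset` dict here is a placeholder, not the real keras_wrapper Dataset, and `store_path` is chosen for illustration): using a context manager and `pickle.HIGHEST_PROTOCOL` (what `protocol=-1` resolves to) is the same operation the traceback shows, so if the object itself is too large to serialize in memory, this reproduces the issue in isolation.

```python
import pickle
import tempfile
import os

# Placeholder for the Dataset object that fails to serialize; the real
# failure occurs in keras_wrapper's saveDataset via pk.dump(dataset, ...).
dataset = {"samples": list(range(1000))}

with tempfile.TemporaryDirectory() as tmpdir:
    store_path = os.path.join(tmpdir, "dataset.pkl")

    # protocol=-1 in the traceback means "highest available protocol";
    # the context manager ensures the file handle closes even on failure.
    with open(store_path, "wb") as f:
        pickle.dump(dataset, f, protocol=pickle.HIGHEST_PROTOCOL)

    # Round-trip to confirm the serialized dataset loads back intact.
    with open(store_path, "rb") as f:
        restored = pickle.load(f)
```

If this minimal version succeeds but the real save fails, the dataset object itself (not the pickling code) is exhausting RAM, and reducing the in-memory dataset size or the machine's memory pressure is the place to look.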