
deel-ai/relu-prime


Numerical influence of ReLU’(0) on backpropagation.

This repository contains the code and results for the NeurIPS 2021 submission "Numerical influence of ReLU’(0) on backpropagation".
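For context, ReLU is not differentiable at 0, so frameworks must pick an arbitrary value for ReLU’(0) (usually 0); the paper studies how this choice affects training numerically. A minimal NumPy sketch of a ReLU derivative with a configurable value at 0 (the function names are illustrative, not taken from this repository):

```python
import numpy as np

def relu(x):
    # Standard ReLU forward pass.
    return np.maximum(x, 0.0)

def relu_grad(x, alpha=0.0):
    # Derivative of ReLU, with the (arbitrary) value at exactly 0 set to alpha.
    g = (x > 0).astype(x.dtype)
    g[x == 0] = alpha
    return g

x = np.array([-1.0, 0.0, 2.0])
print(relu_grad(x, alpha=0.0))  # [0. 0. 1.]
print(relu_grad(x, alpha=1.0))  # [0. 1. 1.]
```

The two calls differ only at the input that is exactly 0, which is precisely where the choice of ReLU’(0) reaches the backpropagated gradient.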

All the data generated from the experiments are located in paper_results.
All the figures from the paper are generated with this notebook, this notebook and this script.

Code for all the experiments:

  • To run the experiments from Section 4.3:

    python train_with_best_lr.py --network [NETWORK] --dataset [DATASET] --batch_norm [BATCH_NORM] --epochs [EPOCHS]

    with [NETWORK] = mnist, vgg11 or resnet18, [DATASET] = mnist, cifar10 or svhn, and [BATCH_NORM] = True or False.

    Example:

    python train_with_best_lr.py --network resnet18 --dataset cifar10 --batch_norm True --epochs 200 
  • Additional experiments:
    To run the additional experiments:

    python train_with_best_lr.py --network [NETWORK] --dataset [DATASET] --batch_norm [BATCH_NORM] --epochs 200

    To run the imagenet experiment:

    python train_imagenet.py --dist-url 'tcp://127.0.0.1:9002' --dist-backend 'nccl' --relu [ALPHA] --multiprocessing-distributed --world-size 1 --rank 0 '{[IMAGENET_FOLDER_PATH]}'
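The --relu [ALPHA] flag above selects a value for ReLU’(0). A hedged PyTorch sketch of how such an activation can be built with a custom autograd Function (the class name ReLUAlpha is hypothetical and not the repository's implementation):

```python
import torch

class ReLUAlpha(torch.autograd.Function):
    """ReLU whose derivative at exactly 0 is a chosen constant alpha."""

    @staticmethod
    def forward(ctx, input, alpha):
        ctx.save_for_backward(input)
        ctx.alpha = alpha
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        (input,) = ctx.saved_tensors
        # 1 where input > 0, 0 where input < 0, alpha where input == 0.
        grad = torch.where(input > 0, torch.ones_like(input),
                           torch.zeros_like(input))
        grad = torch.where(input == 0, torch.full_like(input, ctx.alpha), grad)
        # No gradient with respect to alpha.
        return grad_output * grad, None

x = torch.tensor([-1.0, 0.0, 2.0], requires_grad=True)
ReLUAlpha.apply(x, 1.0).sum().backward()
print(x.grad)  # tensor([0., 1., 1.])
```

With alpha = 0.0 the gradient at the zero entry would be 0 instead of 1, matching the default convention of most frameworks.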

The code used to generate the figures is available here.
