Neural Architecture Search Framework


NAS Framework, as the name suggests, is a framework that facilitates neural architecture search on various datasets. It provides a simple, flexible way to define a search space of arbitrary complexity, together with an Architect class that works without modification in any search space defined following the template.

An Architect is a recurrent neural network that generates computational graph descriptions: it recursively builds a representation of the graph predicted so far and chooses an action (i.e., a point along a particular dimension of the search space) according to a policy that receives that representation as input.
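The sampling loop above can be sketched as follows. This is a hedged illustration, not the package's actual API: `sample_description`, `uniform_policy`, and the toy search space are all hypothetical names, and the uniform policy stands in for the real recurrent network.

```python
import random

def sample_description(search_space, policy, state):
    """Sketch of the Architect's sampling loop (names are hypothetical):
    at each step the recurrent policy sees the partial description and
    returns action probabilities for the next search-space dimension,
    plus an updated recurrent state."""
    description = []
    for dimension in search_space:              # e.g. layer type, hidden size
        probs, state = policy(state, description, dimension)
        description.append(random.choices(dimension, weights=probs)[0])
    return description

def uniform_policy(state, description, dimension):
    # Toy stand-in for the real RNN policy: uniform over the options.
    return [1.0] * len(dimension), state

# A toy two-dimensional search space: layer type and hidden size.
space = [["linear", "lstm"], [64, 128, 256]]
desc = sample_description(space, uniform_policy, state=None)
```

Each call yields one complete description (one choice per dimension), which can then be compiled into a child network and evaluated.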

The Architect is trained with reinforcement learning, specifically an actor-critic variation of the Proximal Policy Optimization (PPO) algorithm, on various datasets.

The package is currently in alpha and supports Multilayer Perceptron and Recurrent Neural Network search spaces out of the box. If those two search spaces satisfy your needs, then all you need to do to perform a neural architecture search is wrap your data in a torch.utils.data.Dataset and slightly modify the toxic_worker in scripts.train_toxic.
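A Dataset wrapper can be as small as the sketch below. The class and field names here are hypothetical, not part of the package; a map-style torch.utils.data.Dataset only needs `__len__` and `__getitem__` (the import falls back to a plain class so the sketch runs even without torch installed).

```python
try:
    from torch.utils.data import Dataset
except ImportError:  # fallback so the sketch runs without torch installed
    Dataset = object

class CommentDataset(Dataset):
    """Hedged sketch of a Dataset for the framework (names hypothetical)."""

    def __init__(self, texts, labels):
        self.texts, self.labels = texts, labels

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        # In practice you would tokenize/numericalize the text here.
        return self.texts[idx], self.labels[idx]

ds = CommentDataset(["a comment", "another one"], [0, 1])
```

An instance like `ds` can then be passed to a torch DataLoader in the worker function.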

Installation

Install the package via pip by running:

pip install git+https://github.com/VladislavZavadskyy/nas-framework

Running the demo

A demo is included that performs a search for an RNN architecture on the Jigsaw Toxic Comment dataset. To run it, follow these steps:

  1. Create a directory named data.
  2. Download train.csv.zip from the Kaggle competition page and unpack it into data/toxic.
  3. Download pretrained fasttext embeddings, unpack them, and place them in the data directory.
  4. Run nas toxic (append the --help option to see available arguments).

During the search, a logs directory (or the one specified with --log-dir) will be created, containing information about the search process. You can also view the descriptions being evaluated, child-network training progress, and other info by running a tensorboard server in that directory.

Getting the best found description

To get the best found description, run nas find_best with the path to description_reward.json as an argument. See nas find_best --help for options.