Merge pull request #135 from FluxML/type-piracy
Remove optimiser type piracy. Make `optimiser` and `builder` "deep" hyper-parameters
ablaom authored May 17, 2021
2 parents efa3a4c + b16db0a commit 3ef95ae
Showing 18 changed files with 488 additions and 1,681 deletions.
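For context, the headline change (making `optimiser` and `builder` "deep" hyper-parameters) means that nested fields of these objects, such as the optimiser's learning rate, now count as hyper-parameters in their own right. A minimal sketch of the resulting workflow, mirroring the usage in the updated iris example in this diff (assumes MLJ, MLJFlux and Flux are installed):

```julia
using MLJ
import MLJFlux, Flux

# Load the model type provided by MLJFlux.
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux

clf = NeuralNetworkClassifier()

# `optimiser` is now a "deep" hyper-parameter: mutating one of its
# fields is registered as a model change, so MLJ detects it on the
# next fit! of an associated machine (as in the iris example below).
clf.optimiser.eta = 2 * clf.optimiser.eta
clf.epochs += 5
```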
2 changes: 0 additions & 2 deletions .github/workflows/ci.yml
Original file line number Diff line number Diff line change
@@ -17,8 +17,6 @@ jobs:
fail-fast: false
matrix:
version:
-- '1.3'
-- '1.4'
- '1'
- 'nightly'
os:
20 changes: 10 additions & 10 deletions Project.toml
@@ -1,7 +1,7 @@
name = "MLJFlux"
uuid = "094fc8d1-fd35-5302-93ea-dabda2abf845"
authors = ["Anthony D. Blaom <[email protected]>", "Ayush Shridhar <[email protected]>"]
-version = "0.1.11"
+version = "0.1.12"

[deps]
CategoricalArrays = "324d7699-5711-5eae-9e2f-1d82baa6b597"
@@ -15,15 +15,15 @@ Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
Tables = "bd369af6-aec1-5ad0-b16a-f7cc5008161c"

[compat]
-CategoricalArrays = "^0.10"
-ColorTypes = "^0.10.3, 0.11"
-ComputationalResources = "^0.3.2"
-Flux = "^0.10.4, ^0.11, 0.12"
-LossFunctions = "^0.5, ^0.6"
-MLJModelInterface = "^0.4.1, 1"
-ProgressMeter = "^1.1"
-Tables = "^1.0"
-julia = "^1.3"
+CategoricalArrays = "0.10"
+ColorTypes = "0.10.3, 0.11"
+ComputationalResources = "0.3.2"
+Flux = "0.10.4, 0.11, 0.12"
+LossFunctions = "0.5, 0.6"
+MLJModelInterface = "1.1"
+ProgressMeter = "1.1"
+Tables = "1.0"
+julia = "1.6"

[extras]
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
176 changes: 95 additions & 81 deletions examples/iris/iris.ipynb

Large diffs are not rendered by default.

4 changes: 3 additions & 1 deletion examples/iris/iris.jl
@@ -4,6 +4,8 @@ using Pkg
Pkg.activate(@__DIR__)
Pkg.instantiate()

+# **Julia version** is assumed to be 1.6.*

using MLJ
using Flux
import RDatasets
@@ -75,5 +77,5 @@ plot(curve.parameter_values,
savefig("iris_history.png")

using Literate #src
-Literate.markdown(@__FILE__, @__DIR__, execute=true) #src
+Literate.markdown(@__FILE__, @__DIR__, execute=false) #src
Literate.notebook(@__FILE__, @__DIR__, execute=true) #src
73 changes: 13 additions & 60 deletions examples/iris/iris.md
@@ -1,14 +1,18 @@
```@meta
-EditURL = "<unknown>/iris.jl"
+EditURL = "<unknown>/../../MLJFlux/examples/iris/iris.jl"
```

# Using MLJ with Flux to train the iris dataset

-```julia
+```@example iris
using Pkg
Pkg.activate(@__DIR__)
Pkg.instantiate()
```

+**Julia version** is assumed to be 1.6.*
+
+```@example iris
using MLJ
using Flux
import RDatasets
@@ -23,11 +27,6 @@ pyplot(size=(600, 300*(sqrt(5)-1)));
nothing #hide
```

-```
-Activating environment at `~/Dropbox/Julia7/MLJ/MLJFlux/examples/iris/Project.toml`
-```

Following is a very basic introductory example, using a default
builder and no standardization of input features.

@@ -36,91 +35,46 @@ example](https://github.com/FluxML/MLJFlux.jl/blob/dev/examples/mnist).

## Loading some data and instantiating a model

-```julia
+```@example iris
iris = RDatasets.dataset("datasets", "iris");
y, X = unpack(iris, ==(:Species), colname -> true, rng=123);
NeuralNetworkClassifier = @load NeuralNetworkClassifier
clf = NeuralNetworkClassifier()
```

-```
-NeuralNetworkClassifier(
-    builder = Short(
-            n_hidden = 0,
-            dropout = 0.5,
-            σ = NNlib.σ),
-    finaliser = NNlib.softmax,
-    optimiser = ADAM(0.001, (0.9, 0.999), IdDict{Any,Any}()),
-    loss = Flux.Losses.crossentropy,
-    epochs = 10,
-    batch_size = 1,
-    lambda = 0.0,
-    alpha = 0.0,
-    optimiser_changes_trigger_retraining = false,
-    acceleration = CPU1{Nothing}(nothing)) @252
-```

## Incremental training

-```julia
+```@example iris
import Random.seed!; seed!(123)
mach = machine(clf, X, y)
fit!(mach)
training_loss = cross_entropy(predict(mach, X), y) |> mean
```

-```
-0.8993467f0
-```

Increasing learning rate and adding iterations:

-```julia
+```@example iris
clf.optimiser.eta = clf.optimiser.eta * 2
clf.epochs = clf.epochs + 5
fit!(mach, verbosity=2);
nothing #hide
```

-```
-┌ Info: Updating Machine{NeuralNetworkClassifier{Short,…},…} @545.
-└ @ MLJBase /Users/anthony/.julia/packages/MLJBase/4DmTL/src/machines.jl:342
-┌ Info: Loss is 0.853
-└ @ MLJFlux /Users/anthony/.julia/packages/MLJFlux/wj7HX/src/core.jl:122
-┌ Info: Loss is 0.8207
-└ @ MLJFlux /Users/anthony/.julia/packages/MLJFlux/wj7HX/src/core.jl:122
-┌ Info: Loss is 0.8072
-└ @ MLJFlux /Users/anthony/.julia/packages/MLJFlux/wj7HX/src/core.jl:122
-┌ Info: Loss is 0.752
-└ @ MLJFlux /Users/anthony/.julia/packages/MLJFlux/wj7HX/src/core.jl:122
-┌ Info: Loss is 0.7077
-└ @ MLJFlux /Users/anthony/.julia/packages/MLJFlux/wj7HX/src/core.jl:122
-```

-```julia
+```@example iris
training_loss = cross_entropy(predict(mach, X), y) |> mean
```

-```
-0.7076617f0
-```

## Accessing the Flux chain (model)

-```julia
+```@example iris
chain = fitted_params(mach).chain
```

-```
-Chain(Chain(Dense(4, 3, σ), Dropout(0.5), Dense(3, 3)), softmax)
-```

## Evolution of out-of-sample performance

-```julia
+```@example iris
r = range(clf, :epochs, lower=1, upper=200, scale=:log10)
curve = learning_curve(clf, X, y,
range=r,
@@ -133,9 +87,8 @@ plot(curve.parameter_values,
xscale=curve.parameter_scale,
ylab = "Cross Entropy")
```
-![](3397330029.png)

-```julia
+```@example iris
savefig("iris_history.png")
```

Binary file modified examples/iris/iris_history.png
