fixing layer docs #209

Merged · 7 commits · Mar 11, 2024
Changes from all commits
4 changes: 2 additions & 2 deletions docs/pages.jl
@@ -7,7 +7,7 @@ pages = [
"Echo State Network Tutorials" => Any[
"Lorenz System Forecasting" => "esn_tutorials/lorenz_basic.md",
#"Mackey-Glass Forecasting on GPU" => "esn_tutorials/mackeyglass_basic.md",
"Using Different Layers" => "esn_tutorials/change_layers.md",
#"Using Different Layers" => "esn_tutorials/change_layers.md",
"Using Different Reservoir Drivers" => "esn_tutorials/different_drivers.md",
#"Using Different Training Methods" => "esn_tutorials/different_training.md",
"Deep Echo State Networks" => "esn_tutorials/deep_esn.md",
@@ -17,7 +17,7 @@ pages = [
"States Modifications" => "api/states.md",
"Prediction Types" => "api/predict.md",
"Echo State Networks" => "api/esn.md",
"ESN Layers" => "api/esn_layers.md",
#"ESN Layers" => "api/esn_layers.md",
"ESN Drivers" => "api/esn_drivers.md",
"ReCA" => "api/reca.md"]
]
11 changes: 0 additions & 11 deletions docs/src/api/esn.md
@@ -6,17 +6,6 @@ The core component of an ESN is the `ESN` type. It represents the entire Echo State Network.
ESN
```

## Variations

In addition to the standard `ESN` model, there are variations that allow for deeper customization of the underlying model. Currently, there are two available variations: `Default` and `Hybrid`. These variations provide different ways to configure the ESN. Here's the documentation for the variations:

```@docs
Default
Hybrid
```

The `Hybrid` variation is the most complex option and offers additional customization. Note that more variations may be added in the future to provide even greater flexibility.

## Training

To train an ESN model, you can use the `train` function. It takes the ESN model, training data, and other optional parameters as input and returns a trained model. Here's the documentation for the train function:
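As a quick sketch of the `train` call described here (the positional constructor arguments follow the updated API used elsewhere in this PR; data sizes and the ridge coefficient are illustrative):

```julia
using ReservoirComputing

# illustrative 3-dimensional input and one-step-ahead target series
input_data  = rand(3, 500)
target_data = rand(3, 500)

# build a standard ESN (3 input dimensions, 200-unit reservoir)
esn = ESN(input_data, 3, 200)

# fit a linear readout with ridge regression
output_layer = train(esn, target_data, StandardRidge(1e-6))
```

The returned `output_layer` can then be passed back to the trained `esn` together with a prediction type such as `Generative` or `Predictive`.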
71 changes: 0 additions & 71 deletions docs/src/api/esn_layers.md

This file was deleted.

7 changes: 1 addition & 6 deletions docs/src/api/training.md
@@ -2,14 +2,9 @@

## Linear Models

```@docs
StandardRidge
LinearModel
```

## Gaussian Regression

Currently, v0.9 is unavailable.
Currently, v0.10 is unavailable.

## Support Vector Regression

99 changes: 0 additions & 99 deletions docs/src/esn_tutorials/change_layers.md

This file was deleted.

15 changes: 6 additions & 9 deletions docs/src/esn_tutorials/deep_esn.md
@@ -21,6 +21,7 @@ end
#solve and take data
prob = ODEProblem(lorenz!, [1.0, 0.0, 0.0], (0.0, 200.0))
data = solve(prob, ABM54(), dt = 0.02)
data = reduce(hcat, data.u)

#determine shift length, training length and prediction length
shift = 300
@@ -40,20 +41,18 @@ The construction of the ESN is also really similar. The only difference is that
```@example deep_lorenz
using ReservoirComputing

reservoirs = [RandSparseReservoir(99, radius = 1.1, sparsity = 0.1),
RandSparseReservoir(100, radius = 1.2, sparsity = 0.1),
RandSparseReservoir(200, radius = 1.4, sparsity = 0.1)]
reservoirs = [rand_sparse(; radius = 1.1, sparsity = 0.1),
rand_sparse(; radius = 1.2, sparsity = 0.1),
rand_sparse(; radius = 1.4, sparsity = 0.1)]

esn = ESN(input_data;
variation = Default(),
esn = DeepESN(input_data, 3, 200;
reservoir = reservoirs,
input_layer = DenseLayer(),
reservoir_driver = RNN(),
nla_type = NLADefault(),
states_type = StandardStates())
```

As it is possible to see, different sizes can be chosen for the different reservoirs. The input layer and bias can also be given as vectors, but of course, they have to be of the same size of the reservoirs vector. If they are not passed as a vector, the value passed will be used for all the layers in the deep ESN.
The input layer and bias can also be given as vectors, but they must be the same size as the reservoirs vector. If they are not passed as a vector, the value passed will be used for all the layers in the deep ESN.

In addition to using the provided functions for the construction of the layers, the user can also choose to build their own matrix, or array of matrices, and feed that into the `ESN` in the same way.
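To illustrate the vector form, a sketch passing one input-layer initializer per reservoir (reusing the `input_data` from this tutorial; the per-layer radii are illustrative):

```julia
using ReservoirComputing

# one reservoir initializer per layer, with different spectral radii
reservoirs   = [rand_sparse(; radius = 1.1, sparsity = 0.1),
                rand_sparse(; radius = 1.2, sparsity = 0.1),
                rand_sparse(; radius = 1.4, sparsity = 0.1)]
# one input-layer initializer per reservoir
input_layers = [scaled_rand, scaled_rand, scaled_rand]

esn = DeepESN(input_data, 3, 200;
    reservoir = reservoirs,
    input_layer = input_layers)
```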

@@ -89,8 +88,6 @@ plot(p1, p2, p3, plot_title = "Lorenz System Coordinates",
legendfontsize = 12, titlefontsize = 20)
```

Note that there is a known bug at the moment with using `WeightedLayer` as the input layer with the deep ESN. We are in the process of investigating and solving it. The leak coefficient for the reservoirs has to always be the same in the current implementation. This is also something we are actively looking into expanding.

## Documentation

[^1]: Gallicchio, Claudio, and Alessio Micheli. "_Deep echo state network (deepesn): A brief survey._" arXiv preprint arXiv:1712.04323 (2017).
24 changes: 12 additions & 12 deletions docs/src/esn_tutorials/different_drivers.md
@@ -72,9 +72,9 @@ case5 = MRNN([tanh, f4], 0.9, [0.43, 0.13])
#tests
test_cases = [base_case, case3, case4, case5]
for case in test_cases
esn = ESN(training_input,
input_layer = WeightedLayer(scaling = 0.3),
reservoir = RandSparseReservoir(100, radius = 0.4),
esn = ESN(training_input, 1, 100,
input_layer = weighted_init(; scaling = 0.3),
reservoir = rand_sparse(; radius = 0.4),
reservoir_driver = case,
states_type = ExtendedStates())
wout = train(esn, training_target, StandardRidge(10e-6))
@@ -186,19 +186,19 @@ res_size = 300
res_radius = 1.4

Random.seed!(42)
esn = ESN(training_input;
reservoir = RandSparseReservoir(res_size, radius = res_radius),
esn = ESN(training_input, 1, res_size;
reservoir = rand_sparse(; radius = res_radius),
reservoir_driver = GRU())
```

The default inner reservoir and input layer for the GRU are the same defaults for the `reservoir` and `input_layer` of the ESN. One can use the explicit call if they choose to.

```@example gru
gru = GRU(reservoir = [RandSparseReservoir(res_size),
RandSparseReservoir(res_size)],
inner_layer = [DenseLayer(), DenseLayer()])
esn = ESN(training_input;
reservoir = RandSparseReservoir(res_size, radius = res_radius),
gru = GRU(reservoir = [rand_sparse,
rand_sparse],
inner_layer = [scaled_rand, scaled_rand])
esn = ESN(training_input, 1, res_size;
reservoir = rand_sparse(; radius = res_radius),
reservoir_driver = gru)
```

@@ -230,8 +230,8 @@ It is interesting to see a comparison of the GRU driven ESN and the standard RNN
```@example gru
using StatsBase

esn_rnn = ESN(training_input;
reservoir = RandSparseReservoir(res_size, radius = res_radius),
esn_rnn = ESN(training_input, 1, res_size;
reservoir = rand_sparse(; radius = res_radius),
reservoir_driver = RNN())

output_layer = train(esn_rnn, training_target, training_method)
2 changes: 1 addition & 1 deletion docs/src/esn_tutorials/hybrid.md
@@ -70,7 +70,7 @@ The training and prediction of the Hybrid ESN can proceed as usual:

```@example hybrid
output_layer = train(hesn, target_data, StandardRidge(0.3))
output = esn(Generative(predict_len), output_layer)
output = hesn(Generative(predict_len), output_layer)
```

It is now possible to plot the results, leveraging Plots.jl:
1 change: 1 addition & 0 deletions docs/src/esn_tutorials/lorenz_basic.md
@@ -19,6 +19,7 @@ end
#solve and take data
prob = ODEProblem(lorenz!, [1.0, 0.0, 0.0], (0.0, 200.0))
data = solve(prob, ABM54(), dt = 0.02)
data = reduce(hcat, data.u)
```

After obtaining the data, it is necessary to determine the kind of prediction for the model. Since this example will use the `Generative` prediction type, the target data will be the next step of the input data. In addition, note that the Lorenz system obtained here exhibits a transient period that is not representative of the general behavior of the system. This can easily be discarded by setting a `shift` parameter.
4 changes: 2 additions & 2 deletions docs/src/reca_tutorials/reca.md
@@ -11,8 +11,8 @@ The data can be read as follows:
```@example reca
using DelimitedFiles

input = readdlm("./5bitinput.txt", ',', Float32)
output = readdlm("./5bitoutput.txt", ',', Float32)
input = readdlm("./5bitinput.txt", ',', Float64)
output = readdlm("./5bitoutput.txt", ',', Float64)
```

To use a ReCA model, it is necessary to define the rule one intends to use. To do so, ReservoirComputing.jl leverages [CellularAutomata.jl](https://github.com/MartinuzziFrancesco/CellularAutomata.jl) that needs to be called as well to define the `RECA` struct:
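For context, a minimal sketch of that rule definition and `RECA` construction (the rule number and encoding sizes are illustrative, and `input` refers to the data read in the previous snippet):

```julia
using ReservoirComputing, CellularAutomata

# elementary cellular automaton, rule 90, from CellularAutomata.jl
ca = DCA(90)

# build the ReCA model on top of the chosen rule
reca = RECA(input, ca;
    generations = 16,
    input_encoding = RandomMapping(16, 40))
```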
5 changes: 3 additions & 2 deletions src/ReservoirComputing.jl
@@ -18,10 +18,11 @@ export StandardRidge
export scaled_rand, weighted_init, informed_init, minimal_init
export rand_sparse, delay_line, delay_line_backward, cycle_jumps, simple_cycle, pseudo_svd
export RNN, MRNN, GRU, GRUParams, FullyGated, Minimal
export ESN, train
export train
export ESN
export HybridESN, KnowledgeModel
export DeepESN
export RECA, train
export RECA
export RandomMapping, RandomMaps
export Generative, Predictive, OutputLayer
