
[Breaking] Change weights generation to follow WeightInitializers.jl #184

Closed

MartinuzziFrancesco opened this issue on Dec 18, 2023 · 5 comments
Labels: good first issue, v0.10

@MartinuzziFrancesco (Member)

In order to streamline the package and the creation of reservoir computing models, the generation of the weight matrices should follow WeightInitializers.jl.

MartinuzziFrancesco added the good first issue label on Dec 18, 2023
MartinuzziFrancesco changed the title from "Change weights generation to follow WeightInitializers.jl" to "[Breaking] Change weights generation to follow WeightInitializers.jl" on Dec 19, 2023
@Jay-sanjay (Contributor)

Hi @MartinuzziFrancesco, I guess I can take a look at this if you can elaborate a bit on what exactly needs to be done here.

@MartinuzziFrancesco (Member, Author)

Hi, of course! At the moment the construction of the input layer matrices and reservoir matrices relies on abstract typing and dispatch over different structs for the different matrix architectures, which is a bit cumbersome. I want to simplify it and have it follow the standard set by WeightInitializers.jl. This would also give us an easier way to control randomness in ReservoirComputing.jl. For example, `SparseRandReservoir` would look like this:

```julia
using ReservoirComputing, Random
rng = MersenneTwister(42)

# Passing kwargs with an explicit rng call
res_cl = rand_sparse(rng; sparsity=0.1, spectral_radius=1.0)
# res_cl is now a callable function that takes in the size of the reservoir
reservoir_weights = res_cl(rng, 10, 10)

# Passing kwargs with the default rng call
res_cl = rand_sparse(; sparsity=0.1, spectral_radius=1.0)
# res_cl is now a callable function that takes in the size of the reservoir
reservoir_weights = res_cl(10, 10)
```

For the details of the implementation you can refer to the source of WeightInitializers.jl. I could sketch `rand_sparse` during this week if that would be of help; a rough sketch is included below. This needs to be done for both the input layers and the reservoirs.
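For reference, here is a minimal sketch of what `rand_sparse` could look like under this convention. The defaults mirror the kwargs above, but the dispatch details, element types, and exact scaling logic are assumptions and will likely differ in the final implementation:

```julia
using Random, LinearAlgebra, SparseArrays

# Full method: draw a sparse random matrix, center its nonzero entries
# around zero, then rescale it to the requested spectral radius.
# (Assumes square dims, as is the case for a reservoir.)
function rand_sparse(rng::AbstractRNG, dims::Integer...;
                     sparsity=0.1, spectral_radius=1.0)
    reservoir = Matrix(sprand(rng, Float64, dims..., sparsity))
    @. reservoir = (reservoir != 0) * (reservoir - 0.5)
    return reservoir .* (spectral_radius / maximum(abs, eigvals(reservoir)))
end

# Partial applications: capture the kwargs (and optionally the rng) and
# return a callable that only needs the reservoir size, mirroring the
# two usage patterns above.
function rand_sparse(rng::AbstractRNG; kwargs...)
    closure(r::AbstractRNG, dims::Integer...) = rand_sparse(r, dims...; kwargs...)
    closure(dims::Integer...) = rand_sparse(rng, dims...; kwargs...)
    return closure
end

function rand_sparse(; kwargs...)
    closure(r::AbstractRNG, dims::Integer...) = rand_sparse(r, dims...; kwargs...)
    closure(dims::Integer...) = rand_sparse(Random.default_rng(), dims...; kwargs...)
    return closure
end
```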

Of course, after this we need to modify the ESN constructor, but that should be rather straightforward; something along the lines of the sketch below.
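As a hypothetical illustration, the constructor could then accept the initializers directly; the `scaled_rand` input-layer initializer and the exact keyword names here are assumptions:

```julia
using ReservoirComputing, Random

rng = MersenneTwister(42)
data = rand(rng, 3, 100)  # toy input series: 3 features, 100 steps

# Hypothetical constructor call: initializer functions are passed
# directly instead of layer/reservoir structs.
esn = ESN(data;
          reservoir = rand_sparse(; sparsity=0.1, spectral_radius=1.2),
          input_layer = scaled_rand(; scaling=0.1))
```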

This issue is part of a streamlining effort to bring the package more in line with the (F)Lux model-building philosophy. It should also make contributions a little easier.

@MartinuzziFrancesco (Member, Author)

@Jay-sanjay I have started a PR in #193; you can follow the template to create the remaining functions for the reservoirs and input layers. I will need to think about a couple of details (and tests as well).

@Jay-sanjay (Contributor)

Thanks @MartinuzziFrancesco for the template, I will start working on the other functions.

@MartinuzziFrancesco (Member, Author)

Addressed in #196 and finalized in #203.
