
[Feature Request] Expand RNN Options and Algorithm Flexibility #220

Open
mtnusf97 opened this issue Dec 14, 2023 · 2 comments
Labels
enhancement New feature or request

Comments

@mtnusf97

🚀 Feature

I suggest expanding the system's recurrent components by introducing various recurrent neural networks (RNNs) such as the vanilla RNN and GRU, and perhaps some lesser-known networks like the LMU and ctRNN. Additionally, I propose compatibility with other RL algorithms beyond PPO, specifically A2C.

Motivation

The motivation is to enhance flexibility, allowing users to choose from a diverse set of recurrent networks and RL algorithms.

Pitch

Introduce different recurrent network options for different RL algorithms such as A2C, giving users a more comprehensive toolkit for designing and experimenting with RL agents that use recurrent components.
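For concreteness, here is a minimal PyTorch sketch of what a GRU recurrent core (one of the proposed options) could look like; `GRUCore` is a hypothetical name for illustration, not an existing class in this repo:

```python
import torch
import torch.nn as nn

class GRUCore(nn.Module):
    """Hypothetical GRU core: one candidate drop-in for the LSTM used by
    LstmPPO/RecurrentPPO. Tensors are shaped (seq_len, batch, dim)."""

    def __init__(self, input_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.gru = nn.GRU(input_dim, hidden_dim)

    def forward(self, obs_seq: torch.Tensor, hidden: torch.Tensor):
        # A GRU carries a single hidden tensor (no cell state like the LSTM),
        # which simplifies storing and resetting states in the rollout buffer.
        return self.gru(obs_seq, hidden)

# Usage: an 8-step rollout from 2 parallel envs with 4-dim observations.
core = GRUCore(input_dim=4)
features, h1 = core(torch.zeros(8, 2, 4), torch.zeros(1, 2, 64))
print(features.shape, h1.shape)  # (8, 2, 64) and (1, 2, 64)
```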

Alternatives

Focus on LstmPPO: While effective, this limits exploration and potentially misses out on the strengths of other RNNs.

Develop custom algorithms: This is resource-intensive and may not be as widely applicable as expanding existing options.

Additional context

I have already implemented most of these features in my personal repository and successfully utilized them in my research.

Checklist

  • I have checked that there is no similar issue in the repo
  • If I'm requesting a new feature, I have proposed alternatives
mtnusf97 added the enhancement label on Dec 14, 2023
araffin added the Maintainers on vacation label on Dec 14, 2023
@masterdezign

Hi @mtnusf97, I am working on #201 so I may add several types of recurrent networks to SAC.

araffin removed the Maintainers on vacation label on Jan 10, 2024
@araffin
Member

araffin commented Jan 10, 2024

> I propose compatibility with other RL algorithms beyond PPO, specifically A2C.

A2C is already covered by the recurrent PPO implementation, since A2C is a special case of PPO: https://arxiv.org/abs/2205.09123
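As a sketch of the mapping from the linked paper (keyword names assume RecurrentPPO mirrors SB3's PPO signature), A2C-like updates can be recovered by disabling the PPO-specific machinery; with a single epoch the importance ratio is exactly 1, so the clipping term never triggers:

```python
from sb3_contrib import RecurrentPPO

# Sketch: A2C as a special case of recurrent PPO.
model = RecurrentPPO(
    "MlpLstmPolicy",
    "CartPole-v1",
    n_steps=5,                  # short rollouts, as in A2C
    batch_size=5,               # full rollout as one minibatch (1 env here)
    n_epochs=1,                 # single gradient step per rollout => ratio = 1
    gae_lambda=1.0,             # plain returns instead of GAE smoothing
    normalize_advantage=False,  # A2C does not normalize advantages
)
model.learn(total_timesteps=10_000)
```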

> introducing various recurrent neural networks (RNNs) such as the vanilla RNN and GRU, and perhaps some lesser-known networks like the LMU and ctRNN.

> I have already implemented most of these features in my personal repository and successfully utilized them in my research.

Do you have a benchmark to share? And are you willing to implement and benchmark those alternatives? (I would start with GRU only at first.)
Adding more options will add complexity to an already complex algorithm, so we should only do that if it is really beneficial.
