k2 integration #283

Open
bonham79 opened this issue Dec 7, 2024 · 0 comments

bonham79 (Collaborator) commented Dec 7, 2024

https://github.com/k2-fsa/k2

I recently played around with the library above. It offers really fast construction of FSAs on GPU. Since a lot of our models rely on monotonic assumptions, having access to lattice construction with autograd support would give us some cool novel architectures for our set of tasks.

Some extensions we could build on this:

  1. Push the monotonic transducer and edit-action transducers to use trainable lattices.
  2. Integrate our models with n-gram language models. This can improve training by preventing illegal constructions not seen in the training data.
  3. Support the creation of CTC/transducer models for NLP tasks.
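To make the "trainable lattice" idea concrete, here is a plain-Python sketch of the log-semiring total-score (forward algorithm) computation that k2 runs batched on GPU with autograd. The arc-list representation and function names here are illustrative assumptions for exposition, not k2's actual API:

```python
import math

def logadd(a: float, b: float) -> float:
    """log(exp(a) + exp(b)), numerically stable."""
    if a == -math.inf:
        return b
    if b == -math.inf:
        return a
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def total_score(num_states: int, arcs, final_state: int) -> float:
    """Forward algorithm: log-sum of the scores of all paths from
    state 0 to final_state.

    arcs: list of (src, dst, score) tuples; state IDs are assumed
    to be topologically ordered, as in an acyclic lattice.
    """
    alpha = [-math.inf] * num_states
    alpha[0] = 0.0
    for src, dst, score in sorted(arcs):  # process arcs in src order
        alpha[dst] = logadd(alpha[dst], alpha[src] + score)
    return alpha[final_state]

# Tiny 3-state lattice with two paths from state 0 to state 2:
#   0 -> 1 -> 2 with probability 0.6 * 0.5 = 0.3
#   0 ------> 2 with probability 0.4
arcs = [(0, 1, math.log(0.6)), (1, 2, math.log(0.5)), (0, 2, math.log(0.4))]
print(total_score(3, arcs, 2))  # log(0.3 + 0.4) = log(0.7)
```

Because every operation here is a sum/logsumexp over arc scores, swapping the floats for framework tensors makes the whole lattice score differentiable with respect to the arc weights, which is what would let a monotonic transducer train through its lattice.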

N.B. On a self-serving note: a little birdy told me k2 wasn't getting much love these days, so I'd like to keep it around alongside the OpenGRM/Pynini/Thrax family to keep it alive. (Implementing GPU-based FSA algorithms would be a nifty master's project for CUNY folks.)

@bonham79 bonham79 added the enhancement New feature or request label Dec 7, 2024
@bonham79 bonham79 self-assigned this Dec 7, 2024