Releases: FluxML/MLJFlux.jl
v0.1.14
MLJFlux v0.1.14
Merged pull requests:
v0.1.13
MLJFlux v0.1.13
Merged pull requests:
- Small typo in paper (#133) (@ayush-1506)
- rm LossFunctions from [deps] (#138) (@ablaom)
v0.1.12
MLJFlux v0.1.12
v0.1.11
MLJFlux v0.1.11
Merged pull requests:
v0.1.10
MLJFlux v0.1.10
- extend compat for Flux to include 0.12
- extend compat for MLJModelInterface to include 1.0
Closed issues:
- Switch code coverage to use Codecov? (#82)
- Specifying non-supported acceleration option should trigger fallback to CPU1() (#104)
- Add support for Flux version 0.12? (#107)
Merged pull requests:
- Add acceleration cleanup in clean! (#105) (@ayush-1506)
- CompatHelper: bump compat for "Flux" to "0.12" (#110) (@github-actions[bot])
- Revert "CompatHelper: bump compat for "Flux" to "0.12"" (#111) (@DilumAluthge)
- CompatHelper: bump compat for "Flux" to "0.12" (#116) (@github-actions[bot])
- Update the MNIST example to make use of IteratedModel wrapper (#125) (@ablaom)
- Extend [compat] for MLJModelInterface to include 1.0, and bump MLJFlux version (#126) (@ablaom)
- For a 0.1.10 release (#127) (@ablaom)
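The compat extensions noted for this release correspond to Project.toml entries along these lines (a sketch only; the bounds shown are illustrative, not copied from the repository):

```toml
# Illustrative [compat] section after the bumps in #110/#116 and #126
[compat]
Flux = "0.11, 0.12"
MLJModelInterface = "0.3, 1.0"
```

Each entry lists the version ranges Pkg's resolver may select for that dependency.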
v0.1.9
MLJFlux v0.1.9
- Add support for MLJ's forthcoming iteration interface (#100, JuliaAI/MLJ.jl#139)
Merged pull requests:
v0.1.8
MLJFlux v0.1.8
Closed issues:
- Update examples/ or remove to new dir outdated/ (#56)
- Add a GPU example to the README? (#78)
- Replace train! with a custom loop (#89)
- Suppress gpu testing in PR to dev branch? (#92)
Merged pull requests:
- Examples review (#90) (@ablaom)
- Doc improvements. No new release (#91) (@ablaom)
- Only run Buildkite (GPU CI) on PRs to master (#93) (@DilumAluthge)
- Custom training loop (#95) (@ayush-1506)
- For a 0.1.8 release (#96) (@ablaom)
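PR #95 replaced calls to Flux.train! with a hand-written training loop. As a rough illustration (not MLJFlux's actual code), an explicit loop in the implicit-parameters style of Flux 0.11/0.12 looks like this; the toy model, loss, and data below are made up for the sketch:

```julia
using Flux

# Illustrative only: an explicit training loop of the kind that can
# replace Flux.train!, using the implicit-parameters API (Flux 0.11/0.12).
model = Chain(Dense(4, 8, relu), Dense(8, 1))      # toy model
loss(x, y) = Flux.Losses.mse(model(x), y)          # toy loss
opt = ADAM(0.01)
ps = Flux.params(model)                            # trainable parameters

x = rand(Float32, 4, 16)                           # toy batch
y = rand(Float32, 1, 16)
for epoch in 1:10
    gs = gradient(() -> loss(x, y), ps)  # gradients w.r.t. ps
    Flux.update!(opt, ps, gs)            # one optimiser step
end
```

An explicit loop like this lets per-epoch losses be recorded and surfaced to callers, which is awkward to do through train!'s callback mechanism.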
v0.1.7
MLJFlux v0.1.7
Closed issues:
Merged pull requests:
v0.1.6
MLJFlux v0.1.6
v0.1.5
MLJFlux v0.1.5
Merged pull requests: