
Releases: FluxML/MLJFlux.jl

v0.1.14

28 May 21:57
b18dc40

MLJFlux v0.1.14

Diff since v0.1.13

v0.1.13

21 May 05:48
0387c2c

MLJFlux v0.1.13

Diff since v0.1.12

v0.1.12

17 May 22:57
3ef95ae

MLJFlux v0.1.12

Diff since v0.1.11

  • Remove type piracy for optimisers (#135)
  • Fix incorrect cold-start behaviour (#124)
  • Make builder and optimiser "deep" hyper-parameters (#135)

Closed issues:

  • Cold training restart where warm restart is wanted (#124)
  • Fix type piracy for optimisers (#129)

Merged pull requests:

  • Remove optimiser type piracy. Make optimiser and builder "deep" hyper-parameters (#135) (@ablaom)

v0.1.11

01 May 03:14
53fe3bb

MLJFlux v0.1.11

Diff since v0.1.10

Merged pull requests:

  • paper review (#123) (@ayush-1506)
  • CompatHelper: bump compat for "CategoricalArrays" to "0.10" (#128) (@github-actions[bot])
  • CompatHelper: bump compat for "ColorTypes" to "0.11" (#130) (@github-actions[bot])
  • Update Project.toml (#131) (@ablaom)
  • for a 0.1.11 release (#132) (@ablaom)

v0.1.10

21 Apr 00:57
f52a7de

MLJFlux v0.1.10

Diff since v0.1.9

  • extend compat for Flux to include 0.12
  • extend compat for MLJModelInterface to include 1.0

Closed issues:

  • Switch code coverage to use Codecov? (#82)
  • Specifying non-supported acceleration option should trigger fallback to CPU1() (#104)
  • Add support for Flux version 0.12? (#107)

Merged pull requests:

  • Add acceleration cleanup in clean! (#105) (@ayush-1506)
  • CompatHelper: bump compat for "Flux" to "0.12" (#110) (@github-actions[bot])
  • Revert "CompatHelper: bump compat for "Flux" to "0.12"" (#111) (@DilumAluthge)
  • CompatHelper: bump compat for "Flux" to "0.12" (#116) (@github-actions[bot])
  • Update the MNIST example to make use of IteratedModel wrapper (#125) (@ablaom)
  • Extend [compat] for MLJModelInterface to include 1.0, and bump MLJFlux version (#126) (@ablaom)
  • For a 0.1.10 release (#127) (@ablaom)

v0.1.9

18 Mar 00:04
a322ccf

MLJFlux v0.1.9

Diff since v0.1.8

v0.1.8

03 Mar 00:52
0c09fc3

MLJFlux v0.1.8

Diff since v0.1.7

Closed issues:

  • Update examples/ or remove to new dir outdated/ (#56)
  • Add a GPU example to the README? (#78)
  • Replace train! with a custom loop (#89)
  • Suppress gpu testing in PR to dev branch? (#92)

v0.1.7

23 Feb 18:59
03f17d2

MLJFlux v0.1.7

Diff since v0.1.6

Closed issues:

  • CompatHelper failing (#55)
  • TODO: Set up the GPU continuous integration (CI) for MLJFlux.jl (#65)

v0.1.6

29 Jan 01:16
b66d575

MLJFlux v0.1.6

Diff since v0.1.5

v0.1.5

02 Dec 05:49
6d6f5ee

MLJFlux v0.1.5

Diff since v0.1.4

Merged pull requests:

  • Move from Travis CI to GitHub Actions CI (#72) (@DilumAluthge)
  • switch ci to gh action & extend [compat] CategoricalArrays="...,0.9" (#73) (@ablaom)
  • Extend [compat] Flux = "...,0.11" (#74) (@ablaom)
  • For a 0.1.5 release (#75) (@ablaom)