
Bump torchmetrics from 0.11.0 to 1.0.2 #31

Closed
dependabot[bot] wants to merge 1 commit from dependabot/pip/torchmetrics-1.0.2

Conversation

dependabot[bot] commented on behalf of GitHub on Aug 4, 2023

Bumps torchmetrics from 0.11.0 to 1.0.2.

Release notes

Sourced from torchmetrics's releases.

Weekly patch release

[1.0.2] - 2023-08-03

Added

  • Added warning to PearsonCorrCoef if input has a very small variance for its given dtype (#1926)

Changed

  • Changed all non-task specific classification metrics to be true subtypes of Metric (#1963)

Fixed

  • Fixed bug in CalibrationError where calculations for double-precision input were performed in float precision (#1919)
  • Fixed bug related to the prefix/postfix arguments in MetricCollection and ClasswiseWrapper being duplicated (#1918)
  • Fixed missing AUC score when plotting classification metrics that support the score argument (#1948)

Contributors

@borda, @SkafteNicki

If we forgot someone due to not matching commit email with GitHub account, let us know :]

Weekly patch release

[1.0.1] - 2023-07-13

Fixed

  • Fixed corner case when using MetricCollection together with aggregation metrics (#1896)
  • Fixed the use of max_fpr in AUROC metric when only one class is present (#1895)
  • Fixed bug related to empty predictions for IntersectionOverUnion metric (#1892)
  • Fixed bug related to MeanMetric and broadcasting of weights when NaNs are present (#1898)
  • Fixed bug related to expected input format of pycoco in MeanAveragePrecision (#1913)

Contributors

@fansuregrin, @SkafteNicki

If we forgot someone due to not matching commit email with GitHub account, let us know :]

Visualize metrics

We are happy to announce that the first major release of Torchmetrics, version v1.0, is publicly available. We have worked hard on a couple of new features for this milestone release, and with v1.0.0 we have also passed 100 implemented metrics in torchmetrics.

Plotting

The big new feature of v1.0 is built-in plotting. As the old saying goes: "A picture is worth a thousand words." This holds for metrics as well: in many cases a figure showcases a metric far better than a list of floats. The only requirement for getting started with the plotting feature is installing matplotlib, either with pip install matplotlib or pip install torchmetrics[visual] (the latter also installs Scienceplots and uses it as the default plotting style).
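As a quick illustration of the new plotting hook, here is a minimal sketch, assuming matplotlib is installed; BinaryAccuracy and the random data are arbitrary choices for the example, not anything specific to this PR:

```python
import torch
from torchmetrics.classification import BinaryAccuracy

# Hypothetical example data; any preds/target pair of matching shape works.
preds = torch.rand(100)               # predicted probabilities
target = torch.randint(0, 2, (100,))  # ground-truth labels

metric = BinaryAccuracy()
metric.update(preds, target)

# .plot() is the v1.0 visualization entry point; it returns a regular
# matplotlib figure and axes that can be styled or saved as usual.
fig, ax = metric.plot()
fig.savefig("binary_accuracy.png")
```

With the torchmetrics[visual] extra installed, the same call should render using the Scienceplots style mentioned above.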

... (truncated)

Changelog

Sourced from torchmetrics's changelog.

[1.0.2] - 2023-08-02

Added

  • Added warning to PearsonCorrCoef if input has a very small variance for its given dtype (#1926)

Changed

  • Changed all non-task specific classification metrics to be true subtypes of Metric (#1963)

Fixed

  • Fixed bug in CalibrationError where calculations for double-precision input were performed in float precision (#1919)
  • Fixed bug related to the prefix/postfix arguments in MetricCollection and ClasswiseWrapper being duplicated (#1918)
  • Fixed missing AUC score when plotting classification metrics that support the score argument (#1948)

[1.0.1] - 2023-07-13

Fixed

  • Fixed corner case when using MetricCollection together with aggregation metrics (#1896)
  • Fixed the use of max_fpr in AUROC metric when only one class is present (#1895)
  • Fixed bug related to empty predictions for IntersectionOverUnion metric (#1892)
  • Fixed bug related to MeanMetric and broadcasting of weights when NaNs are present (#1898)
  • Fixed bug related to expected input format of pycoco in MeanAveragePrecision (#1913)

[1.0.0] - 2023-07-04

Added

  • Added prefix and postfix arguments to ClasswiseWrapper (#1866)
  • Added speech-to-reverberation modulation energy ratio (SRMR) metric (#1792, #1872)
  • Added new global arg compute_with_cache to control caching behaviour after compute method (#1754)
  • Added ComplexScaleInvariantSignalNoiseRatio for audio package (#1785)
  • Added Running wrapper for calculating running statistics (#1752) (see the sketch after this list)
  • Added RelativeAverageSpectralError and RootMeanSquaredErrorUsingSlidingWindow to image package (#816)
  • Added support for SpecificityAtSensitivity Metric (#1432)
  • Added support for plotting of metrics through the .plot() method (#1328, #1481, #1480, #1490, #1581, #1585, #1593, #1600, #1605, #1610, #1609,

... (truncated)
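For the Running wrapper added in 1.0.0, a hedged usage sketch; the MeanMetric base and window=5 are arbitrary choices for illustration, not anything mandated by the release notes:

```python
import torch
from torchmetrics.aggregation import MeanMetric
from torchmetrics.wrappers import Running

# Wrap a base metric so that compute() reflects only the most recent
# `window` updates instead of the full update history.
running_mean = Running(MeanMetric(), window=5)

for step in range(20):
    running_mean.update(torch.tensor(float(step)))

# With a window of 5, only the last five updates (15..19) should
# contribute, giving an expected running mean of 17.0.
print(running_mean.compute())
```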

Commits
  • 81fe19c releasing 1.0.2
  • 0652d66 Convert classification wrapper to metrics (#1963)
  • 5494811 docs: revert links in gallery & adjust img/icon path (#1964)
  • b82df5e Fix missing AUC score when plotting (#1948)
  • 97eb3f6 build(deps): bump pypa/gh-action-pypi-publish from 1.8.7 to 1.8.8 (#1967)
  • 84fc622 building docs with py3.9 (#1958)
  • a12ad99 revert sphinx to 5.3 to recover icon with anchor links (#1956)
  • d601498 docs: remove empty (,,,) from plots docstrings (#1965)
  • e4cf2e7 Fix AttributeError: module 'matplotlib' has no attribute 'axes' (#1955)
  • aae4a64 Revert "docs: resolving relative path to images (#1930)"
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [torchmetrics](https://github.com/Lightning-AI/torchmetrics) from 0.11.0 to 1.0.2.
- [Release notes](https://github.com/Lightning-AI/torchmetrics/releases)
- [Changelog](https://github.com/Lightning-AI/torchmetrics/blob/master/CHANGELOG.md)
- [Commits](https://github.com/Lightning-AI/torchmetrics/compare/v0.11.0...v1.0.2)

---
updated-dependencies:
- dependency-name: torchmetrics
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <[email protected]>
dependabot[bot] added the dependencies label (Pull requests that update a dependency file) on Aug 4, 2023
dependabot[bot] (author) commented on behalf of GitHub on Aug 9, 2023

Superseded by #36.

dependabot[bot] closed this on Aug 9, 2023
dependabot[bot] deleted the dependabot/pip/torchmetrics-1.0.2 branch on August 9, 2023 at 03:55
Labels: dependencies (Pull requests that update a dependency file)