
Bump mlflow from 1.30.1 to 2.14.2 #20

Open · wants to merge 1 commit into main

Conversation

@dependabot (dependabot[bot]) commented on behalf of GitHub on Aug 8, 2024

Bumps mlflow from 1.30.1 to 2.14.2.

Release notes

Sourced from mlflow's releases.

MLflow 2.14.2 is a patch release that includes several important bug fixes and documentation enhancements.

Bug fixes:

  • [Models] Fix an issue with requirements inference error handling when disabling the default warning-only behavior (#12547, @​B-Step62)
  • [Models] Fix dependency inference issues with Transformers models saved with the unified API llm/v1/xxx task definitions. (#12551, @​B-Step62)
  • [Models / Databricks] Fix an issue with MLflow log_model, introduced in MLflow 2.13.0, that causes the Databricks DLT service to crash in some situations (#12514, @WeichenXu123)
  • [Models] Fix an output data structure issue with the predict_stream implementation for LangChain AgentExecutor and other non-Runnable chains (#12518, @​B-Step62)
  • [Tracking] Fix an issue with the predict_proba inference method in the sklearn flavor when loading an sklearn pipeline object as pyfunc (#12554, @WeichenXu123; see the sketch after this list)
  • [Tracking] Fix an issue with the Tracing implementation where other services' usage of OpenTelemetry would activate MLflow tracing and cause errors (#12457, @B-Step62)
  • [Tracking / Databricks] Correct an issue when running dependency inference in Databricks that can cause duplicate dependency entries to be logged (#12493, @​sunishsheth2009)
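
To make the predict_proba fix above concrete, here is a minimal sketch, assuming mlflow>=2.14.2 and scikit-learn are installed. The iris data, the pipeline, and the use of pyfunc_predict_fn are illustrative choices and are not taken from this PR or the release notes.

```python
# Hedged sketch: log a scikit-learn pipeline so the pyfunc flavor serves
# predict_proba, then load it back as pyfunc (the code path the fix touches).
import mlflow
import mlflow.pyfunc
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
pipeline = Pipeline(
    [("scale", StandardScaler()), ("clf", LogisticRegression(max_iter=200))]
)
pipeline.fit(X, y)

with mlflow.start_run() as run:
    # pyfunc_predict_fn routes the generic pyfunc predict() to predict_proba
    mlflow.sklearn.log_model(pipeline, "model", pyfunc_predict_fn="predict_proba")

loaded = mlflow.pyfunc.load_model(f"runs:/{run.info.run_id}/model")
print(loaded.predict(X[:5]))  # class probabilities rather than hard labels
```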

Documentation updates:

Small bug fixes and documentation updates:

#12311, #12285, #12535, #12543, #12320, #12444, @​B-Step62; #12310, #12340, @​serena-ruan; #12409, #12432, #12471, #12497, #12499, @​harupy; #12555, @​nojaf; #12472, #12431, @​xq-yin; #12530, #12529, #12528, #12527, #12526, #12524, #12531, #12523, #12525, #12522, @​dbczumar; #12483, @​jsuchome; #12465, #12441, @​BenWilson2; #12450, @​StarryZhang-whu

MLflow 2.14.1 is a patch release that contains several bug fixes and documentation improvements.

Bug fixes:

Documentation updates:

Small bug fixes and documentation updates:

#12415, #12396, #12394, @​harupy; #12403, #12382, @​BenWilson2; #12397, @​B-Step62

v2.14.0

2.14.0 (2024-06-17)

MLflow 2.14.0 includes several major features and improvements that we're very excited to announce!

Major features:

  • MLflow Tracing: Tracing is a powerful tool designed to enhance your ability to monitor, analyze, and debug GenAI applications by allowing you to inspect the intermediate outputs generated as your application handles a request. This update comes with an automatic LangChain integration to make it as easy as possible to get started, but we've also implemented high-level fluent APIs and low-level client APIs for users who want more control over their trace instrumentation. For more information, check out the guide in our docs! (A short fluent-API sketch follows this feature list.)
  • Unity Catalog Integration: The MLflow Deployments server now has an integration with Unity Catalog, allowing you to leverage registered functions as tools for enhancing your chat application. For more information, check out this guide!
  • OpenAI Autologging: Autologging support has now been added for the OpenAI model flavor. With this feature, MLflow will automatically log a model upon calling the OpenAI API. Each time a request is made, the inputs and outputs will be logged as artifacts. Check out the guide for more information! (A short autologging sketch also follows this list.)
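
The feature descriptions above do not include code, so here is a minimal sketch of the fluent tracing API, assuming mlflow>=2.14 and that mlflow.trace and mlflow.start_span are the fluent entry points being referenced; the function and span names are made up for illustration.

```python
# Hedged sketch of fluent tracing: a decorated function becomes a span, and
# start_span gives finer-grained control over a span's inputs and outputs.
import mlflow


@mlflow.trace  # record each call to this function as a span
def retrieve_context(question: str) -> str:
    return f"context for: {question}"


def answer(question: str) -> str:
    with mlflow.start_span(name="answer") as span:
        span.set_inputs({"question": question})
        context = retrieve_context(question)
        response = f"answer based on {context}"
        span.set_outputs({"response": response})
        return response


answer("What does MLflow Tracing capture?")
```

And a similarly hedged sketch of OpenAI autologging, assuming mlflow>=2.14, the openai>=1.x client, and an OPENAI_API_KEY in the environment; the model name is an arbitrary example, not something the release notes specify.

```python
# Hedged sketch of OpenAI autologging: after enabling autolog, MLflow logs a
# model and the request/response artifacts for each OpenAI API call.
import mlflow
import openai

mlflow.openai.autolog()

client = openai.OpenAI()
completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(completion.choices[0].message.content)
```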

Other Notable Features:

... (truncated)

Changelog

Sourced from mlflow's changelog.

2.14.2 (2024-07-03)

MLflow 2.14.2 is a patch release that includes several important bug fixes and documentation enhancements.

Bug fixes:

  • [Models] Fix an issue with requirements inference error handling when disabling the default warning-only behavior (#12547, @​B-Step62)
  • [Models] Fix dependency inference issues with Transformers models saved with the unified API llm/v1/xxx task definitions. (#12551, @​B-Step62)
  • [Models / Databricks] Fix an issue with MLflow log_model, introduced in MLflow 2.13.0, that causes the Databricks DLT service to crash in some situations (#12514, @WeichenXu123)
  • [Models] Fix an output data structure issue with the predict_stream implementation for LangChain AgentExecutor and other non-Runnable chains (#12518, @​B-Step62)
  • [Tracking] Fix an issue with the predict_proba inference method in the sklearn flavor when loading an sklearn pipeline object as pyfunc (#12554, @​WeichenXu123)
  • [Tracking] Fix an issue with the Tracing implementation where other services' usage of OpenTelemetry would activate MLflow tracing and cause errors (#12457, @B-Step62)
  • [Tracking / Databricks] Correct an issue when running dependency inference in Databricks that can cause duplicate dependency entries to be logged (#12493, @​sunishsheth2009)

Documentation updates:

Small bug fixes and documentation updates:

#12311, #12285, #12535, #12543, #12320, #12444, @​B-Step62; #12310, #12340, @​serena-ruan; #12409, #12432, #12471, #12497, #12499, @​harupy; #12555, @​nojaf; #12472, #12431, @​xq-yin; #12530, #12529, #12528, #12527, #12526, #12524, #12531, #12523, #12525, #12522, @​dbczumar; #12483, @​jsuchome; #12465, #12441, @​BenWilson2; #12450, @​StarryZhang-whu

2.14.1 (2024-06-20)

MLflow 2.14.1 is a patch release that contains several bug fixes and documentation improvements.

Bug fixes:

Documentation updates:

Small bug fixes and documentation updates:

#12415, #12396, #12394, @​harupy; #12403, #12382, @​BenWilson2; #12397, @​B-Step62

2.14.0 (2024-06-17)

MLflow 2.14.0 includes several major features and improvements that we're very excited to announce!

Major features:

  • MLflow Tracing: Tracing is a powerful tool designed to enhance your ability to monitor, analyze, and debug GenAI applications by allowing you to inspect the intermediate outputs generated as your application handles a request. This update comes with an automatic LangChain integration to make it as easy as possible to get started, but we've also implemented high-level fluent APIs and low-level client APIs for users who want more control over their trace instrumentation. For more information, check out the guide in our docs!
  • Unity Catalog Integration: The MLflow Deployments server now has an integration with Unity Catalog, allowing you to leverage registered functions as tools for enhancing your chat application. For more information, check out this guide!
  • OpenAI Autologging: Autologging support has now been added for the OpenAI model flavor. With this feature, MLflow will automatically log a model upon calling the OpenAI API. Each time a request is made, the inputs and outputs will be logged as artifacts. Check out the guide for more information!

... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the Security Alerts page.

Bumps [mlflow](https://github.com/mlflow/mlflow) from 1.30.1 to 2.14.2.
- [Release notes](https://github.com/mlflow/mlflow/releases)
- [Changelog](https://github.com/mlflow/mlflow/blob/master/CHANGELOG.md)
- [Commits](https://github.com/mlflow/mlflow/commits/v2.14.2)

---
updated-dependencies:
- dependency-name: mlflow
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot added the "dependencies" label (Pull requests that update a dependency file) on Aug 8, 2024
@wiz-inc-a28a8b7b4c

Wiz Scan Summary

IaC Misconfigurations: 0 Critical, 0 High, 0 Medium, 0 Low, 0 Informational
Vulnerabilities: 3 Critical, 24 High, 19 Medium, 3 Low, 0 Informational
Sensitive Data: 0 Critical, 0 High, 0 Medium, 0 Low, 0 Informational
Total: 3 Critical, 24 High, 19 Medium, 3 Low, 0 Informational
Secrets: 0

1 similar comment
