Releases: quarkiverse/quarkus-langchain4j
0.22.0
What's Changed
- Properly warn about missing javac -parameters flag by @geoand in #1113
- Bump quarkus-neo4j.version from 4.4.0 to 5.0.0 by @dependabot in #1115
- Allow rewriting of user messages from input guardrails by @mariofusco in #1083 (see the sketch after this list)
- Bump io.quarkiverse.wiremock:quarkus-wiremock-test from 1.3.3 to 1.4.0 by @dependabot in #1116
- Integrate Mistral AI moderation models by @jmartisk in #1117
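The input-guardrail rewriting from #1083 lets a guardrail alter the user message before it reaches the model. A minimal sketch, assuming the `InputGuardrail` contract exposes a `successWith(...)` helper for the rewritten text (check the exact method name against the 0.22.0 API):

```java
import dev.langchain4j.data.message.UserMessage;
import io.quarkiverse.langchain4j.guardrails.InputGuardrail;
import io.quarkiverse.langchain4j.guardrails.InputGuardrailResult;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class RedactingInputGuardrail implements InputGuardrail {

    @Override
    public InputGuardrailResult validate(UserMessage userMessage) {
        // Rewrite the prompt before it is sent to the model,
        // e.g. redact anything that looks like a credit card number.
        String cleaned = userMessage.singleText().replaceAll("\\b\\d{16}\\b", "[redacted]");
        return successWith(cleaned);
    }
}
```

The guardrail is then attached to an AI service method with `@InputGuardrails(RedactingInputGuardrail.class)`.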
Full Changelog: 0.22.0.CR2...0.22.0
0.22.0.CR2
What's Changed
- Update version README during release by @geoand in #1105
- Bump com.github.tjake:jlama-core from 0.8.2 to 0.8.3 by @dependabot in #1107
- Initial implementation of the response augmenter idea by @cescoffier in #1106
- Enable resolution of AI services by bean name by @aldettinger in #1110 (see the sketch after this list)
- Release 0.22.0.CR2 by @geoand in #1111
- Release 0.22.0.CR2 by @geoand in #1112
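For the bean-name resolution added in #1110, an AI service should become resolvable through standard CDI lookup by name. A hypothetical sketch: the `@Named` value and the lookup below are illustrative assumptions, not the confirmed mechanism from the PR.

```java
import io.quarkiverse.langchain4j.RegisterAiService;
import io.quarkus.arc.Arc;
import jakarta.inject.Named;

// Hypothetical AI service exposed under an explicit bean name.
@Named("poet")
@RegisterAiService
public interface PoetService {
    String writePoem(String topic);
}

// Elsewhere, resolve the service by bean name instead of by type.
class PoetLookup {
    String poemAbout(String topic) {
        PoetService poet = (PoetService) Arc.container().instance("poet").get();
        return poet.writePoem(topic);
    }
}
```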
Full Changelog: 0.22.0.CR1...0.22.0.CR2
0.22.0.CR1
What's Changed
- Bump org.apache.maven.plugins:maven-surefire-plugin from 3.5.0 to 3.5.1 by @dependabot in #976
- Register all models from configuration as beans regardless of the existence of an injection point by @manovotn in #1061
- Prepare the arrival of the bot by @gsmet in #1062
- Show how OIDC ModelAuthProvider can be used with Azure OpenAI by @sberyozkin in #1056
- Prioritize Jackson over Jsonb in extensions that use quarkus-rest-client-jackson by @geoand in #1067
- Make Llama3.2 the default for Ollama by @geoand in #1066
- Introduce a global temperature property by @geoand in #1068 (see the configuration sketch after this list)
- Properly handle Ollama streaming by @geoand in #1070
- Fix MismatchedInputException in Ollama streaming by @jpohlmeyer in #1072
- Provide an abstract output guardrail for JSON data extraction by @mariofusco in #1060
- Use latest version of the Llama3.java code by @geoand in #1076
- Easy RAG extension: don't depend on the upstream Easy RAG module by @jmartisk in #1078
- Add AbstractJsonExtractorOutputGuardrail to guardrails docs by @mariofusco in #1082
- Add a Secure SQL ChatBot demo by @sberyozkin in #1073
- Update Quarkus to 3.15.2 and update documentation by @gsmet in #1084
- Move JsonGuardrailsUtilsTest to the deployment module by @jmartisk in #1086
- Detect when models do not support tools when using streaming by @cescoffier in #1085
- Drop references to resteasy-reactive artifacts by @gsmet in #1088
- Bump to Jlama 0.8.2 by @geoand in #1090
- Prevent Jlama inference from blocking the Vert.x event loop by @mariofusco in #1092
- When creating fastJar don't try to include models more than once by @edeandrea in #1093
- Remove duplicate invocation from handleMessageStop. by @dennysfredericci in #1094
- static keyword is not allowed there. by @lordofthejars in #1098
- Upgrade to LangChain4j 0.36.2 by @jmartisk in #1097
- Bump io.smallrye.certs:smallrye-certificate-generator-junit5 from 0.8.1 to 0.9.2 by @dependabot in #1101
- Add chatbot codestart by @iocanel in #1103
- Migrate to the JsonSchemaElement API by @edeandrea in #1100
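Two of the configuration-facing changes above (#1068 and #1066) can be shown in an application.properties sketch; the global temperature key is an assumption to verify against the 0.22.0 configuration reference, while the per-provider keys follow the existing naming:

```properties
# Global default temperature applied to all configured models (#1068);
# the key name quarkus.langchain4j.temperature is assumed here.
quarkus.langchain4j.temperature=0.2

# A provider-specific setting still overrides the global default.
quarkus.langchain4j.openai.chat-model.temperature=0.7

# Ollama now defaults to Llama 3.2 (#1066); pin another model explicitly if needed.
quarkus.langchain4j.ollama.chat-model.model-id=llama3.1
```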
Full Changelog: 0.21.0...0.22.0.CR1
0.21.0
What's Changed
- Move sample updates to release-prepare.yml by @gastaldi in #1035
- Update Llama3 and Jlama ITs in CI by @geoand in #1037
- Add prompt template and variables to input / output guardrails by @dennysfredericci in #992
- Allow rewriting the LLM result in an OutputGuardrail by @mariofusco in #1021
- Adjust method signature mapping to JSON schema to allow collections in tool arguments by @Tarjei400 in #1039 (see the sketch after this list)
- Bring ModelAuthProvider support to the OpenAI extension by @geoand in #1041
- Bump jlama to version 0.8.1 by @mariofusco in #1042
- Improve model download progress logging by @geoand in #1045
- Fix warning about missing model-id in Ollama by @geoand in #1048
- Add basic docs for Llama3.java by @geoand in #1049
- Allow using generic ModelAuthProvider with named models by @sberyozkin in #1050
- Document rewriting output guardrails by @mariofusco in #1052
- Enable Tools to Define Execution Model by @cescoffier in #1023
- Allow for Ollama configuration to make quarkus.langchain4j.foo.chat-model.provider unnecessary by @geoand in #1057
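The two tool-related items above (#1039 and #1023) can be sketched together: a tool that accepts a collection argument and declares how it should be executed. Using `@Blocking` for the execution model is an assumption based on the usual Quarkus annotations; check #1023 for the actual mechanism.

```java
import java.util.List;

import dev.langchain4j.agent.tool.Tool;
import io.smallrye.common.annotation.Blocking;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class OrderTools {

    // Collection parameters are now mapped into the tool's JSON schema (#1039),
    // so the model can pass several order ids in a single call.
    @Tool("Returns the total price of the given orders")
    @Blocking // run on a worker thread rather than the event loop (assumed, see #1023)
    public double totalPrice(List<String> orderIds) {
        return orderIds.size() * 9.99; // placeholder computation
    }
}
```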
New Contributors
- @Tarjei400 made their first contribution in #1039
Full Changelog: 0.21.0.CR4...0.21.0
0.21.0.CR4
What's Changed
- Switch config doc generation to the new plugin introduced in 3.14 by @gsmet in #1017
- Fix incorrect model-id parameter in WatsonxRecorder by @andreadimaio in #1024
- Improve web-search sample by @jmartisk in #1025
- Add description for neo4j extension by @holly-cummins in #1026
- Easy RAG: allow specifying minScore through configuration by @edeandrea in #1027 (see the configuration sketch after this list)
- Add Jlama documentation by @mariofusco in #1018
- Add missing llama3-java module by @geoand in #1028
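The Easy RAG change in #1027 adds a minimum relevance score to retrieval. A configuration sketch; the min-score key name is inferred from the PR title and should be verified against the extension's configuration reference:

```properties
# Directory whose documents are ingested at startup.
quarkus.langchain4j.easy-rag.path=src/main/resources/documents

# Retrieval settings: cap the number of returned segments and
# drop segments whose relevance score is below the threshold (#1027).
quarkus.langchain4j.easy-rag.max-results=3
quarkus.langchain4j.easy-rag.min-score=0.7
```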
Full Changelog: 0.21.0.CR1...0.21.0.CR4
0.21.0.CR1
What's Changed
- Remove BAM module by @andreadimaio in #968
- Add support for BGE 1.5 (regular and quantized) in-process embedding by @cescoffier in #966
- Refactor CI workflows (only pull requests) by @cescoffier in #972
- Bump org.apache.maven.plugins:maven-failsafe-plugin from 3.5.0 to 3.5.1 by @dependabot in #977
- Add clarification about WebSearchRetrievalAugmentor by @jmartisk in #982
- Enable /chat and /chat_stream in watsonx.ai by @andreadimaio in #981
- Split and Reuse the Release workflow by @gastaldi in #986
- Fix Anthropic chat model not being available when enabled by @dennysfredericci in #985
- Fix inverted release workflow names by @gastaldi in #987
- Enable ScoringModel in watsonx.ai by @andreadimaio in #994
- Use custom integration with Jlama by @geoand in #998
- Enable watsonx.ai to process ImageContent in UserMessage by @andreadimaio in #1000
- Move inner classes to top level classes for API concision by @holly-cummins in #1003
- Allow including configured models in the built artifact for Jlama by @geoand in #1004
- ToolProvider: select tools dynamically on the incoming message by @MiggiV2 in #989
- Add optional logging to jlama requests and responses by @mariofusco in #999
- Add support for TLS configuration name by @geoand in #1006
- Don't include tools from removed beans in the ToolsMetadata by @jmartisk in #1010
- Properly support the TokenStream return type in AiService by @geoand in #1005 (see the sketch after this list)
- Implement support for output guardrail on streamed responses by @cescoffier in #1011
- Introduce llama3-java module by @geoand in #1008
- Update watsonx module: Add Text Extraction APIs, Configuration changes, and Pretty-Print logging by @andreadimaio in #1013
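With #1005, an AI service can declare TokenStream as its return type and consume the answer token by token. A minimal sketch using types that ship with LangChain4j and the Quarkus extension:

```java
import dev.langchain4j.service.TokenStream;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService
public interface StreamingAssistant {

    // Returns a TokenStream so the caller can react to partial tokens
    // instead of waiting for the complete answer.
    TokenStream chat(@UserMessage String question);
}
```

A caller typically wires callbacks with `onNext(...)`, `onComplete(...)` and `onError(...)` before invoking `start()`; reactive code can declare `Multi<String>` instead.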
New Contributors
- @dennysfredericci made their first contribution in #985
- @MiggiV2 made their first contribution in #989
Full Changelog: 0.20.3...0.21.0.CR1
0.20.3
What's Changed
- Fix local embedding tokenizer location by @cescoffier in #960
- Mask the API key in Tavily logs by @lordofthejars in #961
- Bump io.quarkiverse:quarkiverse-parent from 16 to 18 by @dependabot in #959
- Move OpenWebUI Dev UI Actions to deployment classpath by @phillip-kruger in #867
- Fix ONNX Runtime Execution in Native Executable by @cescoffier in #964
Full Changelog: 0.20.1...0.20.3
0.20.1
What's Changed
- Extend the chatbot sample to use streaming by @cescoffier in #953 (see the sketch after this list)
- Upgrade to Quarkus Antora - fix issues on Arm Macs by @ppalaga in #913
- Avoid duplicating info for AiService implementation constructors by @geoand in #955
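The streaming work in the chatbot sample (#953) builds on AI services returning a reactive stream. A minimal sketch of the pattern; the resource path and type names here are illustrative, not taken from the sample:

```java
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;
import io.smallrye.mutiny.Multi;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.QueryParam;

@RegisterAiService
interface ChatBot {
    // Emits the answer token by token as a Mutiny stream.
    Multi<String> chat(@UserMessage String question);
}

@Path("/chat")
public class ChatResource {

    private final ChatBot bot;

    public ChatResource(ChatBot bot) {
        this.bot = bot;
    }

    @GET
    public Multi<String> stream(@QueryParam("q") String question) {
        // Quarkus streams the Multi to the client as it is produced.
        return bot.chat(question);
    }
}
```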
Full Changelog: 0.20.0...0.20.1