Add support for CommunityToolkit #8
base: main
Conversation
Thanks for submitting this! We aim to review this during the next week.
Thanks for contributing - I've added a comment to discuss handling invalid URLs, and letting the latest version of the Ollama component pass the model to the Raygun component via the connection string.
```diff
 if (!string.IsNullOrWhiteSpace(connectionString))
 {
-    builder.Services.Add(new ServiceDescriptor(typeof(IOllamaApiClient), new OllamaApiClient(connectionString, Constants.AiModel)));
+    builder.Services.Add(new ServiceDescriptor(typeof(IOllamaApiClient), new OllamaApiClient(connectionString.Replace("Endpoint=", string.Empty), ollamaOptions.Model)));
 }
```
I started by trying this out with version 9.0.0 of CommunityToolkit.Aspire.Hosting.Ollama, using the following code:
```csharp
var ollama = builder.AddOllama("Ollama").WithDataVolume().AddModel("llama3");
builder.AddRaygun().WithOllamaReference(ollama);
```
This threw an exception at this line of code because the URL is invalid. The connection string comes through as `http://ollama:11434;Model=llama3/`, and the `Model` segment makes it an invalid URL.
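As a minimal illustration of the failure (assuming the client ultimately constructs a `System.Uri` from the raw string):

```csharp
// The raw connection string is not a valid URI on its own:
var raw = "http://ollama:11434;Model=llama3/";
var uri = new Uri(raw); // throws UriFormatException — ";Model=llama3" is parsed as part of the port
```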
So there are a couple of things to sort out. Firstly, the code that parses out the endpoint here needs to be strengthened. I'm thinking we find the "Endpoint=" key and then take the rest of the string up to the next semicolon, if any. That would cater for future cases where the connection string ever contains more keys than "Endpoint" and "Model", and it handles the possibility that they appear in a different order.
Secondly, we may as well make use of the Model specified in the connection string (see the sketch below, which covers both points).
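Roughly something like this, as a sketch (the `ParseOllamaConnectionString` helper is hypothetical; the key/value layout is `Endpoint=...;Model=...` as described above):

```csharp
// Sketch only: parses "Key=Value" pairs regardless of order and ignores keys
// we don't recognise, so future additions to the connection string won't break us.
static (string? Endpoint, string? Model) ParseOllamaConnectionString(string connectionString)
{
    string? endpoint = null;
    string? model = null;

    foreach (var part in connectionString.Split(';', StringSplitOptions.RemoveEmptyEntries))
    {
        var separatorIndex = part.IndexOf('=');
        if (separatorIndex < 0)
        {
            continue; // skip segments that aren't key=value pairs
        }

        var key = part[..separatorIndex].Trim();
        var value = part[(separatorIndex + 1)..].Trim();

        if (key.Equals("Endpoint", StringComparison.OrdinalIgnoreCase))
        {
            endpoint = value;
        }
        else if (key.Equals("Model", StringComparison.OrdinalIgnoreCase))
        {
            model = value;
        }
    }

    return (endpoint, model);
}
```

Registration could then fall back to the hardcoded default model for older Ollama component versions whose connection string doesn't include `Model`:

```csharp
var (endpoint, model) = ParseOllamaConnectionString(connectionString);
if (endpoint is not null)
{
    builder.Services.Add(new ServiceDescriptor(
        typeof(IOllamaApiClient),
        new OllamaApiClient(endpoint, model ?? Constants.AiModel)));
}
```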
What you've done so far allows a model to be optionally specified when passing the Ollama reference to the Raygun component, but I'm wondering if we still need that, given that the connection string can carry the model configured on the Ollama component. Let me know if you see a case where it makes sense to specify the model in a different way than the code I posted above. Otherwise, I think we simplify the solution so that we no longer need a special WithOllamaReference method (we may just be able to use the built-in WithReference method) and no longer need to pass the model name around as an environment variable.
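For illustration, the simplified wiring might then look like this (a sketch only — it assumes the Raygun resource can consume the Ollama connection string via Aspire's built-in WithReference):

```csharp
var ollama = builder.AddOllama("Ollama").WithDataVolume().AddModel("llama3");

// WithReference flows the Ollama connection string (endpoint and model) to the
// Raygun component, so no custom WithOllamaReference method or model
// environment variable is needed.
builder.AddRaygun().WithReference(ollama);
```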
Given that there are so few versions of the Ollama component so far (only three stable versions), I don't think it's necessary to have a special override just for the two earlier versions that didn't pass the model in the connection string. Those versions would still work, but would default to the hardcoded "llama3" model.
Feel free to jump into these changes, or let me know any thoughts you have about them.
Description
This adds better support for Ollama references. Without this change, the connection string keeps its `Endpoint=` prefix, which will result in an exception on startup.

Issues