
Add support for CommunityToolkit #8

Open · wants to merge 3 commits into base: main
Conversation

@dansiegel commented Nov 18, 2024

Description

This adds better support for Ollama References.

  • Provides the ability to specify a specific model to use
  • Adds a new extension method for an Ollama reference that makes the reference easier to specify. This ensures that we provide the appropriate ConnectionString name, and we can optionally provide a value for a Model to use
  • Updates the web project so that, when using the Community Toolkit, we strip the Endpoint= prefix that would otherwise cause an exception on startup
  • Updates to Aspire 9.0

Issues

@QuantumNightmare (Contributor)

Thanks for submitting this! We aim to review it within the next week.

@QuantumNightmare (Contributor) left a comment

Thanks for contributing! I've added a comment to discuss handling invalid URLs, and to discuss letting the latest version of the Ollama component pass the model to the Raygun component via the connection string.

if (!string.IsNullOrWhiteSpace(connectionString))
{
-    builder.Services.Add(new ServiceDescriptor(typeof(IOllamaApiClient), new OllamaApiClient(connectionString, Constants.AiModel)));
+    builder.Services.Add(new ServiceDescriptor(typeof(IOllamaApiClient), new OllamaApiClient(connectionString.Replace("Endpoint=", string.Empty), ollamaOptions.Model)));
}

I started off by trying this out with version 9.0.0 of the CommunityToolkit.Aspire.Hosting.Ollama, using the following code:

var ollama = builder.AddOllama("Ollama").WithDataVolume().AddModel("llama3");
builder.AddRaygun().WithOllamaReference(ollama);

This threw an exception at this line of code because the URL is invalid. The URL is: http://ollama:11434;Model=llama3/ (the Model section makes it invalid).

So there are a couple of things to sort out. Firstly, the code that parses out the endpoint here needs to be strengthened. I'm thinking we find the "Endpoint=" string and then take the rest of the string up to the next semicolon, if any. This would cater for future cases where the connection string ever contains more than "Endpoint" and "Model", and it handles the possibility that they appear in a different order.
Secondly, we may as well make use of the Model being specified in the connection string.
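The parsing approach described above (find "Endpoint=", read up to the next semicolon, tolerate any segment ordering) could be sketched roughly like this. This is only an illustrative sketch: the OllamaConnectionStringParser class and Parse method are hypothetical names, not part of the actual component.

```csharp
using System;

static class OllamaConnectionStringParser
{
    // Splits a connection string such as "Endpoint=http://ollama:11434;Model=llama3"
    // on semicolons and matches each segment by its key, so Endpoint and Model can
    // appear in any order and extra segments are simply ignored.
    // Returns (endpoint, model); either may be null when the segment is absent.
    public static (string Endpoint, string Model) Parse(string connectionString)
    {
        string endpoint = null;
        string model = null;

        foreach (var segment in connectionString.Split(';', StringSplitOptions.RemoveEmptyEntries))
        {
            var trimmed = segment.Trim();
            if (trimmed.StartsWith("Endpoint=", StringComparison.OrdinalIgnoreCase))
                endpoint = trimmed.Substring("Endpoint=".Length);
            else if (trimmed.StartsWith("Model=", StringComparison.OrdinalIgnoreCase))
                model = trimmed.Substring("Model=".Length);
        }

        return (endpoint, model);
    }
}
```

With this, the registration above could pass the parsed endpoint to OllamaApiClient and fall back to a default model only when the connection string carries none.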

What you've done so far allows you to optionally specify a model when passing the Ollama reference to the Raygun component, but I'm wondering whether we still need this, given that the connection string can already contain the model specified when configuring the Ollama component. Let me know if you see a case where it would make sense to specify the model in a different way than the code I posted above. Otherwise, I think we can simplify the solution: we'd no longer need a special WithOllamaReference method (the built-in WithReference method may suffice), and we'd no longer need to pass the model name around as an environment variable.

Given that there are so few versions of the Ollama component so far (only three stable versions), I don't think it's necessary to have a special override just for the two previous versions that didn't pass the model in the connection string. Those versions would still work, but would default to using the hardcoded "llama3" model.

Feel free to jump into these changes, or let me know any thoughts you have about them.
