Add support for CommunityToolkit #8

Open · wants to merge 3 commits into base: main
@@ -32,7 +32,7 @@
   </ItemGroup>

   <ItemGroup>
-    <PackageReference Include="Aspire.Hosting" Version="8.0.1" />
+    <PackageReference Include="Aspire.Hosting" Version="9.0.0" />
   </ItemGroup>

 </Project>
@@ -14,5 +14,26 @@ public static IResourceBuilder<RaygunAspireWebAppResource> AddRaygun(this IDistr
         .ExcludeFromManifest()
         .PublishAsContainer();
     }
+
+    public static IResourceBuilder<RaygunAspireWebAppResource> WithOllamaReference(this IResourceBuilder<RaygunAspireWebAppResource> builder, IResourceBuilder<IResourceWithConnectionString> source, string? model = null)
+    {
+      var resource = source.Resource;
+      var connectionName = resource.Name;
+
+      if (!string.IsNullOrEmpty(model))
+      {
+        builder.WithEnvironment("Ollama:Model", model);
+      }
+      return builder
+        .WithReference(source, "Ollama")
+        .WaitFor(source)
+        .WithEnvironment(context =>
+        {
+          if (!string.IsNullOrEmpty(model))
+          {
+            context.EnvironmentVariables["Ollama:Model"] = model;
+          }
+        });
+    }
   }
 }
src/RaygunAspireWebApp/Configuraiton/OllamaOptions.cs (new file, 6 additions)
@@ -0,0 +1,6 @@
+namespace RaygunAspireWebApp.Configuraiton;
+
+public class OllamaOptions
+{
+  public string Model { get; set; } = Constants.AiModel;
+}
src/RaygunAspireWebApp/Controllers/ErrorInstanceController.cs (7 additions, 4 deletions)
@@ -6,6 +6,7 @@
 using RaygunAspireWebApp.Models;
 using System.Text.Json;
 using OllamaSharp;
+using RaygunAspireWebApp.Configuraiton;

 namespace RaygunAspireWebApp.Controllers
 {
@@ -14,13 +15,15 @@ public class ErrorInstanceController : Controller
     private readonly RaygunClient _raygunClient;
     private readonly IHubContext<AierHub> _aierHubContext;
     private readonly IOllamaApiClient? _ollamaClient;
+    private readonly OllamaOptions _ollamaOptions;

     private static CancellationTokenSource? _cancellationTokenSource;

-    public ErrorInstanceController(RaygunClient raygunClient, IHubContext<AierHub> aierHubContext, IOllamaApiClient? ollamaClient = null)
+    public ErrorInstanceController(RaygunClient raygunClient, IHubContext<AierHub> aierHubContext, OllamaOptions ollamaOptions, IOllamaApiClient? ollamaClient = null)
     {
       _raygunClient = raygunClient;
       _aierHubContext = aierHubContext;
+      _ollamaOptions = ollamaOptions;
       _ollamaClient = ollamaClient;
     }

@@ -117,11 +120,11 @@ await _ollamaClient.StreamCompletion(prompt, null, response =>
     private async Task EnsureModel()
     {
       var models = await _ollamaClient.ListLocalModels();
-      if (!models.Any(m => m.Name.StartsWith(Constants.AiModel)))
+      if (!models.Any(m => m.Name.StartsWith(_ollamaOptions.Model)))
       {
         // If the model has not been downloaded yet, then kick off that process.
         // If the model is already downloading, then this will pick up the progress of the existing download job:
-        await _ollamaClient.PullModel(Constants.AiModel, status =>
+        await _ollamaClient.PullModel(_ollamaOptions.Model, status =>
         {
           var percentage = status.Total == 0 ? 0 : status.Completed * 100 / (double)status.Total;
           // There are some initial messages in the stream that state the download has started, but do not mention the progress yet.
@@ -136,7 +139,7 @@ await _ollamaClient.PullModel(Constants.AiModel, status =>
       while (true)
       {
         models = await _ollamaClient.ListLocalModels();
-        if (models.Any(m => m.Name.StartsWith(Constants.AiModel)))
+        if (models.Any(m => m.Name.StartsWith(_ollamaOptions.Model)))
         {
           return;
         }
src/RaygunAspireWebApp/Program.cs (8 additions, 1 deletion)
@@ -1,5 +1,6 @@
 using Mindscape.Raygun4Net.AspNetCore;
 using OllamaSharp;
+using RaygunAspireWebApp.Configuraiton;
 using RaygunAspireWebApp.Hubs;

 namespace RaygunAspireWebApp
@@ -23,9 +24,15 @@ public static void Main(string[] args)
     builder.Services.AddSignalR();

     var connectionString = builder.Configuration.GetConnectionString("Ollama");
+    var ollamaOptions = new OllamaOptions
+    {
+      Model = builder.Configuration["Ollama:Model"] ?? Constants.AiModel
+    };
+    builder.Services.AddSingleton(ollamaOptions);

     if (!string.IsNullOrWhiteSpace(connectionString))
     {
-      builder.Services.Add(new ServiceDescriptor(typeof(IOllamaApiClient), new OllamaApiClient(connectionString, Constants.AiModel)));
+      builder.Services.Add(new ServiceDescriptor(typeof(IOllamaApiClient), new OllamaApiClient(connectionString.Replace("Endpoint=", string.Empty), ollamaOptions.Model)));
Review comment from a Contributor on the OllamaApiClient registration line:
I started off by trying this out with version 9.0.0 of CommunityToolkit.Aspire.Hosting.Ollama, using the following code:

    var ollama = builder.AddOllama("Ollama").WithDataVolume().AddModel("llama3");
    builder.AddRaygun().WithOllamaReference(ollama);

This threw an exception at this line of code because the resulting URL is invalid: http://ollama:11434;Model=llama3/ (the Model segment makes it invalid).

So there are a couple of things to sort out. Firstly, the code that parses out the endpoint here needs to be strengthened. I'm thinking we find the "Endpoint=" key and then take the rest of the string up to the next semicolon, if any. That would cater for future cases where the connection string contains more than "Endpoint" and "Model", and would handle them appearing in a different order.
Secondly, we may as well make use of the Model that's specified in the connection string.

What you've done so far allows you to optionally specify a model when passing the Ollama reference to the Raygun component, but I'm wondering if we still need this, given that the connection string can already carry the model specified when configuring the Ollama component. Let me know if you see a case where it would make sense to specify the model in a different way than the code I posted above. Otherwise, I think we can simplify the solution here: we'd no longer need a special WithOllamaReference method (the built-in WithReference method may suffice), and we'd no longer need to pass the model name around as an environment variable.

Given that there are so few versions of the Ollama component so far (only 3 stable versions), I don't think it's necessary to have a special override just for the two previous versions that didn't pass the model in the connection string. Those versions would still work, but would default to the hardcoded "llama3" model.

Feel free to jump into these changes, or let me know any thoughts you have about them.
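The parsing the comment describes could be sketched roughly like this (illustrative only — the OllamaConnectionStringParser name and its placement are hypothetical, not part of the codebase; it assumes the connection string keeps the "key=value;" form):

```csharp
using System;

public static class OllamaConnectionStringParser
{
    // Scans "key=value" segments separated by semicolons, so parsing succeeds
    // regardless of segment order and ignores keys it does not recognise.
    public static (string? Endpoint, string? Model) Parse(string connectionString)
    {
        string? endpoint = null;
        string? model = null;

        foreach (var segment in connectionString.Split(';', StringSplitOptions.RemoveEmptyEntries))
        {
            var separator = segment.IndexOf('=');
            if (separator < 0)
            {
                continue; // Not a key=value segment; skip it.
            }

            var key = segment[..separator].Trim();
            var value = segment[(separator + 1)..].Trim();

            if (key.Equals("Endpoint", StringComparison.OrdinalIgnoreCase))
            {
                endpoint = value;
            }
            else if (key.Equals("Model", StringComparison.OrdinalIgnoreCase))
            {
                model = value;
            }
        }

        return (endpoint, model);
    }
}
```

With something like this in place, the registration in Program.cs could pass only the parsed endpoint to OllamaApiClient, and fall back to Constants.AiModel when no Model segment is present.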

     }

     var app = builder.Build();