Built-in AI Provider Resolvers
Three drop-in IRichTextBoxAiResolver implementations so the AI Toolkit lights up without hand-written HTTP client plumbing.
Registration is one line of DI; API keys are read from your configuration or secrets store. Works with OpenAI, Anthropic (Claude), Azure OpenAI, and any
OpenAI-compatible endpoint (Groq, Together.ai, local Ollama) via a BaseUrl override.
These resolvers live in the RichTextBox.AspNetCore NuGet package (1.0.0-preview.11+). The JS side stays the same — your server picks the provider.
1. OpenAI
using RichTextBox.AiResolvers;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRichTextBox();
builder.Services.AddRichTextBoxOpenAiResolver(opts =>
{
    opts.ApiKey = builder.Configuration["OpenAI:ApiKey"];
    opts.Model = "gpt-4o-mini";
});

var app = builder.Build();
app.MapRichTextBoxUploads();
app.Run();
2. Anthropic (Claude)
builder.Services.AddRichTextBoxAnthropicResolver(opts =>
{
    opts.ApiKey = builder.Configuration["Anthropic:ApiKey"];
    opts.Model = "claude-3-5-haiku-latest";
});
3. Azure OpenAI
builder.Services.AddRichTextBoxAzureOpenAiResolver(opts =>
{
    opts.ApiKey = builder.Configuration["AzureOpenAI:ApiKey"];
    opts.Endpoint = builder.Configuration["AzureOpenAI:Endpoint"];
    opts.DeploymentName = builder.Configuration["AzureOpenAI:Deployment"];
});
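For local development, the configuration keys the snippets above read can live in appsettings.json or user secrets. A minimal sketch of the shape (key names match the snippets above; all values are placeholders):

```json
{
  "OpenAI": { "ApiKey": "sk-..." },
  "Anthropic": { "ApiKey": "sk-ant-..." },
  "AzureOpenAI": {
    "ApiKey": "<azure-openai-key>",
    "Endpoint": "https://<resource>.openai.azure.com/",
    "Deployment": "<deployment-name>"
  }
}
```

In production, prefer environment variables or a managed secrets store over checked-in configuration files.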
Shared options
All three share AiResolverOptions: ApiKey, Model, BaseUrl, MaxTokens, Temperature, SystemSuffix, Timeout.
Customize brand voice and tone across every mode by setting SystemSuffix; it is appended after the built-in system prompt for each action (Proofread, Rewrite, Translate, etc.).
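As a sketch of how BaseUrl and SystemSuffix combine, here is the OpenAI resolver pointed at a local Ollama instance. The port, model name, and placeholder API key are assumptions, not package defaults:

```csharp
builder.Services.AddRichTextBoxOpenAiResolver(opts =>
{
    opts.BaseUrl = "http://localhost:11434/v1";  // Ollama's OpenAI-compatible API (default port)
    opts.Model = "llama3.1";                     // whichever model you have pulled locally
    opts.ApiKey = "ollama";                      // placeholder; Ollama ignores the key
    opts.SystemSuffix = "Answer in a concise, friendly brand voice."; // appended to every mode's system prompt
});
```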
Streaming responses
The OpenAI and Anthropic resolvers also implement IStreamingRichTextBoxAiResolver. The /richtextbox/ai/stream endpoint
relays provider deltas as Server-Sent Events, and the JS client consumes them via
editor.aiToolkit.streamRequest({ url, body, onDelta, onResponse, onDone }), rendering tokens as they arrive.
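As a client-side sketch, a small helper that accumulates deltas into a single string. The streamRequest call shape comes from the signature above; the helper name and the body fields are illustrative assumptions:

```javascript
// Hypothetical helper: wires up streamRequest callbacks and hands the
// accumulated text to `onFinished` once the stream closes.
function streamIntoBuffer(editor, body, onFinished) {
  let buffer = "";
  editor.aiToolkit.streamRequest({
    url: "/richtextbox/ai/stream",
    body,                                     // e.g. { action: "proofread", text: ... } -- assumed shape
    onDelta: (token) => { buffer += token; }, // one call per provider delta
    onDone: () => onFinished(buffer),         // full response after the last SSE event
  });
}
```

In a real page you would pass the live editor instance and update the UI from onDelta instead of buffering, which is what gives the type-as-it-arrives effect.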