
OpenAI.Net

Our .NET OpenAI library is designed for easy integration with the OpenAI API using C#. As a community-maintained library, we keep it up to date and fully tested, supporting .NET 6.0 and above.

With an intuitive API, our library simplifies working with OpenAI's natural language processing tools. We also manage HTTP client usage to prevent socket exhaustion and stale DNS issues, ensuring a smooth and reliable experience for developers. The library supports the OpenAI streaming API, allowing developers to receive and display responses as they are generated.

YouTube tutorial: Creating your own ChatGPT clone with .NET: A Step-by-Step Guide

Getting started

Install the NuGet package

Install-Package OpenAI.Net.Client

Register services using the provided extension methods

services.AddOpenAIServices(options => {
    options.ApiKey = builder.Configuration["OpenAI:ApiKey"];
});

N.B. We recommend using environment variables, configuration files or a secrets file to store the API key securely. See here for further details; a minimal sketch using an environment variable is shown below.
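
As a rough sketch (one option among several), the key can be read from an environment variable instead of being hard-coded; the variable name OPENAI_API_KEY below is only an example.

services.AddOpenAIServices(options => {
    // Assumes an environment variable named OPENAI_API_KEY has been set for the process or machine.
    options.ApiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");
});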

Example Usage

You can view examples of a console app, a web application and a Blazor app using the streaming API here.

You can also have a look at the Integration Tests for usage examples here.

A simple console app example is shown below.

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using OpenAI.Net;

namespace ConsoleApp
{
    internal class Program
    {
        static async Task Main(string[] args)
        {
            using var host = Host.CreateDefaultBuilder(args)
            .ConfigureServices((builder, services) =>
            {
                services.AddOpenAIServices(options => {
                    options.ApiKey = builder.Configuration["OpenAI:ApiKey"];
                });
            })
            .Build();

            var openAi = host.Services.GetRequiredService<IOpenAIService>();
            var response = await openAi.TextCompletion.Get("How long until we reach mars?");

            if (response.IsSuccess)
            {
                foreach(var result in response.Result.Choices)
                {
                    Console.WriteLine(result.Text);
                }
            }
            else
            {
                Console.WriteLine($"{response.ErrorMessage}");
            }
        }
    }
}
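
In ASP.NET Core or other hosted applications, IOpenAIService can also be consumed through constructor injection rather than resolving it from the service provider. The class below is only an illustrative sketch (the class and method names are not part of the library).

public class ChatFeature
{
    private readonly IOpenAIService _openAi;

    // IOpenAIService is registered by AddOpenAIServices and supplied by the DI container.
    public ChatFeature(IOpenAIService openAi)
    {
        _openAi = openAi;
    }

    public async Task<string> AskAsync(string prompt)
    {
        var response = await _openAi.TextCompletion.Get(prompt);
        return response.IsSuccess ? response.Result.Choices[0].Text : response.ErrorMessage;
    }
}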

Configuring HTTP Client Options

The registration extension allows the underlying HTTP client to be configured via the IHttpClientBuilder interface, which makes it possible, for example, to add a Polly retry policy. See the example here.

services.AddOpenAIServices(options => {
    options.ApiKey = builder.Configuration["OpenAI:ApiKey"];
}, 
(httpClientOptions) => {
    httpClientOptions.AddPolicyHandler(GetRetryPolicy());
});

// Requires Polly, e.g. via the Microsoft.Extensions.Http.Polly package
// (Polly and Polly.Extensions.Http namespaces) for AddPolicyHandler and HttpPolicyExtensions.
static IAsyncPolicy<HttpResponseMessage> GetRetryPolicy()
{
    return HttpPolicyExtensions
        .HandleTransientHttpError()
        .WaitAndRetryAsync(3, retryAttempt => TimeSpan.FromSeconds(Math.Pow(2, retryAttempt)));
}

Examples

Model Types

Model type strings can be found in the ModelTypes class.

ModelTypes.GPT35Turbo

ModelTypes
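
As a sketch only, a ModelTypes constant can be supplied through the per-request options; this assumes the options object exposes a Model property mirroring the OpenAI request body (the property name is an assumption and is not shown elsewhere in this README).

var messages = new List<Message>
{
    Message.Create(ChatRoleType.User, "Say this is a test")
};

var response = await service.Chat.Get(messages, o => {
    o.Model = ModelTypes.GPT35Turbo; // assumed property name; check the request options class
    o.MaxTokens = 1000;
});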

Completion

var response = await service.TextCompletion.Get("Say this is a test",(o) => {
                o.MaxTokens = 1024;
                o.BestOf = 2;
            });

Completion Stream

await foreach(var response in service.TextCompletion.GetStream("Say this is a test"))
{
    Console.WriteLine(response?.Result?.Choices[0].Text);
}

Completion with array input

var prompts = new List<string>()
{
    "Say this is a test",
    "Say this is not a test"
};

var response = await service.TextCompletion.Get(prompts, (o) => {
    o.MaxTokens = 1024;
    o.BestOf = 2;
});

Chat

var messages = new List<Message>
{
    Message.Create(ChatRoleType.System, "You are a helpful assistant."),
    Message.Create(ChatRoleType.User, "Who won the world series in 2020?"),
    Message.Create(ChatRoleType.Assistant, "The Los Angeles Dodgers won the World Series in 2020."),
    Message.Create(ChatRoleType.User, "Where was it played?")
};

var response = await service.Chat.Get(messages, o => {
    o.MaxTokens = 1000;
});

Chat Stream

var messages = new List<Message>
{
    Message.Create(ChatRoleType.System, "You are a helpful assistant."),
    Message.Create(ChatRoleType.User, "Who won the world series in 2020?"),
    Message.Create(ChatRoleType.Assistant, "The Los Angeles Dodgers won the World Series in 2020."),
    Message.Create(ChatRoleType.User, "Where was it played?")
};

await foreach (var t in service.Chat.GetStream(messages))
{
    Console.WriteLine(t?.Result?.Choices[0].Delta?.Content);
}

Audio

Get transcription
var response = await OpenAIService.Audio.GetTranscription(@"Audio\TestTranscription.m4a");
Get translation
var response = await OpenAIService.Audio.GetTranslation(@"Audio\Translation.m4a");

Text Edit

var messages = new List<Message>
{
    Message.Create(ChatRoleType.System, "You are a spell checker. Fix the spelling mistakes"),
    Message.Create(ChatRoleType.User, "What day of the wek is it?"),
};

var response = await OpenAIService.Chat.Get(messages, o => {
    o.MaxTokens = 1000;
});

Image Edit

Using file paths
var response = await service.Images.Edit("A cute baby sea otter", @"Images\BabyCat.png", @"Images\Mask.png", o => {
    o.N = 2;
});
Using file bytes
var response = await service.Images.Edit("A cute baby sea otter", File.ReadAllBytes(@"Images\BabyCat.png"), File.ReadAllBytes(@"Images\Mask.png"), o => {
    o.N = 2;
});

Image Generate

var response = await service.Images.Generate("A cute baby sea otter", 2, "1024x1024");

Image Variation

Using file paths
var response = await service.Images.Variation(@"Images\BabyCat.png", o => {
    o.N = 2;
    o.Size = "1024x1024";
});
Using file bytes
var response = await service.Images.Variation(File.ReadAllBytes(@"Images\BabyCat.png"), o => {
    o.N = 2;
    o.Size = "1024x1024";
});

Fine Tune Create

var response = await service.FineTune.Create("myfile.jsonl", o => {
    o.BatchSize = 1;
});

Fine Tune Get All

var response = await service.FineTune.Get();

Fine Tune Get By Id

var response = await service.FineTune.Get("fineTuneId");

Fine Tune Get Events

var response = await service.FineTune.GetEvents("fineTuneId");

Fine Tune Delete

var response = await service.FineTune.Delete("modelId");

Fine Tune Cancel

var response = await service.FineTune.Cancel("fineTuneId");

File Upload

Using file path
var response = await service.Files.Upload(@"Images\BabyCat.png");
Using file bytes
// 'bytes' is the raw file content, e.g. obtained via File.ReadAllBytes(...)
var response = await service.Files.Upload(bytes, "mymodel.jsonl");

File Get Content

 var response = await service.Files.GetContent("fileId");

File Get File Detail

var response = await service.Files.Get("fileId");

File Get File All

var response = await service.Files.Get();

File Delete

var response = await service.Files.Delete("1");

Embeddings Create

var response = await service.Embeddings.Create("The food was delicious and the waiter...", "text-embedding-ada-002", "test");

Models Get All

var response = await service.Models.Get();

Models Get By Id

var response = await service.Models.Get("babbage");

Moderation Create

var response = await service.Moderation.Create("input text", "test");

Testing

This project has 100% code coverage with unit tests and a 100% pass rate with Stryker mutation testing.

See latest Stryker report here.

We also have integration tests for each service.

This should provide confidence in the library going forward.

Supported APIs

Full support of all current APIs

Contributions

Contributions are welcome.

Minimum requirements for any PRs:

  • MUST include unit tests and maintain 100% coverage.

  • MUST pass Stryker mutation testing with a 100% score.

  • SHOULD have integration tests.

Buy me a coffee

https://www.buymeacoffee.com/timdoestech?new=1