TiktokenSharp 1.0.6

There is a newer version of this package available.
See the version list below for details.
dotnet add package TiktokenSharp --version 1.0.6
                    
NuGet\Install-Package TiktokenSharp -Version 1.0.6
                    
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="TiktokenSharp" Version="1.0.6" />
                    
For projects that support PackageReference, copy this XML node into the project file to reference the package.
Directory.Packages.props
<PackageVersion Include="TiktokenSharp" Version="1.0.6" />

Project file
<PackageReference Include="TiktokenSharp" />

For projects that support Central Package Management (CPM), copy the PackageVersion node into the solution Directory.Packages.props file and the PackageReference node into the project file to version the package.
paket add TiktokenSharp --version 1.0.6
                    
#r "nuget: TiktokenSharp, 1.0.6"
                    
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
#:package TiktokenSharp@1.0.6
                    
#:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.
Install as a Cake Addin
#addin nuget:?package=TiktokenSharp&version=1.0.6

Install as a Cake Tool
#tool nuget:?package=TiktokenSharp&version=1.0.6

TiktokenSharp

Since no C# implementation of the cl100k_base encoding (used by gpt-3.5-turbo) was available, I have implemented a basic solution with encoding and decoding methods, based on the official Rust implementation.

Currently, cl100k_base and p50k_base have been implemented. Other encodings will be added in future releases. If you encounter any issues or have questions, please feel free to submit them on the Issues page.

If you want to use the ChatGPT C# library that integrates this repository and implements context-based conversation, please visit ChatGPTSharp.

Getting Started

TiktokenSharp is available as a NuGet package.

using TiktokenSharp;

// Use a model name
TikToken tikToken = TikToken.EncodingForModel("gpt-3.5-turbo");
var i = tikToken.Encode("hello world"); // [15339, 1917]
var d = tikToken.Decode(i);             // "hello world"

// Or use an encoding name directly
TikToken tikToken2 = TikToken.GetEncoding("cl100k_base");
var i2 = tikToken2.Encode("hello world"); // [15339, 1917]
var d2 = tikToken2.Decode(i2);            // "hello world"
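A common use of the encoder is counting the tokens in a prompt before sending it to the API. A minimal sketch, assuming Encode returns a list of token IDs as in the examples above (the count of 2 follows from the [15339, 1917] result shown there):

```csharp
using TiktokenSharp;

// The length of the encoded list is the token count for the given model.
TikToken enc = TikToken.EncodingForModel("gpt-3.5-turbo");
int tokenCount = enc.Encode("hello world").Count; // 2, per the example above
```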

When an encoder is used for the first time, the required tiktoken file is downloaded from the internet, which may take some time; once the download succeeds, subsequent uses load the cached file and require no download. You can set TikToken.PBEFileDirectory before creating an encoder to change where the downloaded tiktoken files are stored, or pre-download the files to avoid failures caused by network issues.
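Setting the storage path can be sketched as follows. The directory name here is just an example (any writable path works), and the assignment must happen before the first encoder is created:

```csharp
using System;
using System.IO;
using TiktokenSharp;

// Example cache directory (hypothetical path); set BEFORE creating any encoder
// so the .tiktoken files are downloaded to, and later loaded from, this location.
TikToken.PBEFileDirectory = Path.Combine(AppContext.BaseDirectory, "bpe");

TikToken tikToken = TikToken.EncodingForModel("gpt-3.5-turbo");
```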

Why are the tiktoken files not integrated into the package? On one hand, this would make the package size larger. On the other hand, I want to stay as consistent as possible with OpenAI's official Python code.

Below are the file download links: p50k_base.tiktoken cl100k_base.tiktoken

Efficiency Comparison

Some users have asked for an efficiency comparison. Here I use SharpToken as the baseline, with the cl100k_base encoder, on .NET 6.0 in Debug mode.

  • TiktokenSharp Version: 1.0.5
  • SharpToken Version: 1.0.28

CPU

<details> <summary>Code:</summary>

const string kLongText = "King Lear, one of Shakespeare's darkest and most savage plays, tells the story of the foolish and Job-like Lear, who divides his kingdom, as he does his affections, according to vanity and whim. Lear’s failure as a father engulfs himself and his world in turmoil and tragedy.";

// Requires: using System; using System.Diagnostics; using TiktokenSharp; using SharpToken;

static void SpeedTiktokenSharp()
{
    TikToken tikToken = TikToken.GetEncoding("cl100k_base");
    Stopwatch stopwatch = Stopwatch.StartNew();

    // Encode/decode round trip, repeated 10,000 times
    for (int i = 0; i < 10000; i++)
    {
        var encoded = tikToken.Encode(kLongText);
        var decoded = tikToken.Decode(encoded);
    }

    stopwatch.Stop();
    Console.WriteLine($"SpeedTiktokenSharp = {stopwatch.Elapsed.TotalMilliseconds} ms");
}

static void SpeedSharpToken()
{
    var encoding = GptEncoding.GetEncoding("cl100k_base");
    Stopwatch stopwatch = Stopwatch.StartNew();

    for (int i = 0; i < 10000; i++)
    {
        var encoded = encoding.Encode(kLongText);
        var decoded = encoding.Decode(encoded);
    }

    stopwatch.Stop();
    Console.WriteLine($"SpeedSharpToken = {stopwatch.Elapsed.TotalMilliseconds} ms");
}

</details>

On this benchmark, TiktokenSharp takes about 57% less time than SharpToken (roughly 2.3x faster):

  • SpeedTiktokenSharp = 570.1206 ms
  • SpeedSharpToken = 1312.8812 ms

Memory

<details> <summary>Image:</summary>

(memory profiler screenshots, captured 2023-05-09)

</details>

TiktokenSharp uses approximately 26% less memory than SharpToken.

Update

1.0.6 20230625

  • Replaced WebClient with HttpClient and added async methods.

1.0.5 20230508

  • Added support for .NET Standard 2.0, making TiktokenSharp usable from the .NET Framework.

1.0.4 20230424

  • Added the method TikToken.GetEncoding(encodingName).

1.0.3 20230321

  • GetEncodingSetting now supports the gpt-4 model and also accepts encoding names directly.

1.0.2 20230317

  • Added TikToken.PBEFileDirectory to allow a custom storage directory for BPE files; the path must be set before calling TikToken.EncodingForModel().

1.0.1 20230313

  • Added the p50k_base encoding, which supports the text-davinci-003 model.

Compatible and additional computed target framework versions, by product:
.NET net5.0 was computed.  net5.0-windows was computed.  net6.0 was computed.  net6.0-android was computed.  net6.0-ios was computed.  net6.0-maccatalyst was computed.  net6.0-macos was computed.  net6.0-tvos was computed.  net6.0-windows was computed.  net7.0 was computed.  net7.0-android was computed.  net7.0-ios was computed.  net7.0-maccatalyst was computed.  net7.0-macos was computed.  net7.0-tvos was computed.  net7.0-windows was computed.  net8.0 was computed.  net8.0-android was computed.  net8.0-browser was computed.  net8.0-ios was computed.  net8.0-maccatalyst was computed.  net8.0-macos was computed.  net8.0-tvos was computed.  net8.0-windows was computed.  net9.0 was computed.  net9.0-android was computed.  net9.0-browser was computed.  net9.0-ios was computed.  net9.0-maccatalyst was computed.  net9.0-macos was computed.  net9.0-tvos was computed.  net9.0-windows was computed.  net10.0 was computed.  net10.0-android was computed.  net10.0-browser was computed.  net10.0-ios was computed.  net10.0-maccatalyst was computed.  net10.0-macos was computed.  net10.0-tvos was computed.  net10.0-windows was computed. 
.NET Core netcoreapp2.0 was computed.  netcoreapp2.1 was computed.  netcoreapp2.2 was computed.  netcoreapp3.0 was computed.  netcoreapp3.1 was computed. 
.NET Standard netstandard2.0 is compatible.  netstandard2.1 is compatible. 
.NET Framework net461 was computed.  net462 was computed.  net463 was computed.  net47 was computed.  net471 was computed.  net472 was computed.  net48 was computed.  net481 was computed. 
MonoAndroid monoandroid was computed. 
MonoMac monomac was computed. 
MonoTouch monotouch was computed. 
Tizen tizen40 was computed.  tizen60 was computed. 
Xamarin.iOS xamarinios was computed. 
Xamarin.Mac xamarinmac was computed. 
Xamarin.TVOS xamarintvos was computed. 
Xamarin.WatchOS xamarinwatchos was computed. 
Included target frameworks (in package):
  • .NETStandard 2.0

  • .NETStandard 2.1

    • No dependencies.

NuGet packages (5)

Showing the top 5 NuGet packages that depend on TiktokenSharp:

Package Downloads
Mythosia.AI

## What's New in v6.4.0

### Streaming Diagnostics (service-level)
- New WithStreamDiagnostics(d => d.OnRawLine(...).OnComplete(...)) extension on AIService — register once and every subsequent StreamAsync call automatically invokes the hooks. Same fluent builder pattern as WithRag.
- OnRawLine fires for every SSE line received (Debug-level trace of the wire payload). OnComplete fires exactly once on stream exit, success or failure, with a StreamDiagnostics snapshot (LinesRead, AccumulatedTextLength, LastRawLine, Elapsed).
- Read-time exceptions are wrapped in StreamReadException with Diagnostics attached and the original exception in InnerException — works regardless of whether WithStreamDiagnostics is registered.
- Especially useful against self-hosted vLLM / ollama / unstable proxies, where "the stream just stopped" needs to be told apart from "transport error after N chunks".

### Fixed: NotSupportedException at await foreach ... DisposeAsync()
- Replaced the synchronous using (var stream = ...) pattern across all 5 providers (10 SSE loops) with async stream disposal via the new ReadSseLinesAsync helper. Eliminates NotSupportedException thrown by HTTP transports whose stream rejects synchronous Dispose.
- The helper's finally block now guards every disposal step with try/catch so a Dispose-time failure cannot mask the real read-side exception, and OnComplete is guaranteed to fire even when disposal fails.

### Fixed: CopyFrom now propagates service-level callbacks
- AIService.CopyFrom(source) was silently dropping SystemMessageProvider (declared in v6.3.0 but missing from CopyFrom), StreamRawLineCallback, and StreamCompleteCallback.
- These delegates are now propagated by reference, so cross-provider switches (e.g., new AnthropicService(...).CopyFrom(openaiService) in a multi-provider chat UI) keep the registered diagnostics and system-message provider working without re-registration.

### Internal
- 5 provider streaming implementations (OpenAI/Grok/Qwen/vLLM, Anthropic, Google, DeepSeek, Perplexity) consolidated to a single ReadSseLinesAsync helper. Removes 9 duplicate SSE-reading loops. Requires Mythosia.AI.Abstractions v2.2.0.

## Documentation
- Basic Usage: https://github.com/AJ-comp/Mythosia.AI/wiki
- Advanced Features: https://github.com/AJ-comp/Mythosia.AI/wiki/Advanced-Features
- Release Notes: https://github.com/AJ-comp/Mythosia.AI/wiki/Release-Notes
- GitHub: https://github.com/AJ-comp/Mythosia.AI

NjxyChatAISDK

Supports calling OpenAI, DeepSeek, and Doubao via API.

Mythosia.AI.Providers.Alibaba

Alibaba Cloud Qwen provider package for Mythosia.AI. Includes QwenService with expanded Qwen 3 / 3.5 model constants, platform-specific thinking request handling across DashScope, vLLM, and Ollama, token usage streaming support, and Mythosia.AI v6.4.0 compatibility. Documentation - GitHub: https://github.com/AJ-comp/Mythosia.AI - Release Notes: core/Mythosia.AI.Providers.Alibaba/RELEASE_NOTES.md

TokenFlow.Tokenizers

Model-specific tokenizer implementations for TokenFlow.AI (OpenAI, Claude, etc.).

Serina.Semantic.Ai.Pipelines

Serina Pipelines for Semantic Kernel allows to build flexible Ai Proccessing pipelines.

GitHub repositories (3)

Showing the top 3 popular GitHub repositories that depend on TiktokenSharp:

Repository Stars
StartupHakk/OpenMonoAgent.ai
(BETA) AI shouldn't have a meter. Unlimited tokens. Forever. Your machine. Your agent. Use it from anywhere. Terminal-native coding agent powered by local LLMs — 100% open source, free forever, and installed with a single command. Proudly built on C#/.NET, because AI tooling should be infrastructure, not a subscription.
MayDay-wpf/AIBotPublic
AIBot PRO is a .NET 8-based AI aggregation client (consumer-facing, with light business support) that integrates many AI products (ChatGPT, Gemini, Claude, ERNIE Bot, Tongyi Qianwen, iFlytek Spark) with seamless switching between them in conversation, and supports knowledge bases, plugin development, an AI workflow engine, and an open platform exposing customized AI APIs.
dmitry-brazhenko/SharpToken
SharpToken is a C# library for tokenizing natural language text. It's based on the tiktoken Python library and designed to be fast and accurate.
Version Downloads Last Updated
1.2.1 15,801 2/22/2026
1.2.0 37,809 11/19/2025
1.1.8 71,436 8/14/2025
1.1.7 77,384 3/14/2025
1.1.6 24,588 12/24/2024
1.1.5 39,596 10/8/2024
1.1.4 125,723 5/14/2024
1.1.2 628 5/14/2024 (deprecated: critical bugs)
1.1.1 291 5/14/2024 (deprecated: critical bugs)
1.1.0 13,889 4/8/2024
1.0.9 23,747 2/8/2024
1.0.8 19,461 12/27/2023
1.0.7 38,488 10/10/2023
1.0.6 103,833 6/25/2023
1.0.5 53,675 5/8/2023
1.0.4 3,162 4/24/2023
1.0.3 3,338 3/21/2023
1.0.2 986 3/17/2023
1.0.1 1,644 3/13/2023
1.0.0 1,076 3/7/2023