
    Class TextChunk

    Represents a chunk of text content in an LLM streaming response. Used for real-time display of AI-generated text as it arrives from the model.

    Inheritance
    object
    TextChunk
    Derived
    ToolCallArguments
    Implements
    ITextChunk
    Inherited Members
    object.Equals(object)
    object.Equals(object, object)
    object.GetHashCode()
    object.GetType()
    object.MemberwiseClone()
    object.ReferenceEquals(object, object)
    Namespace: Glitch9.AIDevKit
    Assembly: Glitch9.AIDevKit.dll
    Syntax
    public class TextChunk : ITextChunk

    Constructors

    TextChunk()

    Declaration
    public TextChunk()
    TextChunk(TextChunkType, string, Annotation[])

    Creates a new LLM text chunk with the specified type, content, and optional annotations.

    Declaration
    public TextChunk(TextChunkType type, string content, Annotation[] annotations = null)
    Parameters
    Type Name Description
    TextChunkType type

    The type of text content this chunk represents.

    string content

    The text content generated by the LLM.

    Annotation[] annotations

    Optional metadata annotations (e.g., citations, file references).
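
    A minimal construction sketch. Note that `TextChunkType.Text` is an assumed enum member (the enum's members are not listed on this page), and the annotation value is left out because the `Annotation` type's shape is likewise undocumented here:

    ```csharp
    using Glitch9.AIDevKit;

    // Plain chunk: the annotations parameter defaults to null.
    var plain = new TextChunk(TextChunkType.Text, "Hello from the model!");

    // Chunk carrying annotation metadata. TextChunkType.Text is an assumption;
    // consult the TextChunkType enum for the actual member names.
    Annotation[] citations = GetCitations(); // hypothetical helper
    var annotated = new TextChunk(
        TextChunkType.Text,
        "See the attached report for details.",
        citations);
    ```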

    Properties

    Annotations

    Additional metadata annotations attached to the text. Annotations can include citations, file references, or other contextual information that the LLM attaches to specific text segments. For example, when the model references a document, it may include a citation annotation pointing to the source file and character range.

    Declaration
    public Annotation[] Annotations { get; set; }
    Property Value
    Type Description
    Annotation[]
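
    A sketch of consuming annotations after a response completes. The null check matters because the constructor's annotations parameter defaults to null; `Debug.Log` assumes the Unity context this SDK targets, and `Annotation`'s members are not documented on this page, so only the instance itself is logged:

    ```csharp
    // Annotations may be null when the model attached no metadata.
    if (chunk.Annotations != null)
    {
        foreach (Annotation annotation in chunk.Annotations)
        {
            Debug.Log(annotation); // e.g., a citation or file reference
        }
    }
    ```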
    FinishReason

    The reason why the LLM stopped generating text (e.g., completed naturally, reached token limit, triggered a tool call, or was stopped by content filtering).

    Declaration
    public FinishReason? FinishReason { get; set; }
    Property Value
    Type Description
    FinishReason?
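
    Because the property is nullable, it can be used to distinguish intermediate deltas from the final chunk of a stream. The enum members named below (`Stop`, `Length`) follow common LLM API conventions and are assumptions; the actual members live on the `FinishReason` enum:

    ```csharp
    if (chunk.FinishReason.HasValue)
    {
        // Final chunk of the stream.
        switch (chunk.FinishReason.Value)
        {
            case FinishReason.Stop:   // completed naturally (assumed member name)
                break;
            case FinishReason.Length: // hit the token limit (assumed member name)
                break;
            default:                  // tool call, content filter, etc.
                break;
        }
    }
    ```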
    Role

    The role of the message author (typically Assistant for LLM responses).

    Declaration
    public ChatRole Role { get; set; }
    Property Value
    Type Description
    ChatRole
    Text

    The actual text content generated by the LLM. Can be a partial chunk (delta) during streaming or the complete text when the response is finished.

    Declaration
    public string Text { get; set; }
    Property Value
    Type Description
    string
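
    Since `Text` may hold only a partial delta during streaming, callers typically accumulate chunks into the full response. The callback shape below is hypothetical (this page documents only the `TextChunk` type), but the accumulation pattern is standard:

    ```csharp
    using System.Text;

    var builder = new StringBuilder();

    // Hypothetical per-chunk callback wired to the streaming response.
    void OnChunkReceived(TextChunk chunk)
    {
        builder.Append(chunk.Text);        // append the partial delta

        if (chunk.FinishReason != null)    // final chunk: response complete
        {
            string fullText = builder.ToString();
            Debug.Log(fullText);
        }
    }
    ```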
    Type

    The type of text content being streamed from the LLM.

    Declaration
    public TextChunkType Type { get; set; }
    Property Value
    Type Description
    TextChunkType

    Methods

    ToString()

    Declaration
    public override string ToString()
    Returns
    Type Description
    string
    Overrides
    object.ToString()

    Operators

    implicit operator string(TextChunk)

    Declaration
    public static implicit operator string(TextChunk delta)
    Parameters
    Type Name Description
    TextChunk delta
    Returns
    Type Description
    string
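
    The implicit conversion lets a `TextChunk` be used wherever a `string` is expected. The operator's exact mapping is not documented here, though it presumably yields the chunk's text; `uiLabel` below is a hypothetical Unity UI element:

    ```csharp
    string delta = chunk;    // implicit conversion, no cast needed
    uiLabel.text += chunk;   // e.g., appending streamed deltas to UI text
    ```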

    Implements

    ITextChunk

    Extension Methods

    ArrayExtensions.ToArrayOrEmpty<T>(T)
    ArrayExtensions.ToArrayOrNull<T>(T)
    EventExtensions.ToDelta<T>(T, string, int, string, bool)
    EventExtensions.ToDone<T>(T, string, int, string)
    EventExtensions.ToEvent<T>(T)
    EventExtensions.UpcastDelta<T1, T2>(T1, string, int, string, bool)
    FallbackExtensions.IsOr<TParent, TChild>(TParent, TChild)
    ResponseCastingExtensions.GetResult<T>(T)
    SystemExtensions.GetName(object)