    Namespace Glitch9.AIDevKit.LMStudio

    Classes

    LMStudioClient

    LMStudioEphemeralMcp

    Inline (ephemeral) MCP server configuration for native LM Studio chat requests. Passed as the "ephemeral_mcp" field in the request body. Requires explicit opt-in via LMStudioSettings.
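As a hedged sketch of where this field sits in the request body: only the "ephemeral_mcp" field name comes from the description above; the nested keys and values are illustrative assumptions, not the actual schema.

```python
# Sketch of a native chat request body carrying an inline (ephemeral) MCP
# server. The "ephemeral_mcp" field name is from the class description;
# the nested keys ("name", "command", "args") and the model id are assumptions.
import json

body = {
    "model": "qwen2.5-7b-instruct",  # any locally loaded model id (illustrative)
    "messages": [{"role": "user", "content": "List my files"}],
    "ephemeral_mcp": {  # inline MCP server config; requires opt-in via LMStudioSettings
        "name": "fs-tools",
        "command": "npx",
        "args": ["-y", "@example/mcp-filesystem"],
    },
}

payload = json.dumps(body)  # serialized request body
```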

    LMStudioMcpServer

    LMStudioModelInfo

    A single model entry from LM Studio's /v1/models endpoint. Follows the OpenAI model object format.

    LMStudioModelListResponse

    Response wrapper for GET /v1/models.
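Since the endpoint follows the OpenAI model-object format, the response can be parsed like any OpenAI model list. This sketch parses a sample payload rather than calling a live server; the sample model id and field values are illustrative, not real LM Studio output.

```python
# Parse a /v1/models response in the OpenAI list format:
# {"object": "list", "data": [{"id": ..., "object": "model", ...}]}
import json

sample = json.dumps({
    "object": "list",
    "data": [
        {"id": "qwen2.5-7b-instruct", "object": "model", "owned_by": "organization_owner"},
    ],
})

models = json.loads(sample)
ids = [m["id"] for m in models["data"]]  # extract the model identifiers
```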

    LMStudioNativeChatRequest

    Request model for LM Studio native /api/v1/chat endpoint (Track B). Supports stateful chat via conversation_id and MCP tool integrations. IMPORTANT: Do NOT confuse with ChatCompletionRequest; it is kept separate to avoid contaminating the unified API surface with native-only fields.

    LMStudioNativeChatResponse

    Response from LM Studio native /api/v1/chat endpoint (Track B). Contains conversation_id for stateful session continuation.
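The request/response pair above implies a simple continuation pattern: the first request omits conversation_id, and later requests echo the id returned by the previous response. A minimal sketch of building such bodies, assuming the transport details (URL, auth, model id) that the index does not specify:

```python
# Build native /api/v1/chat request bodies for stateful continuation.
# Field names "messages" and "conversation_id" follow the descriptions above;
# the model id is an illustrative assumption.
from typing import Optional

def build_native_chat_body(content: str, conversation_id: Optional[str] = None) -> dict:
    body = {
        "model": "qwen2.5-7b-instruct",
        "messages": [{"role": "user", "content": content}],
    }
    if conversation_id is not None:
        body["conversation_id"] = conversation_id  # continue the server-side session
    return body

first = build_native_chat_body("Hello")                  # first request: no id yet
later = build_native_chat_body("And then?", "conv_123")  # pass back the returned id
```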

    LMStudioNativeChoice

    LMStudioNativeStreamChoice

    LMStudioNativeStreamEvent

    Stream event from native /api/v1/chat (Track B, stream=true). The final event includes "stats"; intermediate events contain delta content.
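Per that description, a consumer can distinguish the final event by the presence of "stats". This sketch accumulates a simulated event stream; the shape of the delta events (choices/delta/content) is an assumption modeled on the OpenAI streaming format, and the stats fields are illustrative.

```python
# Accumulate delta content from a simulated native chat event stream,
# treating the event that carries "stats" as the final one.
events = [
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo"}}]},
    {"stats": {"tokens_per_second": 42.0}},  # final event: stats, no delta
]

text = ""
stats = None
for event in events:
    if "stats" in event:
        stats = event["stats"]  # capture final stats and stop treating it as a delta
        continue
    text += event["choices"][0]["delta"].get("content", "")
```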

    LMStudioPlugin

    A plugin entry for native LM Studio chat requests. Passed as elements of the "plugin" array in the request body.
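A small sketch of where that array sits in the request body; only the "plugin" field name comes from the description above, and the element shape is an illustrative assumption.

```python
# Native chat request body with a "plugin" array; each element is one
# plugin entry. The element keys shown here are hypothetical.
body = {
    "model": "qwen2.5-7b-instruct",  # illustrative model id
    "messages": [{"role": "user", "content": "Hi"}],
    "plugin": [
        {"name": "web-search"},  # hypothetical plugin entry
    ],
}
```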

    LMStudioResponseStats

    LMStudioSettings

    ModelLoadRequest

    ModelLoadResponse

    ModelStatusResponse

    ModelUnloadRequest

    NativeChatMessage

    NativeConversationSession

    Holds stateful session data for LM Studio native /api/v1/chat. Lifecycle: Create (first request, no id) → Continue (pass id) → Clear (drop reference). Server-side sessions expire automatically; no explicit delete endpoint exists.
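The lifecycle above can be sketched as a small holder for the server-issued id; this is an illustrative Python analogue of the C# class, not its actual API. "Clear" only drops the local reference, since the server expires sessions on its own.

```python
# Create -> Continue -> Clear lifecycle for a native chat session id.
from typing import Optional

class NativeConversationSession:
    def __init__(self) -> None:
        self.conversation_id: Optional[str] = None  # Create: no id until first response

    def on_response(self, conversation_id: str) -> None:
        self.conversation_id = conversation_id  # Continue: send this id on the next request

    def clear(self) -> None:
        self.conversation_id = None  # Clear: drop the local reference only

session = NativeConversationSession()
session.on_response("conv_123")  # id returned by the first response
session.clear()                  # session now expires server-side
```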
