Class OpenRouterModel
Inheritance
object
OpenRouterModel
Namespace: Glitch9.AIDevKit.OpenRouter
Assembly: .dll
Syntax
public class OpenRouterModel
Fields
Dolphin_Mixtral_8x7b
OpenRouter's Large Language Model. A 16k-context fine-tune of Mixtral-8x7b. It excels at coding tasks thanks to extensive training on coding data and is known for its obedience, although it lacks DPO tuning. The model is uncensored, with alignment and bias stripped out, so it requires an external alignment layer for ethical use. Users are cautioned to use this highly compliant model responsibly, as detailed in the blog post on uncensored models at erichartford.com/uncensored-models. #moe #uncensored
Declaration
public const string Dolphin_Mixtral_8x7b = "cognitivecomputations/dolphin-mixtral-8x7b"
Field Value
| Type | Description |
|---|---|
| string | |
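The constant is simply the model identifier string passed to OpenRouter. Since the description notes this model ships without alignment and needs an external alignment layer, a minimal sketch (in Python for illustration; the SDK's own request API is not shown in this section) of a chat-completions body that supplies that layer via a system message:

```python
# Sketch: building an OpenRouter chat-completions request body for Dolphin
# Mixtral 8x7b. The system message plays the role of the "external alignment
# layer" the field description recommends. The surrounding HTTP call and
# endpoint are assumptions, not part of this SDK's documented API.
DOLPHIN_MIXTRAL_8X7B = "cognitivecomputations/dolphin-mixtral-8x7b"  # OpenRouterModel.Dolphin_Mixtral_8x7b

def build_request(user_prompt: str) -> dict:
    """Return a JSON body for POST https://openrouter.ai/api/v1/chat/completions."""
    return {
        "model": DOLPHIN_MIXTRAL_8X7B,
        "messages": [
            # External alignment layer: an explicit system instruction.
            {"role": "system", "content": "You are a helpful, harmless coding assistant."},
            {"role": "user", "content": user_prompt},
        ],
    }

body = build_request("Write a binary search in C#.")
print(body["model"])  # cognitivecomputations/dolphin-mixtral-8x7b
```

The hypothetical `build_request` helper is only for illustration; the point is that the constant's value is the `model` field of the request.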
Ministral8b
OpenRouter's Large Language Model. Ministral 8B is a state-of-the-art language model optimized for on-device and edge computing. Designed for efficiency in knowledge-intensive tasks, commonsense reasoning, and function calling, it features a specialized interleaved sliding-window attention mechanism that enables faster and more memory-efficient inference. Ministral 8B excels in local, low-latency applications such as offline translation, smart assistants, autonomous robotics, and local analytics. The model supports up to a 128k context length and can act as a performant intermediary in multi-step agentic workflows, efficiently handling tasks like input parsing, API calls, and task routing. It consistently outperforms comparable models such as Mistral 7B across benchmarks, making it particularly suitable for compute-efficient, privacy-focused scenarios.
Declaration
public const string Ministral8b = "mistral/ministral-8b"
Field Value
| Type | Description |
|---|---|
| string | |
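The description positions Ministral 8B as a routing intermediary in agentic workflows. A minimal sketch (in Python for illustration, with the routing prompt invented as an example) of how the constant's value would appear in an OpenRouter request body for such a task:

```python
import json

MINISTRAL_8B = "mistral/ministral-8b"  # OpenRouterModel.Ministral8b

# Task-routing sketch: a small, fast model classifies the user's request so a
# larger model or tool can handle it. The system prompt here is illustrative.
request_body = {
    "model": MINISTRAL_8B,
    "messages": [
        {"role": "system",
         "content": "Classify the user request as 'translate', 'summarize', or 'other'. Reply with one word."},
        {"role": "user", "content": "Please turn this paragraph into French."},
    ],
}

# Serialized form, as it would be POSTed to
# https://openrouter.ai/api/v1/chat/completions (Authorization header omitted).
print(json.dumps(request_body)["model" in request_body and 0:60])
```

In a multi-step workflow, the one-word classification returned by this call would decide which downstream model or function receives the request.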