LLM Providers

Instructor is provider-agnostic. Provider selection is handled through LLMConfig and the Polyglot runtime that powers the underlying inference calls. Once the configuration is resolved, your application code stays the same regardless of which provider you use.

Selecting a Provider

The two most common approaches are selecting a named preset or passing an explicit config object:

use Cognesy\Instructor\StructuredOutput;
use Cognesy\Polyglot\Inference\Config\LLMConfig;

// Use a named preset
$so = StructuredOutput::using('anthropic');

// Build from a config object
$so = StructuredOutput::fromConfig(
    LLMConfig::fromPreset('openai')
);
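Either way, the resulting object is used identically afterwards. A minimal sketch, assuming a hypothetical `Person` response model (define your own class with public properties):

```php
<?php

use Cognesy\Instructor\StructuredOutput;

// Hypothetical response model for illustration.
class Person {
    public string $name;
    public int $age;
}

// The same extraction call works regardless of which preset was selected.
$person = StructuredOutput::using('anthropic')
    ->with(
        messages: 'Jason is 28 years old.',
        responseModel: Person::class,
    )
    ->get();
```

Swapping `'anthropic'` for any other preset name leaves the rest of the call unchanged, which is the point of the provider-agnostic design.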

Presets are YAML files stored in the config/llm/presets directory. Each file defines the API URL, driver, default model, and other provider-specific settings.
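For orientation, a preset file might look roughly like the sketch below. The key names and value syntax here are illustrative, not authoritative; use one of the bundled files in config/llm/presets as the reference for the exact schema.

```yaml
# config/llm/presets/my-provider.yaml (illustrative sketch)
apiUrl: 'https://api.my-provider.example.com/v1'
apiKey: '%env(MY_PROVIDER_KEY)%'   # assumed env-var placeholder syntax
driver: 'openai-compatible'
defaultModel: 'my-model'
maxTokens: 1024
```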

Supported Providers

The following providers have built-in presets:

| Provider | Preset name |
| --- | --- |
| AI21 | a21 |
| Anthropic | anthropic |
| AWS Bedrock | aws-bedrock |
| Azure OpenAI | azure |
| Cerebras | cerebras |
| Cohere | cohere |
| DeepSeek | deepseek |
| DeepSeek (Reasoning) | deepseek-r |
| Fireworks | fireworks |
| Google Gemini | gemini |
| Gemini (OpenAI-compatible) | gemini-oai |
| GLM | glm |
| Groq | groq |
| Hugging Face | huggingface |
| Inception | inception |
| Meta | meta |
| MiniMax | minimaxi |
| MiniMax (OpenAI-compatible) | minimaxi-oai |
| Mistral | mistral |
| Moonshot / Kimi | moonshot-kimi |
| Ollama | ollama |
| OpenAI | openai |
| OpenAI Responses | openai-responses |
| OpenRouter | openrouter |
| Perplexity | perplexity |
| Qwen | qwen |
| SambaNova | sambanova |
| Together | together |
| xAI | xai |

Custom Providers

Any OpenAI-compatible API, including ones without a bundled preset, can be used by building an LLMConfig manually:

use Cognesy\Polyglot\Inference\Config\LLMConfig;
use Cognesy\Instructor\StructuredOutput;

$config = new LLMConfig(
    apiUrl: 'https://my-provider.example.com/v1',
    apiKey: $_ENV['MY_PROVIDER_KEY'],
    model: 'my-model',
    driver: 'openai-compatible',
    maxTokens: 2048,
);

$result = StructuredOutput::fromConfig($config)
    ->with(
        messages: 'Extract the data.',
        responseModel: MyModel::class,
    )
    ->get();