Changing LLM model and options

You can specify the model and other options that will be passed to the OpenAI / LLM endpoint.

<?php
use Cognesy\Instructor\Instructor;

$person = (new Instructor)->respond(
    messages: [['role' => 'user', 'content' => $text]],
    responseModel: Person::class,
    model: 'gpt-3.5-turbo',
    options: [
        // custom temperature setting
        'temperature' => 0.0,
        // ... other options
    ],
);
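
The examples on this page assume a Person response model defined elsewhere in your code. A minimal sketch of such a class (the properties shown are illustrative, not part of the library) could look like this:

<?php
// Hypothetical response model; Instructor derives the extraction
// schema from the class's public properties
class Person
{
    public string $name;
    public int $age;
}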

Providing a custom client

You can pass a custom-configured client instance to the Instructor. This allows you to specify your own API key, base URI, or organization.

<?php
use Cognesy\Instructor\Instructor;
use Cognesy\Utils\Env;
use Cognesy\Instructor\Clients\OpenAI\OpenAIClient;

// Create an instance of the OpenAI client initialized with custom parameters
$client = new OpenAIClient(
    apiKey: Env::get('OPENAI_API_KEY'),
    baseUri: 'https://api.openai.com/v1',
    organization: '',
    connectTimeout: 3,
    requestTimeout: 30,
);

// Get an Instructor instance with the default client component overridden with your own
$instructor = (new Instructor)->withClient($client);

$person = $instructor->respond(
    messages: [['role' => 'user', 'content' => $text]],
    responseModel: Person::class,
    model: 'gpt-3.5-turbo',
    options: ['temperature' => 0.0],
);