Authentication
Authentication problems are among the most common issues when working with LLM APIs.
Symptoms
- Error messages containing terms like "authentication failed," "invalid API key," or "unauthorized"
- HTTP status codes 401 or 403
Solutions
- Verify API Key: Ensure your API key is correctly set in your environment variables
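For example, a quick check that the expected variables are actually visible to the PHP process (the variable names below are the commonly used defaults, so adjust them to your configuration):
<?php
// Report any provider keys that are missing or empty in the current process.
// The variable names are common defaults - adapt them to your setup.
$keys = ['OPENAI_API_KEY', 'ANTHROPIC_API_KEY', 'MISTRAL_API_KEY'];
foreach ($keys as $key) {
    $value = getenv($key);
    if ($value === false || trim($value) === '') {
        echo "$key is not set\n";
    } else {
        echo "$key is set (" . strlen($value) . " characters)\n";
    }
}
?>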
- Check API Key Format: Some providers require specific formats for API keys
// OpenAI keys typically start with 'sk-'
if (!str_starts_with(getenv('OPENAI_API_KEY'), 'sk-')) {
    echo "OpenAI API key format is incorrect\n";
}

// Anthropic keys typically start with 'sk-ant-'
if (!str_starts_with(getenv('ANTHROPIC_API_KEY'), 'sk-ant-')) {
    echo "Anthropic API key format is incorrect\n";
}
- Test Keys Directly: Use a simple script to test your API keys
<?php
use Cognesy\Polyglot\Inference\Inference;
use Cognesy\Http\Exceptions\HttpRequestException;

function testApiKey(string $preset): bool {
    try {
        // Send a minimal request through the given connection preset
        (new Inference)
            ->using($preset)
            ->with(
                messages: 'Test message',
                options: ['max_tokens' => 5],
            )
            ->get();
        echo "Connection preset '$preset' is working correctly\n";
        return true;
    } catch (HttpRequestException $e) {
        echo "Error with connection preset '$preset': " . $e->getMessage() . "\n";
        return false;
    }
}

// Test major providers
testApiKey('openai');
testApiKey('anthropic');
testApiKey('mistral');
?>
- Environment Variables: Ensure your environment variables are being loaded correctly (a variable visible in your shell is not automatically visible to PHP-FPM, cron jobs, or queue workers)
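A minimal sketch of such a check, assuming your keys live in a .env file loaded with vlucas/phpdotenv (both the package and the file location are assumptions, so adapt them to your setup):
<?php
require __DIR__ . '/vendor/autoload.php';

// Load the .env file from the project root (assumed location).
$dotenv = Dotenv\Dotenv::createImmutable(__DIR__);
$dotenv->safeLoad();

// Note: createImmutable() populates $_ENV / $_SERVER but does not call putenv(),
// so getenv() may still return false even after the file has been loaded.
echo isset($_ENV['OPENAI_API_KEY']) && $_ENV['OPENAI_API_KEY'] !== ''
    ? "OPENAI_API_KEY is loaded\n"
    : "OPENAI_API_KEY is missing - check your .env file and how it is loaded\n";
?>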