Working directly with LLMs and JSON - Markdown JSON prompting
## Overview

When working with the Inference class, you can also generate JSON output
from the model inference. This is useful, for example, when you need to
process the response in a structured way or when you want to store
elements of the response in a database.
## Example

In this example we explicitly ask the model to generate JSON output by responding with a JSON object inside a Markdown code block.
This is useful for models that do not support JSON output mode directly.
We also include an example of the expected JSON output in the prompt to guide the model toward the correct response.
```php
<?php
require 'examples/boot.php';

use Cognesy\Messages\Messages;
use Cognesy\Polyglot\Inference\Inference;

$data = Inference::using('openai')
    ->with(
        messages: Messages::fromString(
            'What is the capital of France? '
            . 'Respond with a JSON object in a ```json``` code block containing "name", "population", and "founded". '
            . 'Use integer values for population and founded year (negative for BC). Do not include extra text. '
            . 'Example: {"name":"Paris","population":2139000,"founded":-250}'
        ),
        options: ['max_tokens' => 64, 'temperature' => 0],
    )
    ->asJsonData();

echo "USER: What is the capital of France?\n";
echo "ASSISTANT:\n";
dump($data);

assert(is_array($data), 'Response should be an array');
assert(isset($data['name']), 'Response should have "name" field');
assert(strpos($data['name'], 'Paris') !== false, 'City name should be Paris');
assert(isset($data['population']), 'Response should have "population" field');
assert(isset($data['founded']), 'Response should have "founded" field');
?>
```
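When a model wraps its JSON answer in a Markdown code block, the JSON still has to be located and decoded before you can use it. The sketch below illustrates that extraction step in plain PHP, using a hypothetical `extractJsonFromMarkdown()` helper; it is a minimal conceptual illustration, not the library's actual implementation of `asJsonData()`:

```php
<?php
// Hypothetical helper: pull a JSON object out of a model reply that may
// wrap it in a ```json fenced code block. Not the library's real code.
function extractJsonFromMarkdown(string $text): array {
    // Prefer the contents of a fenced ```json block; otherwise try the raw text.
    if (preg_match('/```json\s*(.*?)\s*```/s', $text, $m)) {
        $text = $m[1];
    }
    $data = json_decode($text, true);
    if (!is_array($data)) {
        throw new RuntimeException('No valid JSON object found in model output');
    }
    return $data;
}

// Simulated model reply with surrounding prose and a fenced JSON block.
$reply = "Here you go:\n```json\n{\"name\":\"Paris\",\"population\":2139000,\"founded\":-250}\n```";
$data = extractJsonFromMarkdown($reply);
var_dump($data);
```

Decoding with `json_decode($text, true)` returns an associative array, which is why the example above can assert on fields like `$data['name']` directly.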