JSON Schema Match (json_schema_match)

Contents

Metric description

JSON schema match first ensures the output is valid JSON, then checks it against a JSON Schema (types, required fields, constraints). Use it when you need structural validation beyond “valid JSON”.

How to interpret the score

  • 100: JSON parses and conforms to the provided schema.
  • 0: invalid JSON, schema mismatch, or missing or invalid configuration for the check.
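Conceptually, the scoring rule can be sketched locally with the third-party jsonschema package. This is an assumption for illustration only; the service's actual implementation is not specified here, and the helper name local_schema_match is hypothetical:

import json

from jsonschema import Draft7Validator


def local_schema_match(output: str, schema: dict) -> int:
    """Approximate json_schema_match: 100 if output parses and conforms, else 0."""
    try:
        parsed = json.loads(output)
    except json.JSONDecodeError:
        return 0  # invalid JSON
    errors = list(Draft7Validator(schema).iter_errors(parsed))
    return 100 if not errors else 0  # any schema violation scores 0


schema = {
    "type": "object",
    "properties": {"count": {"type": "integer"}},
    "required": ["count"],
}
print(local_schema_match('{"count": 3}', schema))    # parses and conforms
print(local_schema_match('{"count": "3"}', schema))  # wrong type for count
print(local_schema_match('not json', schema))        # does not parse

Note the all-or-nothing scoring: a single schema violation drops the score to 0, the same as unparseable JSON.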

API usage

Prerequisites

Configure the AEGIS_API_KEY and AEGIS_API_BASE_URL environment variables (the example below loads them with python-dotenv), then create a JSON payload for the custom-runs request. For a field-by-field description of the payload (top-level keys, evaluations, and each row in data), see Custom run request body.
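A minimal .env file for the example script might look like this. The variable names come from the example code; both values are placeholders you must replace with your own:

# .env — read by load_dotenv() in the example script
AEGIS_API_KEY=your-api-key-here
AEGIS_API_BASE_URL=https://your-aegis-host.example.com/api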

Shortname: json_schema_match

Default threshold: 100

Structural metrics run without an LLM (deterministic checks). Your run may still include model_slug where the API expects it; scoring does not depend on it for this category.

Inputs (each object in data)

  • output (str, required): JSON text to validate.

metric_args

  • schema (object, optional): JSON Schema object to validate the output against.

  • validator (string, optional): Validator implementation: "Draft7Validator" or "Draft202012Validator". Default: "Draft7Validator".
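The validator choice matters when the schema uses keywords introduced after Draft 7. For example, prefixItems (tuple validation) exists only in Draft 2020-12; a Draft 7 validator treats it as an unknown keyword and ignores it. A sketch using the third-party jsonschema package, with an illustrative schema that is not from the source:

from jsonschema import Draft7Validator, Draft202012Validator

# Draft 2020-12 tuple validation: first item a string, second an integer.
schema = {
    "type": "array",
    "prefixItems": [{"type": "string"}, {"type": "integer"}],
}
instance = [42, "not-an-int"]  # violates both positions

# Draft 7 does not define `prefixItems`; unknown keywords are ignored.
print(len(list(Draft7Validator(schema).iter_errors(instance))))       # 0 errors
# Draft 2020-12 enforces it: one type error per position.
print(len(list(Draft202012Validator(schema).iter_errors(instance))))  # 2 errors

So the same output can score 100 under "Draft7Validator" and 0 under "Draft202012Validator"; pick the validator that matches the draft your schema is written for.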

Important

For real runs, pass a schema in metric_args that matches the shape you expect. Without a schema, validation cannot succeed.

Eval metadata

Structural metrics do not populate eval_metadata; the field is omitted or null on the result object.

Example

import json
import os

import requests
from dotenv import load_dotenv

load_dotenv(override=True)

_API_KEY = os.getenv("AEGIS_API_KEY")
_BASE_URL = os.getenv("AEGIS_API_BASE_URL")
_CUSTOM_RUN_URL = f"{_BASE_URL}/runs/custom"


def post_custom_run(payload: dict) -> requests.Response:
    """POST JSON payload to Aegis custom runs; returns the raw response."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {_API_KEY}",
    }
    return requests.post(
        _CUSTOM_RUN_URL,
        headers=headers,
        data=json.dumps(payload),
    )


if __name__ == "__main__":
    data = [
        {"output": "{\"count\": 3}"},
    ]

    payload = {
        "threshold": 100,
        "model_slug": "o4-mini",
        "is_blocking": True,
        "data_collection_id": None,
        "evaluations": [
            {
                "metrics": [
                    {
                        "metric": "json_schema_match",
                        "metric_args": {
                            "schema": {
                                "type": "object",
                                "properties": {"count": {"type": "integer"}},
                                "required": ["count"],
                            },
                            "validator": "Draft7Validator",
                        },
                    },
                ],
                "threshold": 100,
                "model_slug": "o4-mini",
                "data": data,
            }
        ],
    }

    response = post_custom_run(payload)
    response.raise_for_status()
    print(json.dumps(response.json(), indent=2))