What Is ai-plugin.json?

The ai-plugin.json manifest is a machine-readable file that describes an API plugin intended for consumption by AI systems — most notably large language models (LLMs) like ChatGPT. Placed at a well-known URL on your domain (/.well-known/ai-plugin.json), it tells AI systems what your service does, how to authenticate, and where to find the full API specification.

Think of it as the equivalent of a robots.txt or sitemap.xml, but for AI agents that want to call your API on behalf of users.
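As a sketch, here is how an agent might fetch and sanity-check a manifest in Python. The function names and the required-field set are illustrative, based on the v1 schema fields covered in this article:

```python
import json
import urllib.request

# Fields expected in a v1 manifest (illustrative; mirrors the field
# list described in this article, not an authoritative schema check).
REQUIRED_FIELDS = {
    "schema_version", "name_for_human", "name_for_model",
    "description_for_human", "description_for_model",
    "auth", "api", "logo_url", "contact_email", "legal_info_url",
}

def fetch_manifest(domain: str) -> dict:
    """Fetch a plugin manifest from its well-known location."""
    url = f"https://{domain}/.well-known/ai-plugin.json"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def missing_fields(manifest: dict) -> set:
    """Return the required manifest fields that are absent."""
    return REQUIRED_FIELDS - manifest.keys()
```

An agent would typically refuse to register a plugin whose manifest fails this kind of basic completeness check before ever reading the OpenAPI spec.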

The Anatomy of an ai-plugin.json File

A typical manifest includes the following fields:

| Field                 | Type   | Description                                                        |
|-----------------------|--------|--------------------------------------------------------------------|
| schema_version        | string | Version of the manifest schema (e.g., "v1")                        |
| name_for_human        | string | Display name shown to users                                        |
| name_for_model        | string | Short identifier the LLM uses internally                           |
| description_for_human | string | Plain-language description of what the plugin does                 |
| description_for_model | string | Technical prompt-style description to guide LLM behavior           |
| auth                  | object | Authentication method: none, service_http, user_http, or oauth    |
| api                   | object | Points to the OpenAPI spec (type + URL)                            |
| logo_url              | string | URL to the plugin's logo image                                     |
| contact_email         | string | Contact email for the plugin developer                             |
| legal_info_url        | string | URL to terms of service / legal page                               |
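Putting the fields together, a minimal manifest might look like the following. The "Weather Helper" service, its URLs, and the contact email are hypothetical:

```json
{
  "schema_version": "v1",
  "name_for_human": "Weather Helper",
  "name_for_model": "weather",
  "description_for_human": "Get current weather and forecasts for any city.",
  "description_for_model": "Use this plugin when the user asks about current weather or forecasts for a specific location. Do not use it for climate policy questions.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}
```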

The description_for_model Field — And Why It's Critical

Unlike description_for_human, the description_for_model field is written specifically for the LLM. It functions similarly to a system prompt, explaining:

  • When the LLM should use this plugin (trigger conditions)
  • What kinds of queries the API can and cannot answer
  • Any important caveats or data freshness notes
  • How to interpret responses from the API

This field directly influences how reliably an AI will invoke your plugin. A vague description leads to missed invocations or misuse; a precise one leads to accurate, contextually appropriate API calls.
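As an illustration, a reasonably precise description_for_model for a hypothetical stock-quote plugin might read:

```json
{
  "description_for_model": "Plugin for looking up real-time stock quotes and company fundamentals. Use when the user asks for a current price, market cap, P/E ratio, or recent earnings of a publicly traded company. Do not use for investment advice or news summaries. Quotes may be delayed up to 15 minutes; mention this when reporting a price. Responses are JSON; the 'price' field is in USD."
}
```

Note how it covers all four points above: trigger conditions, out-of-scope queries, a data-freshness caveat, and guidance on interpreting the response.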

Authentication Options

The auth block supports several modes:

  • None ("none"): No authentication required — suitable for public APIs.
  • Service HTTP ("service_http"): A single API key sent in a header, managed by the plugin operator.
  • User HTTP ("user_http"): Each user provides their own API key via bearer token or basic auth.
  • OAuth ("oauth"): Full OAuth 2.0 flow, allowing the AI to act on behalf of an authenticated user.
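For example, a service-level key could be declared as follows. The field names follow OpenAI's published auth schema; the token value is a placeholder issued during plugin registration:

```json
{
  "auth": {
    "type": "service_http",
    "authorization_type": "bearer",
    "verification_tokens": {
      "openai": "replace-with-token-from-registration"
    }
  }
}
```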

How It Connects to an OpenAPI Spec

The api field in the manifest points to an OpenAPI 3.0 specification (in YAML or JSON). This spec defines every endpoint the AI can call, along with parameter schemas, response formats, and descriptions. The LLM reads both the manifest and the OpenAPI spec to understand what operations are possible and how to construct valid requests.

Writing clear, descriptive summary and description fields in your OpenAPI spec is just as important as crafting the manifest — the LLM uses all of it for reasoning.
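As a sketch, a single endpoint in such a spec might look like this. The /forecast path, getForecast operation, and response schema are hypothetical:

```yaml
openapi: 3.0.1
info:
  title: Weather Helper API
  version: "1.0"
paths:
  /forecast:
    get:
      operationId: getForecast
      summary: Get the current forecast for a city
      description: Returns temperature and conditions for the given city name.
      parameters:
        - name: city
          in: query
          required: true
          schema:
            type: string
      responses:
        "200":
          description: Forecast data
          content:
            application/json:
              schema:
                type: object
                properties:
                  temperature_c:
                    type: number
                  conditions:
                    type: string
```

The operationId and description here are exactly what the LLM sees when deciding whether and how to call the endpoint, so they deserve the same care as description_for_model.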

Beyond ChatGPT: A Broader Standard

While ai-plugin.json originated with OpenAI's plugin ecosystem, the concept of a machine-readable API manifest for LLM consumption has influenced broader conversations around AI agent discoverability. As autonomous AI agents become more capable, standardized ways to advertise API capabilities — what a service does, how to call it, and what trust signals it provides — will become increasingly important across the industry.

Understanding ai-plugin.json today gives developers a head start on building AI-accessible services in an agentic future.