Environments

For instructions on how to authenticate to use these endpoints, see the API overview.

Endpoints

POST   /api/environments/:project_id/llm_analytics/evaluation_config/set_active_key
POST   /api/environments/:project_id/llm_analytics/provider_key_validations
GET    /api/environments/:project_id/llm_analytics/provider_keys
POST   /api/environments/:project_id/llm_analytics/provider_keys
GET    /api/environments/:project_id/llm_analytics/provider_keys/:id
PATCH  /api/environments/:project_id/llm_analytics/provider_keys/:id
DELETE /api/environments/:project_id/llm_analytics/provider_keys/:id
POST   /api/environments/:project_id/llm_analytics/provider_keys/:id/validate
POST   /api/environments/:project_id/llm_analytics/summarization
POST   /api/environments/:project_id/llm_analytics/summarization/batch_check

Create environments llm analytics evaluation config set active key

Path parameters

  • project_id
    string

Example request

POST /api/environments/:project_id/llm_analytics/evaluation_config/set_active_key
export POSTHOG_PERSONAL_API_KEY=[your personal api key]
curl -X POST \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer $POSTHOG_PERSONAL_API_KEY" \
<ph_app_host>/api/environments/:project_id/llm_analytics/evaluation_config/set_active_key/

Example response

Status 200 No response body
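
If you prefer to call this endpoint from code, the following is a minimal sketch using the Python requests library. The host, project ID, and environment variable are placeholders matching the example above; they are not values defined by this API reference.

import os
import requests

PH_APP_HOST = "<ph_app_host>"  # your PostHog instance URL
PROJECT_ID = ":project_id"     # your project ID

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {os.environ['POSTHOG_PERSONAL_API_KEY']}",
}

# POST with no request body; a 200 status with no body indicates success.
response = requests.post(
    f"{PH_APP_HOST}/api/environments/{PROJECT_ID}/llm_analytics/evaluation_config/set_active_key/",
    headers=headers,
)
response.raise_for_status()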

Create environments llm analytics provider key validations

Validate LLM provider API keys without persisting them

Required API key scopes

llm_provider_key:write

Path parameters

  • project_id
    string

Example request

POST /api/environments/:project_id/llm_analytics/provider_key_validations
export POSTHOG_PERSONAL_API_KEY=[your personal api key]
curl -X POST \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer $POSTHOG_PERSONAL_API_KEY" \
<ph_app_host>/api/environments/:project_id/llm_analytics/provider_key_validations/

Example response

Status 201 No response body
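
The request body for this endpoint is not documented above. As an illustration only, and assuming it accepts the same provider and api_key fields used by the provider keys endpoints later on this page, a Python sketch could look like this:

import os
import requests

PH_APP_HOST = "<ph_app_host>"  # your PostHog instance URL
PROJECT_ID = ":project_id"     # your project ID

# Assumed payload: these field names are illustrative, not documented here.
payload = {"provider": "openai", "api_key": "your-provider-key"}

response = requests.post(
    f"{PH_APP_HOST}/api/environments/{PROJECT_ID}/llm_analytics/provider_key_validations/",
    headers={"Authorization": f"Bearer {os.environ['POSTHOG_PERSONAL_API_KEY']}"},
    json=payload,
)
# A 201 with no body indicates the validation request was accepted.
print(response.status_code)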

List all environments llm analytics provider keys

Required API key scopes

llm_provider_key:read

Path parameters

  • project_id
    string

Query parameters

  • limit
    integer
  • offset
    integer

Response


Example request

GET /api/environments/:project_id/llm_analytics/provider_keys
export POSTHOG_PERSONAL_API_KEY=[your personal api key]
curl \
-H "Authorization: Bearer $POSTHOG_PERSONAL_API_KEY" \
<ph_app_host>/api/environments/:project_id/llm_analytics/provider_keys/

Example response

Status 200
RESPONSE
{
"count": 123,
"next": "http://api.example.org/accounts/?offset=400&limit=100",
"previous": "http://api.example.org/accounts/?offset=200&limit=100",
"results": [
{
"id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
"provider": "openai",
"name": "string",
"state": "unknown",
"error_message": "string",
"api_key": "string",
"api_key_masked": "string",
"set_as_active": false,
"created_at": "2019-08-24T14:15:22Z",
"created_by": {
"id": 0,
"uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
"distinct_id": "string",
"first_name": "string",
"last_name": "string",
"email": "user@example.com",
"is_email_verified": true,
"hedgehog_config": {
"property1": null,
"property2": null
},
"role_at_organization": "engineering"
},
"last_used_at": "2019-08-24T14:15:22Z"
}
]
}
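
Because the response is paginated with limit and offset, you can follow the next links to collect every provider key. A rough Python sketch, with host and project ID as placeholders:

import os
import requests

PH_APP_HOST = "<ph_app_host>"  # your PostHog instance URL
PROJECT_ID = ":project_id"     # your project ID

headers = {"Authorization": f"Bearer {os.environ['POSTHOG_PERSONAL_API_KEY']}"}
url = f"{PH_APP_HOST}/api/environments/{PROJECT_ID}/llm_analytics/provider_keys/"
params = {"limit": 100}

keys = []
while url:
    response = requests.get(url, headers=headers, params=params)
    response.raise_for_status()
    page = response.json()
    keys.extend(page["results"])
    url = page["next"]   # None once the last page is reached
    params = None        # the "next" URL already carries limit and offset

for key in keys:
    print(key["provider"], key["api_key_masked"], key["state"])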

Create environments llm analytics provider keys

Required API key scopes

llm_provider_key:write

Path parameters

  • project_id
    string

Request parameters

  • provider
  • name
    string
  • api_key
    string
  • set_as_active
    boolean
    Default: false

Response


Example request

POST /api/environments/:project_id/llm_analytics/provider_keys
export POSTHOG_PERSONAL_API_KEY=[your personal api key]
curl -X POST \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer $POSTHOG_PERSONAL_API_KEY" \
<ph_app_host>/api/environments/:project_id/llm_analytics/provider_keys/ \
-d '{"provider": "openai", "name": "string"}'

Example response

Status 201
RESPONSE
{
"id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
"provider": "openai",
"name": "string",
"state": "unknown",
"error_message": "string",
"api_key": "string",
"api_key_masked": "string",
"set_as_active": false,
"created_at": "2019-08-24T14:15:22Z",
"created_by": {
"id": 0,
"uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
"distinct_id": "string",
"first_name": "string",
"last_name": "string",
"email": "user@example.com",
"is_email_verified": true,
"hedgehog_config": {
"property1": null,
"property2": null
},
"role_at_organization": "engineering"
},
"last_used_at": "2019-08-24T14:15:22Z"
}
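
The same create request in Python, built from the documented request parameters. All values are placeholders; provider is shown as openai only because that is the example value in the response above.

import os
import requests

PH_APP_HOST = "<ph_app_host>"  # your PostHog instance URL
PROJECT_ID = ":project_id"     # your project ID

payload = {
    "provider": "openai",            # example value from the response above
    "name": "Production OpenAI key", # placeholder display name
    "api_key": "your-provider-key",  # the provider key to store
    "set_as_active": False,          # defaults to false if omitted
}

response = requests.post(
    f"{PH_APP_HOST}/api/environments/{PROJECT_ID}/llm_analytics/provider_keys/",
    headers={"Authorization": f"Bearer {os.environ['POSTHOG_PERSONAL_API_KEY']}"},
    json=payload,
)
response.raise_for_status()
print(response.json()["id"])  # UUID of the newly created key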

Retrieve environments llm analytics provider keys

Required API key scopes

llm_provider_key:read

Path parameters

  • id
    string
  • project_id
    string

Response


Example request

GET /api/environments/:project_id/llm_analytics/provider_keys/:id
export POSTHOG_PERSONAL_API_KEY=[your personal api key]
curl \
-H "Authorization: Bearer $POSTHOG_PERSONAL_API_KEY" \
<ph_app_host>/api/environments/:project_id/llm_analytics/provider_keys/:id/

Example response

Status 200
RESPONSE
{
"id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
"provider": "openai",
"name": "string",
"state": "unknown",
"error_message": "string",
"api_key": "string",
"api_key_masked": "string",
"set_as_active": false,
"created_at": "2019-08-24T14:15:22Z",
"created_by": {
"id": 0,
"uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
"distinct_id": "string",
"first_name": "string",
"last_name": "string",
"email": "user@example.com",
"is_email_verified": true,
"hedgehog_config": {
"property1": null,
"property2": null
},
"role_at_organization": "engineering"
},
"last_used_at": "2019-08-24T14:15:22Z"
}

Update environments llm analytics provider keys

Required API key scopes

llm_provider_key:write

Path parameters

  • id
    string
  • project_id
    string

Request parameters

  • provider
  • name
    string
  • api_key
    string
  • set_as_active
    boolean
    Default: false

Response


Example request

PATCH /api/environments/:project_id/llm_analytics/provider_keys/:id
export POSTHOG_PERSONAL_API_KEY=[your personal api key]
curl -X PATCH \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer $POSTHOG_PERSONAL_API_KEY" \
<ph_app_host>/api/environments/:project_id/llm_analytics/provider_keys/:id/ \
-d '{"name": "string"}'

Example response

Status 200
RESPONSE
{
"id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
"provider": "openai",
"name": "string",
"state": "unknown",
"error_message": "string",
"api_key": "string",
"api_key_masked": "string",
"set_as_active": false,
"created_at": "2019-08-24T14:15:22Z",
"created_by": {
"id": 0,
"uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
"distinct_id": "string",
"first_name": "string",
"last_name": "string",
"email": "user@example.com",
"is_email_verified": true,
"hedgehog_config": {
"property1": null,
"property2": null
},
"role_at_organization": "engineering"
},
"last_used_at": "2019-08-24T14:15:22Z"
}
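
Because this is a PATCH, you only send the fields you want to change. For example, a Python sketch that promotes an existing key to the active one (host, project ID, and key ID are placeholders):

import os
import requests

PH_APP_HOST = "<ph_app_host>"  # your PostHog instance URL
PROJECT_ID = ":project_id"     # your project ID
KEY_ID = ":id"                 # UUID of the provider key to update

response = requests.patch(
    f"{PH_APP_HOST}/api/environments/{PROJECT_ID}/llm_analytics/provider_keys/{KEY_ID}/",
    headers={"Authorization": f"Bearer {os.environ['POSTHOG_PERSONAL_API_KEY']}"},
    json={"set_as_active": True},  # only the fields being changed
)
response.raise_for_status()
print(response.json()["set_as_active"])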

Delete environments llm analytics provider keys

Required API key scopes

llm_provider_key:write

Path parameters

  • id
    string
  • project_id
    string

Example request

DELETE /api/environments/:project_id/llm_analytics/provider_keys/:id
export POSTHOG_PERSONAL_API_KEY=[your personal api key]
curl -X DELETE \
-H "Authorization: Bearer $POSTHOG_PERSONAL_API_KEY" \
<ph_app_host>/api/environments/:project_id/llm_analytics/provider_keys/:id/

Example response

Status 204 No response body

Create environments llm analytics provider keys validate

Path parameters

  • id
    string
  • project_id
    string

Request parameters

  • provider
  • name
    string
  • api_key
    string
  • set_as_active
    boolean
    Default: false

Response


Example request

POST /api/environments/:project_id/llm_analytics/provider_keys/:id/validate
export POSTHOG_PERSONAL_API_KEY=[your personal api key]
curl -X POST \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer $POSTHOG_PERSONAL_API_KEY" \
<ph_app_host>/api/environments/:project_id/llm_analytics/provider_keys/:id/validate/ \
-d '{"provider": "openai", "name": "string"}'

Example response

Status 200
RESPONSE
{
"id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
"provider": "openai",
"name": "string",
"state": "unknown",
"error_message": "string",
"api_key": "string",
"api_key_masked": "string",
"set_as_active": false,
"created_at": "2019-08-24T14:15:22Z",
"created_by": {
"id": 0,
"uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
"distinct_id": "string",
"first_name": "string",
"last_name": "string",
"email": "user@example.com",
"is_email_verified": true,
"hedgehog_config": {
"property1": null,
"property2": null
},
"role_at_organization": "engineering"
},
"last_used_at": "2019-08-24T14:15:22Z"
}

Create environments llm analytics summarization

Generate an AI-powered summary of an LLM trace or event.

This endpoint analyzes the provided trace/event, generates a line-numbered text representation, and uses an LLM to create a concise summary with line references.

Summary Format:

  • 5-10 bullet points covering main flow and key decisions
  • "Interesting Notes" section for failures, successes, or unusual patterns
  • Line references in [L45] or [L45-52] format pointing to relevant sections

Use Cases:

  • Quick understanding of complex traces
  • Identifying key events and patterns
  • Debugging with AI-assisted analysis
  • Documentation and reporting

The response includes the summary text and optional metadata.

Required API key scopes

llm_analytics:write

Path parameters

  • project_id
    string

Request parameters

  • summarize_type
  • mode
    Default: minimal
  • data
  • force_refresh
    boolean
    Default: false
  • provider
  • model
    string

Response


Example request

POST /api/environments/:project_id/llm_analytics/summarization
export POSTHOG_PERSONAL_API_KEY=[your personal api key]
curl -X POST \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer $POSTHOG_PERSONAL_API_KEY" \
<ph_app_host>/api/environments/:project_id/llm_analytics/summarization/ \
-d '{"summarize_type": "trace", "data": {}}'

Example response

Status 200
RESPONSE
{
"summary": "## Summary\n- User initiated conversation with greeting [L5-8]\n- Assistant responded with friendly message [L12-15]\n\n## Interesting Notes\n- Standard greeting pattern with no errors",
"metadata": {
"text_repr_length": 450,
"model": "gpt-4.1"
}
}
Status 400
Status 403
Status 500
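
A Python sketch of the same call. The summarize_type value "trace" and the shape of data are assumptions based on the description above ("an LLM trace or event"); substitute the actual trace or event payload you want summarized.

import os
import requests

PH_APP_HOST = "<ph_app_host>"  # your PostHog instance URL
PROJECT_ID = ":project_id"     # your project ID

payload = {
    "summarize_type": "trace",  # assumed value; "trace or event" per the description
    "data": {},                 # the trace/event payload to summarize (placeholder)
    "force_refresh": False,     # reuse a cached summary if one exists
}

response = requests.post(
    f"{PH_APP_HOST}/api/environments/{PROJECT_ID}/llm_analytics/summarization/",
    headers={"Authorization": f"Bearer {os.environ['POSTHOG_PERSONAL_API_KEY']}"},
    json=payload,
)
response.raise_for_status()
result = response.json()
print(result["summary"])                         # markdown summary with [L45]-style references
print(result.get("metadata", {}).get("model"))   # metadata is optional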

Create environments llm analytics summarization batch check

Check which traces have cached summaries available.

This endpoint allows batch checking of multiple trace IDs to see which ones have cached summaries. Returns only the traces that have cached summaries with their titles.

Use Cases:

  • Load cached summaries on session view load
  • Avoid unnecessary LLM calls for already-summarized traces
  • Display summary previews without generating new summaries

Path parameters

  • project_id
    string

Request parameters

  • trace_ids
    array
  • mode
    Default: minimal
  • provider
  • model
    string

Response


Example request

POST /api/environments/:project_id/llm_analytics/summarization/batch_check
export POSTHOG_PERSONAL_API_KEY=[your personal api key]
curl -X POST \
-H 'Content-Type: application/json' \
-H "Authorization: Bearer $POSTHOG_PERSONAL_API_KEY" \
<ph_app_host>/api/environments/:project_id/llm_analytics/summarization/batch_check/ \
-d '{"trace_ids": ["string"]}'

Example response

Status 200
RESPONSE
{
"summaries": [
{
"trace_id": "string",
"title": "string",
"cached": true
}
]
}
Status 400
Status 403
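
A Python sketch that checks a batch of trace IDs and prints the titles of traces that already have cached summaries (trace IDs, host, and project ID are placeholders):

import os
import requests

PH_APP_HOST = "<ph_app_host>"  # your PostHog instance URL
PROJECT_ID = ":project_id"     # your project ID

trace_ids = ["trace-id-1", "trace-id-2"]  # placeholder trace IDs

response = requests.post(
    f"{PH_APP_HOST}/api/environments/{PROJECT_ID}/llm_analytics/summarization/batch_check/",
    headers={"Authorization": f"Bearer {os.environ['POSTHOG_PERSONAL_API_KEY']}"},
    json={"trace_ids": trace_ids},
)
response.raise_for_status()

# Only traces with a cached summary are returned; anything missing needs a
# fresh call to the summarization endpoint.
for entry in response.json()["summaries"]:
    print(entry["trace_id"], "-", entry["title"])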