Patch Proxy

PATCH /api/v1/proxies/{proxy_id}

Example Request
curl --request PATCH \
  --url https://api.example.com/api/v1/proxies/{proxy_id} \
  --header 'Content-Type: application/json' \
  --data '
{
  "name": "<string>",
  "description": "<string>",
  "system_prompt": "<string>",
  "intent": "<string>",
  "source": "auto",
  "llm_provider": "<string>",
  "llm_model": "<string>",
  "llm_params": {}
}
'

Example Response

{
  "id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "created_at": "2023-11-07T05:31:56Z",
  "updated_at": "2023-11-07T05:31:56Z",
  "org_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "name": "<string>",
  "description": "<string>",
  "system_prompt": "<string>",
  "intent": "<string>",
  "status": "draft",
  "source": "auto",
  "llm_provider": "<string>",
  "llm_model": "<string>",
  "llm_params": {},
  "proxy_url": "<string>"
}
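
The same request can be built in Python. This is a minimal sketch mirroring the curl example above; the API host, token, proxy ID, and field values are placeholders, and since PATCH is a partial update, only the fields being changed are included in the body. The request is constructed but not sent.

```python
import json
import urllib.request

# Placeholder values -- substitute your own API host and proxy ID.
API_BASE = "https://api.example.com"
PROXY_ID = "3c90c3cc-0d44-4b50-8888-8dd25736052a"

# Partial update: include only the fields to change; omitted fields
# keep their current values. Field names come from the Body section.
payload = {
    "name": "support-bot",       # hypothetical example value
    "llm_model": "gpt-4o",       # hypothetical example value
}

req = urllib.request.Request(
    url=f"{API_BASE}/api/v1/proxies/{PROXY_ID}",
    data=json.dumps(payload).encode("utf-8"),
    method="PATCH",
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <token>",  # optional, per the Headers section
        "X-Request-ID": "req-123",          # optional correlation ID
    },
)

# resp = urllib.request.urlopen(req)  # uncomment to actually send the request
print(req.get_method(), req.full_url)
```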

Headers

Authorization
string | null
X-Request-ID
string | null

Path Parameters

proxy_id
string<uuid>
required

Body

application/json
name
string | null
description
string | null
system_prompt
string | null
intent
string | null
source
enum<string> | null
Available options:
auto,
manual
llm_provider
string | null
llm_model
string | null
llm_params
Llm Params · object

Response

Successful Response

id
string<uuid>
required
created_at
string<date-time>
required
updated_at
string<date-time>
required
org_id
string<uuid>
required
name
string | null
required
description
string | null
required
system_prompt
string | null
required
intent
string | null
required
status
enum<string>
required
Available options:
draft,
active
source
enum<string>
required
Available options:
auto,
manual
llm_provider
string | null
required
llm_model
string | null
required
llm_params
Llm Params · object
required
proxy_url
string
required

The per-proxy runtime endpoint. Customers use it as their OpenAI base_url in the form {proxy_url}/openai/v1. Returned as an empty string when PROXY_BASE_URL is not configured on the control plane.
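
The base_url derivation described above can be sketched as a small helper. This is an illustration, not part of the API: the proxy_url value is hypothetical, and the empty-string check reflects the documented behavior when PROXY_BASE_URL is unset.

```python
def openai_base_url(proxy_url: str) -> str:
    """Derive the OpenAI-compatible base_url from a proxy's proxy_url field."""
    if not proxy_url:
        # proxy_url is empty when PROXY_BASE_URL is not configured on the
        # control plane, so there is no runtime endpoint to point at.
        raise ValueError("proxy_url is empty: PROXY_BASE_URL not configured")
    # Append the documented suffix, tolerating a trailing slash.
    return proxy_url.rstrip("/") + "/openai/v1"

# Hypothetical proxy_url as returned in the response body above.
print(openai_base_url("https://proxy.example.com/p/3c90c3cc"))
```

A customer would then pass this value as the base_url of their OpenAI client, pointing SDK traffic at the proxy instead of api.openai.com.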