Activate Proxy Route

POST /api/v1/proxies/{proxy_id}/activate
curl --request POST \
  --url https://api.example.com/api/v1/proxies/{proxy_id}/activate
{
  "id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "created_at": "2023-11-07T05:31:56Z",
  "updated_at": "2023-11-07T05:31:56Z",
  "org_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "name": "<string>",
  "description": "<string>",
  "system_prompt": "<string>",
  "intent": "<string>",
  "status": "draft",
  "source": "auto",
  "llm_provider": "<string>",
  "llm_model": "<string>",
  "llm_params": {},
  "proxy_url": "<string>"
}
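The request above can be sketched in Python. This is a minimal offline sketch, not an official client: it only builds the endpoint URL and parses the documented example body, since the Authorization header format is not specified on this page. No network call is made.

```python
import json

API_BASE = "https://api.example.com"  # base URL taken from the curl example


def activate_url(proxy_id: str) -> str:
    """Build the activation endpoint URL for a proxy (sketch, assumed base URL)."""
    return f"{API_BASE}/api/v1/proxies/{proxy_id}/activate"


# A real call would POST to activate_url(...), optionally sending the
# Authorization and X-Request-ID headers listed below. Here we only parse
# a trimmed copy of the documented example body to show what comes back.
example_body = """{
  "id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "status": "draft",
  "source": "auto",
  "llm_params": {},
  "proxy_url": ""
}"""

proxy = json.loads(example_body)
# status and source are enums per the schema below
assert proxy["status"] in ("draft", "active")
assert proxy["source"] in ("auto", "manual")
```

A POST to the built URL with an HTTP library of your choice would return the full body shown in the example response above.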

Headers

Authorization   string | null
X-Request-ID    string | null

Path Parameters

proxy_id   string<uuid>   required

Response

Successful Response

id              string<uuid>        required
created_at      string<date-time>   required
updated_at      string<date-time>   required
org_id          string<uuid>        required
name            string | null       required
description     string | null       required
system_prompt   string | null       required
intent          string | null       required
status          enum<string>        required (one of: draft, active)
source          enum<string>        required (one of: auto, manual)
llm_provider    string | null       required
llm_model       string | null       required
llm_params      object              required
proxy_url       string              required

Per-proxy runtime endpoint. Customers use this as their OpenAI base_url: {proxy_url}/openai/v1. Empty when PROXY_BASE_URL is not configured on the control plane.
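To illustrate the note above, here is a minimal sketch of pointing an OpenAI-compatible client at the proxy. The proxy_url value is a hypothetical placeholder (in practice, use the value from the activate response), and the openai client line is commented out so the snippet stays self-contained.

```python
# Sketch: route OpenAI traffic through the per-proxy runtime endpoint.
proxy_url = "https://proxy.example.com/3c90c3cc"  # hypothetical value from the response

if proxy_url:
    # Documented suffix: customers use {proxy_url}/openai/v1 as base_url.
    base_url = f"{proxy_url}/openai/v1"
    # from openai import OpenAI                          # assumed client library
    # client = OpenAI(base_url=base_url, api_key="...")  # key handling is an assumption
else:
    # An empty proxy_url means PROXY_BASE_URL is not configured on the control plane.
    raise RuntimeError("proxy_url is empty: PROXY_BASE_URL not configured")
```

Checking for an empty proxy_url before building base_url avoids sending requests to a malformed endpoint when the control plane has not been configured.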