Authentication
When authenticating with the model endpoint, the following authentication schemes are supported. Note that all authentication methods should only be used over HTTPS (TLS).
Bearer Authentication
Bearer authentication (also called token authentication) is an HTTP authentication scheme that involves security tokens called bearer tokens. The name “Bearer authentication” can be understood as “give access to the bearer of this token.” The bearer token is a cryptic string, usually generated by the server in response to a login request. The client must send this token in the Authorization header when making requests to protected resources:
Authorization: Bearer <token>
The Bearer authentication scheme was originally created as part of OAuth 2.0 in RFC 6750, but is sometimes also used on its own.
When using this authentication, the request sent to the model endpoint will look as follows:
curl <url>/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <token>" \
  -d '{
    "messages": [
      {
        ...
      }
    ]
  }'
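For reference, the same request can be sketched in Python using only the standard library; the base URL, token, and message below are placeholders:

```python
import json
from urllib.request import Request

def bearer_chat_request(base_url: str, token: str, messages: list) -> Request:
    """Build (but do not send) a chat completion request with Bearer auth."""
    body = json.dumps({"messages": messages}).encode("utf-8")
    return Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # The bearer token travels in the Authorization header, not the body.
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = bearer_chat_request("https://api.example.com/v1", "<token>",
                          [{"role": "user", "content": "Hello"}])
```

Sending the request (e.g. via `urllib.request.urlopen(req)`) then transmits the token with every call to the protected resource.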
To use Bearer authentication, provide the api_key argument when registering the model endpoint:
curl --request POST \
  --url http://127.0.0.1:5005/api/model-providers/model_endpoints/models \
  --header "X-LatticeFlow-API-Key: $LF_API_KEY" \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --data '{
    "modality": "text",
    "task": "chat_completion",
    "key": "gpt-4.5-preview",
    "api_key": "'"$OPENAI_API_KEY"'",
    "model_adapter_key": "openai",
    "url": "https://api.openai.com/v1",
    "name": "gpt-4.5-preview"
  }'
The api_key value is what gets sent to the model endpoint in the Authorization: Bearer header. Note the quoting: the single-quoted JSON payload is briefly closed and reopened around $OPENAI_API_KEY so that the shell can expand the variable.
API Keys
Some APIs use API keys for authorization. An API key is a token that a client provides when making API calls. The key can be sent as a request header:
X-API-Key: <token>
where X-API-Key can be an arbitrary header name selected by the user. As with Bearer authentication, the tokens used as API keys are meant to be a secret known only to the client and the server.
When using this authentication, the request sent to the model endpoint will look as follows:
curl <url>/chat/completions \
  -H "Content-Type: application/json" \
  -H "X-API-Key: <token>" \
  -d '{
    "messages": [
      {
        ...
      }
    ]
  }'
To use API Key authentication, provide the custom_headers argument when registering the model endpoint (the api_key argument still needs a placeholder value):
curl --request POST \
  --url http://127.0.0.1:5005/api/model-providers/model_endpoints/models \
  --header "X-LatticeFlow-API-Key: $LF_API_KEY" \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --data '{
    "modality": "text",
    "task": "chat_completion",
    "key": "gpt-4.5-preview",
    "api_key": "any value",
    "custom_headers": {
      "X-API-Key": "<token>"
    },
    "model_adapter_key": "openai",
    "url": "https://api.openai.com/v1",
    "name": "gpt-4.5-preview"
  }'
Each entry in custom_headers is sent as a request header, so X-API-Key: <token> is added to every request to the model endpoint.
The OpenAI schema and clients mandate that api_key is required even when custom API Key authentication is used. In this case, since the api_key is not used for authentication, any value can be passed.
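The interplay between api_key and custom_headers can be illustrated with a small sketch. This is not the actual adapter implementation; the merge behavior shown (custom headers fully replacing the default Bearer header) is an assumption for illustration:

```python
def compose_auth_headers(api_key: str, custom_headers: dict = None) -> dict:
    """Illustrative sketch (assumed behavior, not the real adapter):
    default to Bearer auth derived from api_key, but let custom_headers
    take over entirely when they are configured."""
    if custom_headers:
        # Custom API key auth: the registered api_key is only a schema
        # placeholder and does not appear in the outgoing request.
        return dict(custom_headers)
    # Bearer auth: the api_key becomes the bearer token.
    return {"Authorization": f"Bearer {api_key}"}
```

With Bearer auth, `compose_auth_headers("sk-123")` yields an Authorization header; with custom headers, `compose_auth_headers("any value", {"X-API-Key": "<token>"})` yields only the configured header and the placeholder api_key is ignored.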