LiteLLM
Integrate Infron AI through LiteLLM's OpenAI-compatible endpoints
Account & API Keys Setup
Usage - completion
```python
import litellm

response = litellm.completion(
    model="openai/<<Model Name>>",  # add `openai/` prefix so LiteLLM routes to the OpenAI-compatible handler
    api_key="<<API key>>",          # API key for your OpenAI-compatible endpoint
    api_base="https://llm.onerouter.pro/v1",  # API base of your custom OpenAI-compatible endpoint
    messages=[
        {
            "role": "user",
            "content": "Hey, how's it going?",
        }
    ],
)
print(response.json())
```
Usage - embedding
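Embeddings follow the same pattern as completion: keep the `openai/` prefix and point `api_base` at the same endpoint. A minimal sketch, assuming your endpoint serves an embedding model (the model name and key below are placeholders):

```python
import litellm

response = litellm.embedding(
    model="openai/<<Embedding Model Name>>",  # placeholder; `openai/` prefix routes to the OpenAI-compatible handler
    api_key="<<API key>>",                    # API key for your OpenAI-compatible endpoint
    api_base="https://llm.onerouter.pro/v1",
    input=["Hey, how's it going?"],           # a list of strings to embed
)
# each item in response.data carries one embedding vector
print(response.data[0]["embedding"][:5])
```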
Usage with LiteLLM Proxy Server
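Instead of calling the SDK directly, the endpoint can be registered in a LiteLLM Proxy Server `config.yaml`, so any OpenAI-compatible client can reach it through the proxy. A sketch, with the model alias, model name, and key as placeholders:

```yaml
model_list:
  - model_name: my-infron-model          # alias clients will request
    litellm_params:
      model: openai/<<Model Name>>       # `openai/` prefix routes to the OpenAI-compatible handler
      api_key: <<API key>>
      api_base: https://llm.onerouter.pro/v1
```

Start the proxy with `litellm --config config.yaml`; clients then send standard OpenAI requests to the proxy's address (by default `http://0.0.0.0:4000`) using `my-infron-model` as the model name.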
