The Blizzer API is OpenAI-compatible, which means your API key works with the OpenAI SDK. Just change `base_url` to point to your LiteLLM proxy. Example below:
```python
import openai

client = openai.OpenAI(
    api_key="your_api_key",
    base_url="https://api.blizzer.tech",  # Blizzer API is OpenAI compatible
)

response = client.chat.completions.create(
    model="kiri",
    messages=[
        {
            "role": "user",
            "content": "this is a test request, write a short poem",
        }
    ],
)

print(response)
```
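Because the API is OpenAI-compatible, the SDK call above is just serializing a standard chat-completions request and POSTing it to the `base_url` (typically at the `/chat/completions` path, though the exact route your proxy exposes may differ). A minimal sketch of the JSON body that request produces, shown for illustration:

```python
import json

# Illustrative only: the OpenAI-compatible request body that the SDK call
# above serializes and sends to the proxy's chat-completions endpoint.
payload = {
    "model": "kiri",
    "messages": [
        {"role": "user", "content": "this is a test request, write a short poem"}
    ],
}

print(json.dumps(payload, indent=2))
```

Any client that can produce this payload and send your API key as a bearer token should work, which is why the OpenAI SDK and LangChain integrations below need no special adapter.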
The same key also works with LangChain's `ChatOpenAI` wrapper:

```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage, SystemMessage

chat = ChatOpenAI(
    openai_api_base="https://api.blizzer.tech",
    openai_api_key="your_api_key",
    model="kiri",
    temperature=0.1,
)

messages = [
    SystemMessage(
        content="You are a helpful assistant that I'm using to make a test request to."
    ),
    HumanMessage(
        content="test from litellm. tell me why it's amazing in 1 sentence"
    ),
]

response = chat(messages)
print(response)
```
Developer Quick Start
Set up your environment and make your first API request in minutes
Want to go deeper? Explore our API reference to get an idea of everything that's possible with the API: