Deepseek
We support all Deepseek models; just add the deepseek/ prefix to the model name when sending completion requests.
API Key
# set your Deepseek API key as an environment variable
os.environ['DEEPSEEK_API_KEY'] = "your-api-key"
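Alternatively, completion() accepts an api_key argument, so the key can be passed per request instead of via the environment. A minimal sketch, using a placeholder key string:

from litellm import completion

response = completion(
    model="deepseek/deepseek-chat",
    messages=[{"role": "user", "content": "hello from litellm"}],
    api_key="your-deepseek-api-key",  # placeholder key passed per request instead of an env variable
)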
Sample Usage
from litellm import completion
import os

os.environ['DEEPSEEK_API_KEY'] = ""

response = completion(
    model="deepseek/deepseek-chat",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
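The returned object follows the OpenAI response format, so the assistant's reply can be read from the first choice rather than printing the whole object. A minimal sketch of that access pattern:

from litellm import completion
import os

os.environ['DEEPSEEK_API_KEY'] = ""

response = completion(
    model="deepseek/deepseek-chat",
    messages=[{"role": "user", "content": "hello from litellm"}],
)
# OpenAI-style response object: the reply text sits on the first choice
print(response.choices[0].message.content)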
Sample Usage - Streaming
from litellm import completion
import os

os.environ['DEEPSEEK_API_KEY'] = ""

response = completion(
    model="deepseek/deepseek-chat",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
    stream=True
)
for chunk in response:
    print(chunk)
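Each streamed chunk also follows the OpenAI streaming format, with the incremental text in choices[0].delta.content. A minimal sketch of rebuilding the full reply from the deltas, assuming the same streaming call as above:

from litellm import completion
import os

os.environ['DEEPSEEK_API_KEY'] = ""

response = completion(
    model="deepseek/deepseek-chat",
    messages=[{"role": "user", "content": "hello from litellm"}],
    stream=True,
)

# rebuild the full reply from the streamed deltas
full_text = ""
for chunk in response:
    delta = chunk.choices[0].delta.content
    if delta is not None:  # the final chunk may carry no content
        full_text += delta
print(full_text)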
Supported Models - ALL Deepseek Models Supported!
All Deepseek models are supported; just add the deepseek/ prefix to the model name when sending completion requests.
| Model Name | Function Call |
|---|---|
| deepseek-chat | completion(model="deepseek/deepseek-chat", messages) |
| deepseek-coder | completion(model="deepseek/deepseek-coder", messages) |
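Calling the coder model works the same way; only the model name after the deepseek/ prefix changes. A minimal sketch with an illustrative prompt (the prompt text is just an example):

from litellm import completion
import os

os.environ['DEEPSEEK_API_KEY'] = ""

# same call shape as deepseek-chat; only the model name differs
response = completion(
    model="deepseek/deepseek-coder",
    messages=[{"role": "user", "content": "Write a Python function that reverses a string"}],
)
print(response)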