Ollama Llama 3.1
Generate a completion
Generate a response for a given prompt with a provided model. This is a streaming endpoint.
POST {accessDomainName}/api/generate
Authorizations
  Salad-Api-Key (string, header, required)

Body (application/json)
  model (string, required, default: llama3.1:8b)
    The model name
  prompt (string)
    The prompt to generate a response for
  stream (boolean, default: false)
    Whether to stream the response or not
  options (object)
    Additional model parameters
  system (string)
    System prompt to override the model's definition
  template (string)
    The full prompt or prompt template to override the model's definition
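A minimal request sketch, using only the Python standard library. The access domain name and API key below are placeholders, not real values; substitute the ones from your own deployment. The block builds the JSON body described above and prints it; the commented lines show how the same body would be sent with `urllib.request`.

```python
import json

# Placeholders: replace with your deployment's access domain name and key.
ACCESS_DOMAIN_NAME = "https://example.salad.cloud"
API_KEY = "YOUR_SALAD_API_KEY"

headers = {
    "Salad-Api-Key": API_KEY,
    "Content-Type": "application/json",
}

payload = {
    "model": "llama3.1:8b",        # the default model name
    "prompt": "Why is the sky blue?",
    "stream": False,               # one JSON object instead of a stream
}

body = json.dumps(payload)
print(body)

# A real call would then be, for example:
#   import urllib.request
#   req = urllib.request.Request(f"{ACCESS_DOMAIN_NAME}/api/generate",
#                                data=body.encode(), headers=headers)
#   result = json.load(urllib.request.urlopen(req))
```

With `stream` left at its default of false, the endpoint returns a single JSON object matching the response schema below.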
Response
200 - application/json
  model (string)
    The model that produced the response
  created_at (string)
    Timestamp of the response
  response (string)
    The generated text (the full completion when stream is false; a partial chunk per object when streaming)
  done (boolean)
    True once generation has finished
  total_duration (integer)
    Total time spent generating the response, in nanoseconds
  load_duration (integer)
    Time spent loading the model, in nanoseconds
  sample_count (integer)
  sample_duration (integer)
  prompt_eval_count (integer)
    Number of tokens in the prompt
  prompt_eval_duration (integer)
    Time spent evaluating the prompt, in nanoseconds
  eval_count (integer)
    Number of tokens in the response
  eval_duration (integer)
    Time spent generating the response, in nanoseconds
  context (integer[])
    An encoding of the conversation, which can be sent in a subsequent request to preserve conversational memory
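When `stream` is true, the body arrives as one JSON object per line: intermediate objects carry partial text in `response`, and the final object has `done: true` plus the counters above. A sketch of parsing such a stream, using hard-coded lines as a stand-in for the HTTP body:

```python
import json

# Illustrative stand-in for a streamed response body (one JSON object per line).
raw_stream = b"\n".join([
    b'{"model":"llama3.1:8b","created_at":"2024-01-01T00:00:00Z",'
    b'"response":"Hello","done":false}',
    b'{"model":"llama3.1:8b","created_at":"2024-01-01T00:00:01Z",'
    b'"response":"","done":true,"eval_count":2,"eval_duration":500000000,'
    b'"context":[1,2,3]}',
])

text = ""
stats = None
for line in raw_stream.splitlines():
    chunk = json.loads(line)
    text += chunk["response"]   # accumulate the generated text
    if chunk["done"]:
        stats = chunk           # the final object carries the counters

# eval_count tokens over eval_duration nanoseconds gives tokens per second.
tokens_per_sec = stats["eval_count"] / (stats["eval_duration"] / 1e9)
print(text)
print(tokens_per_sec)
```

The durations are reported in nanoseconds, hence the division by 1e9 when computing throughput.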