Does the API work with Llama3 format to provide system + user-assistant conversation?
#29 opened by maxikq
Hi,
I need to provide context to the model by defining a system prompt plus few-shot examples as a user-assistant conversation, using the format described here:
https://llama.meta.com/docs/model-cards-and-prompt-formats/meta-llama-3
Does the API support this format? I couldn't find anything in the documentation about the expected format of the "inputs" field, and the Inference API example only shows plain text.
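To make it concrete, this is the kind of prompt string I'd expect to pass as "inputs" — a sketch based on the special-token template from the linked model card (the helper function and its names are my own, not from any API):

```python
def build_llama3_prompt(system, examples, user_message):
    """Render a system prompt, (user, assistant) example pairs, and a final
    user message into the Llama 3 special-token template."""
    prompt = "<|begin_of_text|>"
    prompt += f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
    for user, assistant in examples:
        prompt += f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        prompt += f"<|start_header_id|>assistant<|end_header_id|>\n\n{assistant}<|eot_id|>"
    prompt += f"<|start_header_id|>user<|end_header_id|>\n\n{user_message}<|eot_id|>"
    # Leave an open assistant header so the model generates the reply
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

prompt = build_llama3_prompt(
    "You are a helpful assistant.",
    [("What is 2+2?", "4")],
    "What is 3+3?",
)
print(prompt)
```

If the API just treats "inputs" as raw text, is building the string like this the intended usage, or is there a structured chat format I'm missing?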