LLM Service
The chat completions endpoint is: https://model-service.services.silogen.ai/v1/chat/completions
The base URL is: https://model-service.services.silogen.ai
The OpenAPI documentation is available here:
Remember to send the authentication token generated in the Authentication section with each request. An example is given below:
curl -X POST https://model-service.services.silogen.ai/v1/chat/completions \
  -H "Authorization: Bearer ${ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "THE_MODEL_NAME",
    "messages": [{"role": "user", "content": "YOUR PROMPT HERE"}],
    "collection": {"collectionId": "THE_COLLECTION_NAME"},
    "temperature": 0.2
  }'
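The same request can also be made from code. Below is a minimal Python sketch using the requests library, assuming ACCESS_TOKEN is set in the environment and that the endpoint returns an OpenAI-style chat completions response; the model and collection names are placeholders, exactly as in the curl example above.

# Minimal sketch: POST a chat completion request to the service.
# Assumes ACCESS_TOKEN is in the environment and that the model and
# collection names below are replaced with real values.
import os
import requests

BASE_URL = "https://model-service.services.silogen.ai"
ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]  # token from the Authentication section

payload = {
    "model": "THE_MODEL_NAME",                               # placeholder model name
    "messages": [{"role": "user", "content": "YOUR PROMPT HERE"}],
    "collection": {"collectionId": "THE_COLLECTION_NAME"},   # placeholder collection
    "temperature": 0.2,
}

response = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=60,
)
response.raise_for_status()

# Assuming an OpenAI-style response body, the generated text is in
# choices[0]["message"]["content"].
print(response.json()["choices"][0]["message"]["content"])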