Demo Chat
POST /api/demo/chat

Ephemeral chat endpoint for demo purposes. It is backed by a local vLLM deployment (Llama 4) and does not persist anything. Responses are streamed in a format compatible with the Vercel AI SDK.
Response Body: application/json
curl -X POST "https://<your-host>/api/demo/chat"
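For a runnable end-to-end sketch, the TypeScript snippet below sends a prompt and prints the streamed reply as it arrives. It assumes the endpoint accepts the { messages: [...] } JSON body that the Vercel AI SDK client sends by default, and BASE_URL is a placeholder for your server address; neither is documented above, so adjust both to match your deployment.

```ts
// Minimal sketch of calling the demo chat endpoint and reading its stream.
// Assumptions (not documented above): the request body uses the
// { messages: [...] } shape that the Vercel AI SDK client sends by default,
// and BASE_URL is a placeholder. Requires Node 18+ for the global fetch API.

const BASE_URL = "http://localhost:3000"; // placeholder base URL

async function demoChat(prompt: string): Promise<void> {
  const res = await fetch(`${BASE_URL}/api/demo/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [{ role: "user", content: prompt }],
    }),
  });

  if (!res.ok || !res.body) {
    throw new Error(`Demo chat request failed with status ${res.status}`);
  }

  // The response is streamed; print each chunk as soon as it arrives.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value, { stream: true }));
  }
}

demoChat("Say hello to the demo.").catch(console.error);
```

Because the endpoint persists nothing between calls, each request should carry whatever conversation history it needs in the messages array.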