Description
vLLM recently added support for the new Responses API, which provides more flexible, agentic capabilities than the older completion endpoints. KubeAI should add proxy support for the Responses API endpoint so that users can access this functionality seamlessly through KubeAI and build more advanced, interactive workflows on top of it.
vllm-project/vllm#20504
Relevant section in KubeAI router:
https://github.com/kubeai-project/kubeai/blob/main/internal/openaiserver/handler.go#L38
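A minimal sketch of what the change could look like, assuming the router dispatches on URL path prefixes (the prefix list, helper names, and backend URL here are illustrative, not KubeAI's actual code):

```go
package main

import (
	"fmt"
	"net/http/httputil"
	"net/url"
	"strings"
)

// apiPrefixes lists the OpenAI-compatible paths the proxy forwards to
// model backends. "/v1/responses" is the hypothetical new entry for the
// Responses API; the other entries mirror common OpenAI endpoints.
var apiPrefixes = []string{
	"/v1/chat/completions",
	"/v1/completions",
	"/v1/embeddings",
	"/v1/responses", // new: vLLM Responses API
}

// isProxiedPath reports whether a request path matches one of the
// recognized API prefixes and should be forwarded to a backend.
func isProxiedPath(path string) bool {
	for _, p := range apiPrefixes {
		if strings.HasPrefix(path, p) {
			return true
		}
	}
	return false
}

// newProxy builds a reverse proxy to the given backend base URL;
// requests matching isProxiedPath would be handed to it unchanged,
// so the Responses API payload passes through untouched.
func newProxy(backend string) (*httputil.ReverseProxy, error) {
	u, err := url.Parse(backend)
	if err != nil {
		return nil, err
	}
	return httputil.NewSingleHostReverseProxy(u), nil
}

func main() {
	fmt.Println(isProxiedPath("/v1/responses"))
	fmt.Println(isProxiedPath("/healthz"))
}
```

Because the proxy forwards request and response bodies opaquely, supporting the new endpoint may be as small as adding its path to the routing logic, though streaming responses and model-name extraction from the request body would need the same handling the existing endpoints get.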