(deployment-kserve)=

# KServe

vLLM can be deployed with KServe on Kubernetes for highly scalable distributed model serving.

Please see the KServe documentation for more details on using vLLM with KServe.
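
As an illustrative sketch, KServe's Hugging Face serving runtime (which uses vLLM as its backend for supported models) can be deployed with an `InferenceService` manifest along these lines. The model name, model ID, and resource values below are example placeholders; consult the KServe documentation for the options supported by your KServe version:

```yaml
# Example InferenceService using KServe's Hugging Face runtime.
# Names and values here are placeholders, not a tested configuration.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: huggingface-llm   # example name
spec:
  predictor:
    model:
      modelFormat:
        name: huggingface
      args:
        - --model_id=<your-hf-model-id>   # e.g. a Hugging Face model repo
      resources:
        limits:
          nvidia.com/gpu: "1"   # adjust to your cluster's GPU resources
```

Applying this with `kubectl apply -f` creates an autoscaled serving endpoint managed by KServe.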