---
title: Deploy VLLM
emoji: 🐢
colorFrom: blue
colorTo: blue
sdk: docker
pinned: false
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

To regenerate `requirements.txt` from the Poetry lockfile:

```shell
poetry export -f requirements.txt --output requirements.txt --without-hashes
```
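Since the Space uses `sdk: docker`, Hugging Face builds it from a `Dockerfile` at the repo root. A minimal sketch of what such a Dockerfile might look like (the base image, entrypoint, and model-serving command are assumptions for illustration, not taken from this repo):

```dockerfile
# Illustrative Dockerfile sketch for a docker-SDK Space; details are assumptions.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies from the exported requirements file.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Hugging Face Spaces route traffic to port 7860 by default.
EXPOSE 7860

# Launch vLLM's OpenAI-compatible API server on the Space's port.
CMD ["python", "-m", "vllm.entrypoints.openai.api_server", "--host", "0.0.0.0", "--port", "7860"]
```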