# LocalAI/backend/python/vllm

Latest commit: `6a382a1afe71ec7902e79882f0b4470232bc445a`
fix(transformers): try to pin to working release (#5426)
Ettore Di Giacinto <mudler@localai.io>, 2025-05-22 12:50:51 +02:00
| File | Last commit | Date |
| --- | --- | --- |
| `backend.py` | fix: vllm missing logprobs (#5279) | 2025-04-30 12:55:07 +00:00 |
| `install.sh` | chore(deps): bump grpcio to 1.68.1 (#4301) | 2024-12-02 19:13:26 +01:00 |
| `Makefile` | … | |
| `README.md` | … | |
| `requirements-after.txt` | … | |
| `requirements-cpu.txt` | fix(transformers): try to pin to working release (#5426) | 2025-05-22 12:50:51 +02:00 |
| `requirements-cublas11-after.txt` | … | |
| `requirements-cublas11.txt` | fix(transformers): try to pin to working release (#5426) | 2025-05-22 12:50:51 +02:00 |
| `requirements-cublas12-after.txt` | … | |
| `requirements-cublas12.txt` | fix(transformers): try to pin to working release (#5426) | 2025-05-22 12:50:51 +02:00 |
| `requirements-hipblas.txt` | fix(transformers): try to pin to working release (#5426) | 2025-05-22 12:50:51 +02:00 |
| `requirements-install.txt` | … | |
| `requirements-intel.txt` | fix(intel): pin torch and intel-extensions (#4435) | 2024-12-19 15:39:32 +01:00 |
| `requirements.txt` | chore(deps): bump grpcio to 1.72.0 (#5244) | 2025-04-25 21:32:37 +02:00 |
| `run.sh` | … | |
| `test.py` | fix: vllm missing logprobs (#5279) | 2025-04-30 12:55:07 +00:00 |
| `test.sh` | … | |
## README.md

Creating a separate environment for the vllm project:

```
make vllm
```
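A minimal usage sketch of this backend, assuming a local LocalAI checkout. The roles of `install.sh` and `run.sh` are inferred from this directory listing, not documented here, so treat the comments as assumptions:

```shell
# Hedged sketch: run from the root of a LocalAI checkout.
cd backend/python/vllm

# Create the dedicated Python environment for vllm
# (the Makefile target presumably delegates to install.sh).
make vllm

# Start the backend inside that environment
# (run.sh is assumed to be the launcher, by analogy with the other scripts here).
bash run.sh
```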