Add python-multipart to requirements, GGUF support for CUDA backend
- Add python-multipart to requirements.txt, requirements-nvidia.txt, and requirements-vulkan.txt
- Add llama-cpp-python to requirements-nvidia.txt for GGUF support
- When using the CUDA/NVIDIA backend with a GGUF file, automatically use llama-cpp-python
```diff
@@ -5,6 +5,7 @@ pydantic>=2.5.0
 # CLI dependencies
 requests>=2.31.0  # for the coder CLI tool
+python-multipart>=0.0.6  # for multipart form data parsing
 # PyTorch - Uncomment the appropriate version for your system.
 # IMPORTANT: Use quotes around version specifiers to prevent shell interpretation!
```
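The automatic fallback to llama-cpp-python for GGUF files could be dispatched on the file extension. A minimal sketch of that selection logic follows; the function name `select_backend` and the `"transformers"` label for the default PyTorch path are assumptions for illustration, not the repository's actual code:

```python
from pathlib import Path


def select_backend(model_path: str) -> str:
    """Pick a model loader based on the file format (hypothetical helper).

    GGUF files are handled by llama-cpp-python; anything else falls back
    to the default PyTorch/transformers loading path.
    """
    if Path(model_path).suffix.lower() == ".gguf":
        return "llama-cpp-python"
    return "transformers"


# Example: a quantized GGUF checkpoint routes to llama-cpp-python,
# while a safetensors checkpoint keeps the default backend.
print(select_backend("models/llama-3-8b.Q4_K_M.gguf"))   # llama-cpp-python
print(select_backend("models/llama-3-8b.safetensors"))   # transformers
```

The actual loading call would then branch on this result, e.g. constructing `llama_cpp.Llama(model_path=...)` only when llama-cpp-python is selected, so the dependency is touched only on the NVIDIA backend where it is installed.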