Fix: Add peft to requirements.txt for LoRA adapter support
PEFT (Parameter-Efficient Fine-Tuning) is required for loading LoRA adapters with pipe.load_lora_weights(). Without it, LoRA loading fails with: 'PEFT backend is required for this method.'
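A quick way to surface this dependency early, rather than failing inside `pipe.load_lora_weights()`, is to probe for the `peft` package at startup. This is a minimal sketch; the helper name `has_peft` is illustrative, not part of the project:

```python
import importlib.util


def has_peft() -> bool:
    """Return True if the `peft` package is importable in this environment."""
    return importlib.util.find_spec("peft") is not None


# Hypothetical guard before LoRA loading (pipe construction omitted):
# if not has_peft():
#     raise RuntimeError("peft is required for pipe.load_lora_weights(); "
#                        "install it with: pip install 'peft>=0.7.0'")
```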
```diff
@@ -16,6 +16,7 @@ ftfy>=6.1.0
 Pillow>=10.0.0
 safetensors>=0.4.0
 huggingface-hub>=0.19.0
+peft>=0.7.0  # Required for LoRA adapter loading

 # Audio Dependencies (Optional - for TTS and music generation)
 scipy>=1.11.0
```