Is it OK to use Hugging Face with PyTorch Lightning?


I have some Hugging Face models that I'd like to fine-tune with PEFT LoRA. I'd also like to use PyTorch Lightning Fabric for FSDP distributed training. However, I'm not sure if the two are compatible with each other. Does anyone have experience using them together?
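For concreteness, here is a rough sketch of the combination I have in mind (the base model and LoRA target modules are just placeholders, here for GPT-2):

```python
import torch
import lightning as L
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Launch Fabric with the FSDP strategy
fabric = L.Fabric(accelerator="cuda", devices=4, strategy="fsdp")
fabric.launch()

# Load a base model and attach LoRA adapters via PEFT
model = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder model
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# With FSDP, set up the model first, then build the optimizer over
# the trainable (LoRA) parameters only
model = fabric.setup_module(model)
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
optimizer = fabric.setup_optimizers(optimizer)
```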


2 Answers

LeviAckerman:

You don't need to use Lightning Fabric just for FSDP. You can stay within the Hugging Face ecosystem and use their Accelerate package instead. In the Accelerate config, set distributed_type to FSDP and provide an fsdp_config.
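To make that concrete, here is a minimal sketch of the programmatic equivalent, using Accelerate's FSDP plugin instead of the YAML config file (the model, optimizer, and data are placeholders):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import AutoModelForCausalLM
from accelerate import Accelerator, FullyShardedDataParallelPlugin

# Programmatic equivalent of distributed_type: FSDP plus an fsdp_config
# in the YAML; the plugin's defaults can be overridden (sharding
# strategy, auto-wrap policy, ...)
fsdp_plugin = FullyShardedDataParallelPlugin()
accelerator = Accelerator(fsdp_plugin=fsdp_plugin)

model = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
dataloader = DataLoader(
    TensorDataset(torch.randint(0, 50257, (64, 128))),  # dummy token ids
    batch_size=8,
)

# Accelerate wraps the model in FSDP and shards it across processes here
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)
```

Run the script with accelerate launch so the distributed processes get spawned for you.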

Take a look at the Accelerate documentation for the full details.

JobHunter69:

I found there's currently a known issue with LoRA under FSDP. I'm not sure how the Hugging Face/Lightning side factors into it. One common failure mode is that FSDP flattens parameters into units that must share a single requires_grad flag, which conflicts with LoRA freezing the base weights while training only the adapters.

https://github.com/h2oai/h2o-llmstudio/issues/98
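I can't say whether the linked issue is exactly this, but if it is the requires_grad mismatch described above, one workaround to try on the Lightning side is forwarding use_orig_params=True to FSDP; a minimal sketch, relying on Fabric's FSDPStrategy passing extra keyword arguments through to torch's FSDP wrapper:

```python
import lightning as L
from lightning.fabric.strategies import FSDPStrategy

# use_orig_params=True lets one FSDP unit hold frozen base weights and
# trainable LoRA weights side by side (mixed requires_grad); extra kwargs
# here are forwarded to torch.distributed.fsdp.FullyShardedDataParallel
strategy = FSDPStrategy(use_orig_params=True)
fabric = L.Fabric(accelerator="cuda", devices=4, strategy=strategy)
fabric.launch()
```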