Hugging Face Transformers - trust_remote_code not working


I am currently working on a notebook to run falcon-7b-instruct myself, using Azure Machine Learning Studio. The code I use for running Falcon is from Hugging Face.

from transformers import AutoTokenizer
import transformers
import torch

model = "tiiuae/falcon-7b-instruct"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True
)

However, when I run the code, even though trust_remote_code is set to True, I get the following error.

ValueError: Loading tiiuae/falcon-7b-instruct requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option `trust_remote_code=True` to remove this error.

Is there perhaps an environment variable I have to set in order for remote code execution to work? I wasn't able to find anything about this error in the transformers documentation.


Rishabh Meshram (best answer):

I have reproduced the issue with the provided code snippet.

(Screenshot of the error output omitted.)

To solve this issue, you can install the latest packages, following the FalconStreaming notebook:

!pip install git+https://www.github.com/huggingface/transformers 
!pip install git+https://github.com/huggingface/accelerate 
!pip install bitsandbytes 
!pip install einops

Make sure to restart the kernel once the packages are installed.
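After the restart, one quick sanity check is to confirm that the notebook actually picked up a new enough transformers build before re-running the pipeline. Here is a minimal sketch using only the standard library; the exact minimum version that Falcon's remote-code path requires is an assumption here, so treat `4.27.0` as a placeholder:

```python
def parse_version(v: str) -> tuple:
    """Turn a version string like '4.30.2' into a comparable tuple (4, 30, 2)."""
    return tuple(int(part) for part in v.split(".")[:3] if part.isdigit())

def is_new_enough(installed: str, minimum: str = "4.27.0") -> bool:
    """Check whether the installed version meets the (assumed) minimum."""
    return parse_version(installed) >= parse_version(minimum)

# Example: an older release that would still raise the trust_remote_code error.
print(is_new_enough("4.26.1"))  # -> False
```

In the notebook you would pass `transformers.__version__` as `installed`; if the check comes back `False`, the kernel is most likely still running the pre-upgrade package.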

After installing the packages, the script executed successfully.