Using TPU on the Huggingface Pipeline throws PyTorch error


I used to run this script on GPUs on GCP, but I am now trying to run it on TPUs. As far as I can tell, TPUs should now work with the transformers pipeline.

However, trying to set the pipeline's device parameter throws:

RuntimeError: Cannot set version_counter for inference tensor

from transformers import pipeline
import torch
import torch_xla
import torch_xla.core.xla_model as xm

# Acquire the TPU device and hand it to the pipeline
device = xm.xla_device()

classifier = pipeline(
    "text-classification",
    model="bhadresh-savani/distilbert-base-uncased-emotion",
    return_all_scores=True,
    device=device,
)

def detect_emotions(emotion_input):
    """Model inference: return a label -> score mapping."""
    prediction = classifier(emotion_input)
    output = {}
    for emotion in prediction[0]:
        output[emotion["label"]] = emotion["score"]
    return output


detect_emotions("'Rest in Power: The Trayvon Martin Story' takes an emotional look back at the shooting that divided a nation")
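For reference, the device I pass to the pipeline comes from torch_xla. A portable sketch of that selection is below; the CPU/GPU fallback is my own addition so the snippet runs on machines without torch_xla, not something from the original script:

```python
import torch

# Prefer a TPU core when torch_xla is available; otherwise fall back
# to GPU/CPU (the fallback branch is an assumption added for
# illustration, not part of the original TPU-only setup).
try:
    import torch_xla.core.xla_model as xm
    device = xm.xla_device()
except ImportError:
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

print(device)
```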

How would this be rectified? What does this error even mean?
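Update: my current understanding (an assumption, not confirmed) is that the pipeline runs the model under torch.inference_mode(), so the outputs are "inference tensors", which carry no version counter at all; XLA then fails when it tries to set one. A minimal CPU-only illustration of how inference tensors behave:

```python
import torch

# Tensors created inside inference_mode are marked as inference
# tensors and skip version-counter bookkeeping entirely.
with torch.inference_mode():
    y = torch.ones(3) * 2

assert y.is_inference()

# Mutating an inference tensor outside inference mode is rejected,
# precisely because there is no version counter to update.
err = None
try:
    y.add_(1)
except RuntimeError as e:
    err = e
print(err)
```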
