from transformers import AutoTokenizer, DataCollatorForSeq2Seq, Seq2SeqTrainer

# Decoder-only models (GPT, OPT, BLOOM) need left padding for generation.
if any(k in model_args.model_name_or_path for k in ("gpt", "opt", "bloom")):
    padding_side = "left"
else:
    padding_side = "right"

tokenizer = AutoTokenizer.from_pretrained(
    model_args.tokenizer_name if model_args.tokenizer_name else model_args.model_name_or_path,
    padding_side=padding_side,
    use_fast=model_args.use_fast_tokenizer,
    revision=model_args.model_revision,
)

# GPT-2 has no pad token, so reuse the EOS token for padding.
if tokenizer.pad_token_id is None:
    tokenizer.pad_token_id = tokenizer.eos_token_id
...
trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=dataset.train_dataset if training_args.do_train else None,
    eval_dataset=dataset.eval_dataset if training_args.do_eval else None,
    compute_metrics=dataset.compute_metrics,
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
With GPT-2 there are no warning messages during training, but the message mentioned above appears during validation. As the code shows, padding_side="left" and pad_token_id = eos_token_id are already set as required.
Why might this message appear during validation? Is there anything else I need to do?
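
For reference, the following quick check passes in my setup (a minimal sketch using the tokenizer object from the snippet above), so the settings themselves look correct:

# Sanity check on the tokenizer defined above; both assertions pass for me.
assert tokenizer.padding_side == "left"
assert tokenizer.pad_token_id == tokenizer.eos_token_id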
