When I try to create an AutoRegressiveBaseModelWithCovariates (from pytorch-forecasting) with the .from_dataset method and a TimeSeriesDataSet (in this case called "train_dataset"), I always get the following error, even though I do not use the keyword static_categoricals in my TimeSeriesDataSet:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Cell In[10], line 1
----> 1 model = AutoRegressiveBaseModelWithCovariates.from_dataset(dataset=train_dataset,
2 # hidden_size=HIDDEN,
3 # lstm_layers=LSTMLAYERS,
4 # dropout=DROPOUT,
5 # output_size=OUTPUT_SIZE,
6 # loss=QuantileLoss(quantiles=QUANTILES),
7 # attention_head_size=ATTHEADS,
8 # max_encoder_length=ENCODER_LEN_MAX,
9 # hidden_continuous_size=HIDDEN_CONTINIOUS,
10 learning_rate=LEARNING_RATE,
11 log_interval=LOG_INTERVAL,
12 reduce_on_plateau_patience = PLATEU_PATIENCE,
13 optimizer='adam',# weigth_decay
14 )
16 #todo Autoregressionmodel while handing over **kwargs gets error in BaseModel from_dataset function although it is working on TemporalFusionTransformer??
18 print(f"Number of parameters in TFT network: {model.size()/1e3:.1f}k")
File c:\Users\juo7ho\.conda\envs\hyperparam_pytorch\lib\site-packages\pytorch_forecasting\models\base_model.py:1679, in BaseModelWithCovariates.from_dataset(cls, dataset, allowed_encoder_known_variable_names, **kwargs)
1675 new_kwargs.update(kwargs)
1676 # print("basewithcovariates_from")
1677 # print(new_kwargs)
1678 #super is BaseModel
...
-> 1242 net = cls(**kwargs)
1243 net.dataset_parameters = dataset.get_parameters()
1244 if dataset.multi_target:
TypeError: BaseModel.__init__() got an unexpected keyword argument 'static_categoricals'
This is my dataset:
train_dataset = TimeSeriesDataSet(
    data=SG.df.loc[SG.df['Usecase'] == 'Train'],      # dataframe with sequence data
    time_idx='Sequence_idx',                          # integer column denoting the time index
    target='a',                                       # column/list denoting the target
    group_ids=['Sequence_ID'],                        # column names identifying a time series
    max_encoder_length=ENCODER_LEN_MAX,               # maximum length to encode
    min_encoder_length=ENCODER_LEN_MIN,               # minimum allowed length to encode
    max_prediction_length=PREDICTION_LEN_MAX,         # maximum prediction/decoder length
    min_prediction_length=PREDICTION_LEN_MIN,         # minimum prediction/decoder length
    time_varying_known_reals=["size"],                # continuous variables that change over time and are known in the future
    time_varying_unknown_reals=['a', 'b', 'c', 'd'],  # continuous variables that change over time and are NOT known in the future
    allow_missing_timesteps=False,                    # allow missing timesteps that are automatically filled up
    add_relative_time_idx=False,                      # add a relative time index as a feature
    add_target_scales=False,                          # add scales for the target to the static real features
    add_encoder_length=False,                         # add the encoder length to the list of static real variables
    target_normalizer=TorchNormalizer(),
    predict_mode=False                                # if True, take for each time series (identified by group_ids) the last max_prediction_length samples as prediction samples and everything before them, up to max_encoder_length samples, as encoder samples
)
I tried using a different class from pytorch-forecasting, TemporalFusionTransformer, and with that it worked flawlessly.
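For reference, the working call looked roughly like this (same dataset and keyword arguments as in the failing cell above; a sketch, not the exact notebook cell):

from pytorch_forecasting import TemporalFusionTransformer

# Same dataset and kwargs as the failing cell; TemporalFusionTransformer's
# __init__ accepts the covariate-related kwargs that from_dataset() derives
# from the dataset, so no TypeError is raised.
tft = TemporalFusionTransformer.from_dataset(
    train_dataset,
    learning_rate=LEARNING_RATE,
    log_interval=LOG_INTERVAL,
    reduce_on_plateau_patience=PLATEU_PATIENCE,
    optimizer='adam',
)
print(f"Number of parameters in TFT network: {tft.size()/1e3:.1f}k")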
I looked at the pytorch-forecasting code and noticed that AutoRegressiveBaseModelWithCovariates inherits from BaseModelWithCovariates, which inherits from BaseModel.
In the end, BaseModel.__init__ is called with the **kwargs that contain static_categoricals and throws the error mentioned above, because BaseModel.__init__ does not accept static_categoricals.
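My understanding (possibly wrong, hence the question) is that a concrete subclass would have to declare these covariate arguments in its own __init__, the way TemporalFusionTransformer or DeepAR do. A minimal sketch of what I mean (MyARModel is hypothetical, the argument names and types are my assumption of what from_dataset passes along, and the actual network/forward logic is omitted):

from typing import Dict, List, Tuple

from pytorch_forecasting.models.base_model import AutoRegressiveBaseModelWithCovariates


class MyARModel(AutoRegressiveBaseModelWithCovariates):
    """Hypothetical concrete model: its __init__ accepts the covariate kwargs that
    BaseModelWithCovariates.from_dataset() derives from the dataset, so they do not
    fall through to BaseModel.__init__()."""

    def __init__(
        self,
        static_categoricals: List[str] = [],
        static_reals: List[str] = [],
        time_varying_categoricals_encoder: List[str] = [],
        time_varying_categoricals_decoder: List[str] = [],
        time_varying_reals_encoder: List[str] = [],
        time_varying_reals_decoder: List[str] = [],
        x_reals: List[str] = [],
        x_categoricals: List[str] = [],
        embedding_sizes: Dict[str, Tuple[int, int]] = {},
        embedding_labels: Dict[str, list] = {},
        embedding_paddings: List[str] = [],
        categorical_groups: Dict[str, List[str]] = {},
        **kwargs,
    ):
        # store all arguments in self.hparams so the rest of the base class can use them
        self.save_hyperparameters()
        super().__init__(**kwargs)
        # network layers, forward(), decoding, etc. would still have to be implemented here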
Am I doing something wrong, or is this a bug in the pytorch-forecasting library?
Thanks in advance.