Why does feeding a packed sequence to an RNN change the sequence length?


My model consists of 3 GRU layers, and the input to the model has the shape [batch_size, sequence_length, features].

Since I am working with variable-length sequences, I pad them all to a fixed maximal length.
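For context, this is roughly what my padding step does (toy sizes; pad_to_max is only an illustration, not my exact preprocessing):

    import torch

    MAX_LEN = 8  # stands in for the dataset-wide maximum (87 in my real data)

    def pad_to_max(seq, max_len=MAX_LEN):
        # seq: (seq_len, features); zero-pad the time dimension up to max_len
        padded = torch.zeros(max_len, seq.size(1))
        padded[:seq.size(0)] = seq
        return padded

    seqs = [torch.randn(5, 3), torch.randn(2, 3)]      # variable-length sequences
    lengths = torch.tensor([s.size(0) for s in seqs])  # their true lengths
    x = torch.stack([pad_to_max(s) for s in seqs])     # shape (2, 8, 3)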

Then I use the following code to feed the input to the GRUs:

    from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

    def forward(self, x, lengths):
        print(f"MaskedGRULayers x shape {x.shape}")
        # Pack so the GRU skips the padded timesteps
        packed_sequence = pack_padded_sequence(x, lengths, batch_first=True, enforce_sorted=False)
        packed_output, hidden = self.gru_layers(packed_sequence)
        # Unpack the output back into a padded tensor
        output, output_lengths = pad_packed_sequence(packed_output, batch_first=True)
        print(f"MaskedGRULayers output shape {output.shape}")
        return output

The prints:

MaskedGRULayers x shape torch.Size([500, 87, 100])
MaskedGRULayers output shape torch.Size([500, 64, 1000])

Can someone explain how the sequence length of 87 turned into 64?
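To narrow it down, I also tried a minimal reproduction without the model, just packing and immediately unpacking a padded batch (toy shapes, made-up lengths):

    import torch
    from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

    x = torch.zeros(3, 10, 5)           # batch padded to length 10
    lengths = torch.tensor([6, 4, 2])   # true lengths; the longest is only 6

    packed = pack_padded_sequence(x, lengths, batch_first=True, enforce_sorted=False)
    out, out_lengths = pad_packed_sequence(packed, batch_first=True)
    print(out.shape)  # torch.Size([3, 6, 5]) -- the time dimension shrank from 10 to 6

The same shrinking happens here, so I suspect the 64 is the longest true length in that particular batch, but I would like to understand why unpacking does not restore the original 87.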

These are the layers I use:

    import torch.nn as nn

    class MaskedGRULayers(nn.Module):
        def __init__(self, input_size, layers_size, dropout=0):
            super(MaskedGRULayers, self).__init__()
            # batch_first=True assumes the data shape is (batch_size, sequence_length, input_size),
            # where input_size is the number of features
            self.gru_layers = nn.GRU(input_size=input_size,
                                     hidden_size=layers_size[0],
                                     dropout=dropout,
                                     num_layers=len(layers_size),
                                     batch_first=True)

where layers_size = [1000, 1000, 1000], i.e. a 3-layer GRU with hidden size 1000.
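For completeness, this is roughly how the module gets called (hypothetical lengths, chosen so none exceed 64):

    import torch

    model = MaskedGRULayers(input_size=100, layers_size=[1000, 1000, 1000])

    x = torch.randn(500, 87, 100)           # batch padded to the dataset max of 87
    lengths = torch.randint(1, 65, (500,))  # made-up true lengths, at most 64
    lengths[0] = 64                         # ensure at least one sequence hits 64
    output = model(x, lengths)              # the prints report (500, 64, 1000)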

Thanks in advance; any help is appreciated.
