I'm trying to delete the element at a given index from a 1-D PyTorch tensor, but every method I've tried so far creates an extra copy.
This makes sense under the hood; the only way I can see the memory address staying the same is if the tensor were implemented like a linked list.
That said, is it possible to delete an element from a PyTorch tensor referentially? That is, can I delete an element such that the tensor with the deleted index and the old tensor share the same memory address?
Methods I've already tried, all of which produced a copy at a new memory address (pseudocode; a runnable check is sketched after this list):
- new_data = torch.cat((data[:i], data[i+1:]))
- new_data = data[torch.LongTensor(indices_to_keep)], where indices_to_keep = [index for index in range(data.shape[0]) if index != i]
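For reference, here is a minimal runnable version of both attempts; `data` and `i` are placeholder names from the pseudocode above, and `data_ptr()` is used to compare the addresses of the underlying storage:

```python
import torch

data = torch.arange(6)  # example tensor: tensor([0, 1, 2, 3, 4, 5])
i = 2                   # index to delete

# Attempt 1: concatenation allocates new storage
new_data = torch.cat((data[:i], data[i + 1:]))

# Attempt 2: advanced (fancy) indexing also allocates new storage
indices_to_keep = [index for index in range(data.shape[0]) if index != i]
new_data2 = data[torch.LongTensor(indices_to_keep)]

# data_ptr() returns the address of the tensor's underlying storage;
# both results live at a different address than the original
print(data.data_ptr() == new_data.data_ptr())   # False
print(data.data_ptr() == new_data2.data_ptr())  # False
```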
As far as I understand, you don't need a special function to get shared memory. Plain assignment never copies a tensor: if you assign the existing tensor to a new variable, both names in that scope refer to the same tensor object and therefore the same underlying storage.
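A quick illustration of that point, using a throwaway tensor `t`:

```python
import torch

t = torch.arange(6)   # tensor([0, 1, 2, 3, 4, 5])
alias = t             # plain assignment: no copy is made

# Both names refer to the same tensor object and the same storage
print(alias is t)                        # True
print(alias.data_ptr() == t.data_ptr())  # True

# Mutating through one name is visible through the other
alias[0] = 99
print(t[0])           # tensor(99)
```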