Can we change an entry of a grad-required tensor more than once?

Since this kind of operation is forbidden in PyTorch, I would like to ask whether we can change an entry of a grad-required tensor more than once. Will this produce a wrong gradient?

I’m not sure if I understand the question entirely, but perhaps you are asking about the Global Data Access Rules: https://github.com/yuanming-hu/taichi/blob/master/docs/differentiable_programming.rst

Yes, that’s what I mean. Suppose I want to modify a global tensor an unknown number of times: sometimes it will be modified twice, sometimes 200 times (like an RNN). What can I do to obtain the correct gradient?
In PyTorch I can just rebind the reference to the tensor, and the old tensor is retained by autograd so the gradient is preserved. I am wondering what the solution is in Taichi.

In PyTorch, if you are running an RNN, rebinding or deleting a tensor reference does not really erase it from memory: autograd still keeps the 200 intermediate tensors alive for the backward pass.
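For instance, here is a minimal PyTorch sketch of that behavior (the state size and update rule are made up): `h` is rebound on every iteration, yet every intermediate tensor stays in the autograd graph until `backward()` runs.

```python
import torch

# Hypothetical RNN-style loop: rebinding `h` does not free the previous
# tensor -- autograd keeps each intermediate alive until backward().
x = torch.zeros(8, requires_grad=True)
h = x
for _ in range(200):
    h = h * 0.9 + 0.1   # creates a new tensor each step; ~200 live tensors
loss = h.sum()
loss.backward()
print(x.grad)           # gradient flows back through all 200 steps
```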

In Taichi you can just allocate a tensor of size [200, X] to do the same thing.
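A minimal sketch of that pattern (field names, sizes, and the update rule are made up; the exact API, e.g. `ti.field`/`ti.Tape` here versus `ti.var` in older releases or `ti.ad.Tape` in newer ones, depends on your Taichi version): each time step writes into its own slot, so no entry is mutated twice, and you only launch as many steps as you actually need.

```python
import taichi as ti

ti.init(arch=ti.cpu)

steps = 200   # maximum possible number of time steps
n = 8         # hypothetical state size

# One row per time step, so every entry is written at most once,
# which satisfies the global data access rules for autodiff.
x = ti.field(dtype=ti.f32, shape=(steps, n), needs_grad=True)
loss = ti.field(dtype=ti.f32, shape=(), needs_grad=True)

@ti.kernel
def step(t: ti.i32):
    for i in range(n):
        # Read from slot t - 1, write to slot t; old slots are never overwritten.
        x[t, i] = x[t - 1, i] * 0.9 + 0.1

@ti.kernel
def compute_loss(t: ti.i32):
    for i in range(n):
        loss[None] += x[t, i]

actual_steps = 150   # the number of steps actually used in this run

with ti.Tape(loss=loss):
    for t in range(1, actual_steps):
        step(t)
    compute_loss(actual_steps - 1)

print(x.grad.to_numpy()[0])   # gradient w.r.t. the initial state
```

The allocation is sized for the worst case (`steps = 200`), but the kernels are only launched `actual_steps` times, so shorter sequences simply leave the trailing rows untouched.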

Yes. What I mean is that PyTorch retains an unknown number of tensors for me; I do not need to specify that number in advance. In Taichi, however, I need to specify a maximum for that number. Is it really necessary to specify it?

I think the easiest way is just to use the maximum possible number.