Is there any data copy between Taichi and PyTorch? Let's assume all data are contiguous and on the same device.
Yes — calling `from_torch`/`to_torch` performs a data copy. If the copy turns out to be time-consuming, you may want to avoid it, but in practice it is rarely the bottleneck for most kernels.
I am planning to use `from_torch`/`to_torch` in a future project, about 5–10 times per forward pass. I think the copy might become an issue. Is there a way to avoid it?
I would say that in most cases where a Taichi kernel is needed, the data copy won't be the bottleneck.