The underlying storage of tensors in PyTorch, dimension transformations, permute/view/reshape, and dimension size and count

Notes on how PyTorch stores tensors in memory, how permute/transpose and view/reshape transform dimensions, and how to query a tensor's dimension count, element count and shape.

The underlying storage of a tensor

A tensor's underlying storage follows row-major order (the last dimension varies fastest). For example:

>>> import torch
>>> a = torch.rand((2, 2, 3))
>>> a
tensor([[[0.1345, 0.4907, 0.8740],
         [0.4888, 0.5481, 0.8513]],

        [[0.1015, 0.9427, 0.8660],
         [0.5832, 0.6661, 0.4127]]])
# The underlying storage is laid out row-major as one contiguous block:
# (0.1345, 0.4907, 0.8740, 0.4888, 0.5481, 0.8513, 0.1015, 0.9427, 0.8660, 0.5832, 0.6661, 0.4127)
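
You can inspect this layout directly. A minimal sketch, continuing with the tensor a above: flatten() walks the buffer in storage order, and stride() reports how many elements a single step along each dimension skips (the printed values assume the example above):

>>> a.flatten()  # same order as the raw storage
tensor([0.1345, 0.4907, 0.8740, 0.4888, 0.5481, 0.8513, 0.1015, 0.9427, 0.8660, 0.5832, 0.6661, 0.4127])
>>> a.stride()  # one step along dim 0 skips 2*3=6 elements, dim 1 skips 3, dim 2 skips 1
(6, 3, 1)
>>> a.is_contiguous()
True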

Transposing / permuting dimensions

permute/transpose

permute rearranges all dimensions in the specified order; for example, reordering the dimensions as 0, 2, 1 (which here transposes the last two dimensions):

>>> a.permute(0, 2, 1)
tensor([[[0.1345, 0.4888],
         [0.4907, 0.5481],
         [0.8740, 0.8513]],

        [[0.1015, 0.5832],
         [0.9427, 0.6661],
         [0.8660, 0.4127]]])
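
Note that permute returns a view over the same storage: only the stride metadata changes, and no data is copied. A quick check (a sketch, using the a from above):

>>> p = a.permute(0, 2, 1)
>>> p.data_ptr() == a.data_ptr()  # same underlying buffer, no copy
True
>>> a.stride(), p.stride()  # strides reordered to match the permutation
((6, 3, 1), (6, 1, 3))
>>> p.is_contiguous()
False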

transpose, by contrast, swaps exactly two dimensions of the tensor, for example dimensions 1 and 2:

>>> a.transpose(1, 2)
tensor([[[0.1345, 0.4888],
         [0.4907, 0.5481],
         [0.8740, 0.8513]],

        [[0.1015, 0.5832],
         [0.9427, 0.6661],
         [0.8660, 0.4127]]])

Transposing tensors with three or more dimensions is less intuitive. To follow the process, track coordinates: the element 0.5832 sits at coordinates (1, 1, 0) in the original tensor; after transposing dimensions 1 and 2, its coordinates become (1, 0, 1).
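
A sketch that verifies this coordinate rule on the example tensor: swapping dimensions 1 and 2 swaps those two coordinates for every element, so a[i, j, k] equals a.transpose(1, 2)[i, k, j], and transpose(1, 2) agrees with permute(0, 2, 1):

>>> t = a.transpose(1, 2)
>>> a[1, 1, 0] == t[1, 0, 1]  # same element at swapped coordinates
tensor(True)
>>> torch.equal(t, a.permute(0, 2, 1))  # swapping two dims == permuting them
True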

view/reshape

The results of reshape and view are identical, but their memory behavior differs: view never allocates new memory, so it only works when the tensor's stride layout is compatible with the requested shape, whereas reshape returns a view when possible and copies the data into new memory when it must.
When view fails on an incompatible tensor, a usable tensor can be regenerated by calling contiguous() first; calling reshape() directly is equivalent to calling contiguous() followed by view(). The contiguous() method resolves the mismatch between a tensor's strides and its dimensions. It is commonly needed after narrow(), expand(), transpose() and permute(), which change only the stride and size metadata without touching the underlying data; without it, view() may raise an error (the sketch after the examples below demonstrates this).

>>> a.view((2, 3, 2))
tensor([[[0.1345, 0.4907],
         [0.8740, 0.4888],
         [0.5481, 0.8513]],

        [[0.1015, 0.9427],
         [0.8660, 0.5832],
         [0.6661, 0.4127]]])

>>> a.reshape((2, 3, 2))
tensor([[[0.1345, 0.4907],
         [0.8740, 0.4888],
         [0.5481, 0.8513]],

        [[0.1015, 0.9427],
         [0.8660, 0.5832],
         [0.6661, 0.4127]]])
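
The difference shows up on non-contiguous tensors. A sketch, assuming the a above: after transpose only the strides change, so view cannot re-split the storage and raises an error, while reshape (or contiguous() followed by view()) makes a contiguous copy first:

>>> t = a.transpose(1, 2)  # non-contiguous: only stride metadata changed
>>> t.view(12)
Traceback (most recent call last):
  ...
RuntimeError: view size is not compatible with input tensor's size and stride ...
>>> q = t.reshape(12)  # reshape copies to a contiguous buffer when it must
>>> torch.equal(q, t.contiguous().view(12))  # reshape == contiguous() + view()
True
>>> q.data_ptr() == t.data_ptr()  # new memory was allocated here
False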

Personal summary:
permute/transpose are transpose operations that move elements relative to the storage order, while view/reshape re-split the flat underlying storage
(0.1345, 0.4907, 0.8740, 0.4888, 0.5481, 0.8513, 0.1015, 0.9427, 0.8660, 0.5832, 0.6661, 0.4127) into new dimensions. That is why permute/transpose and view/reshape give different results in the example above, even though the shapes match.
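
A quick check makes this concrete (a sketch; with the random values above the first comparison prints False, since permute reorders elements relative to the storage while view only re-splits it):

>>> torch.equal(a.permute(0, 2, 1), a.view(2, 3, 2))  # same shape, different element order
False
>>> torch.equal(a.view(2, 3, 2).flatten(), a.flatten())  # view keeps the storage order
True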

Tensor dimension count, element count and shape

The ndimension() method returns the number of dimensions of a tensor, and nelement() returns its total number of elements. To get the size of each dimension, i.e. the tensor's shape, use either tensor.size() or tensor.shape; the difference is that tensor.size() is a method call while tensor.shape is an attribute access. Both return a torch.Size object.

>>> a.ndimension()
3
>>> a.nelement()
12
>>> a.size()
torch.Size([2, 2, 3])
>>> a.shape
torch.Size([2, 2, 3])

