Copy a Tensor in PyTorch



PyTorch is a popular Python library for machine learning, developed by Facebook AI (now Meta AI). It provides robust tools for deep learning, neural networks, and tensor computations.

Below are different approaches to copying a tensor in PyTorch.

  • Using clone() function
  • Using detach() method
  • Using copy.deepcopy() method

Using clone() function

We use the clone() method to create a deep copy of a tensor. The original tensor and the copied tensor do not share memory, so changes made to the copied tensor do not affect the original. Note that clone() is recorded in the computational graph, so gradients can still flow back to the original tensor.

Example

import torch

# Original tensor
original_tensor = torch.tensor([11, 12, 13])

copied_tensor = original_tensor.clone()

# Modify the copy; the original is unaffected
copied_tensor[0] = 20

print("Original Tensor:", original_tensor)
print("Copied Tensor:", copied_tensor)

Output

Original Tensor: tensor([11, 12, 13])
Copied Tensor: tensor([20, 12, 13])

Time Complexity: O(n), where n is the number of elements in the tensor.

Space Complexity: O(n), since a new memory block of the same size is allocated.
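The claim that clone() allocates separate memory can be checked directly with data_ptr(), which returns the address of a tensor's underlying storage. A minimal sketch (the tensor values here are arbitrary):

```python
import torch

t = torch.tensor([1.0, 2.0, 3.0])
c = t.clone()

# clone() allocates fresh storage, so the two tensors never share memory
assert t.data_ptr() != c.data_ptr()

# Writing to the clone leaves the original untouched
c[0] = 99.0
print("Original:", t)
```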

Using detach() method

We use the detach() method to create a new tensor that shares storage with the original tensor but is detached from the computational graph, so no gradients flow through it. This is useful when we need a lightweight, gradient-free view of a tensor for inference or analysis.

Example

import torch

# Tensor with gradients enabled
original_tensor = torch.tensor([11.0, 12.0, 13.0], requires_grad=True)

# Create a lightweight copy using detach()
copied_tensor = original_tensor.detach()

# Modify copied tensor (this also changes the original, since memory is shared)
copied_tensor[0] = 20

print("Original Tensor:", original_tensor)
print("Copied Tensor:", copied_tensor)

Output

Original Tensor: tensor([20., 12., 13.], requires_grad=True)
Copied Tensor: tensor([20., 12., 13.])

Time Complexity: O(1), as no data is copied.

Space Complexity: O(1), as the detached tensor shares the memory of the original tensor.
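Because detach() alone shares memory with the original, a common idiom for a fully independent copy that also drops gradient tracking is to combine it with clone(). A minimal sketch (the tensor values are arbitrary):

```python
import torch

original = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# detach() alone returns a view on the same storage
shared = original.detach()
assert shared.data_ptr() == original.data_ptr()

# detach().clone() gives an independent copy with no autograd history
independent = original.detach().clone()
independent[0] = 99.0
print("Original:", original)  # unaffected by the write above
```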

Using copy.deepcopy() method

We use the copy.deepcopy() method to create a completely independent copy of a tensor. This method is useful for complex data such as models or containers of tensors, since it recursively copies nested structures.

Example

import torch
import copy

# Dictionary of tensors
original_tensor = {
    'a': torch.tensor([11, 12, 13]),
    'b': torch.tensor([14, 15, 16])
}

# Create a deep copy using deepcopy()
copied_tensor = copy.deepcopy(original_tensor)

# Modify the copy; the original dictionary is unaffected
copied_tensor['a'][0] = 20

print("Original Tensor:", original_tensor)
print("Copied Tensor:", copied_tensor)

Output

Original Tensor: {'a': tensor([11, 12, 13]), 'b': tensor([14, 15, 16])}
Copied Tensor: {'a': tensor([20, 12, 13]), 'b': tensor([14, 15, 16])}

Time Complexity: O(n), where n is the number of elements in all tensors.

Space Complexity: O(n), uses a new memory block for each tensor in the nested structure.
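The same approach extends to models, as mentioned above. A minimal sketch using a single linear layer (the layer sizes here are arbitrary, chosen only for illustration):

```python
import copy
import torch
import torch.nn as nn

# A small model; deepcopy recursively copies all of its parameters
model = nn.Linear(3, 2)
model_copy = copy.deepcopy(model)

# Zero out the copy's weights; the original model's weights are untouched
with torch.no_grad():
    model_copy.weight.zero_()

print(torch.equal(model.weight, model_copy.weight))
```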

Updated on: 2024-12-27T15:14:04+05:30
