In PyTorch, element-wise multiplication of tensors is performed using the * operator or the torch.mul() function. Both methods multiply the elements at matching positions in each tensor, which requires the tensors to be the same shape or broadcastable to a common shape.
Here's how you can do element-wise multiplication of two tensors:
import torch

# Create two tensors of the same size
tensor1 = torch.tensor([[1, 2], [3, 4]])
tensor2 = torch.tensor([[5, 6], [7, 8]])

# Element-wise multiplication
result = tensor1 * tensor2
print(result)
Or using the torch.mul() function:
result = torch.mul(tensor1, tensor2)
print(result)
Both of these will output:
tensor([[ 5, 12],
        [21, 32]])
Each element in the resulting tensor is the product of elements in the corresponding position of the input tensors.
PyTorch also supports broadcasting, a feature borrowed from NumPy, which allows both * and torch.mul() to perform element-wise multiplication on tensors of different shapes, as long as they are broadcastable. The smaller tensor is "stretched" to match the shape of the larger tensor before the operation.
For example:
# Create a tensor and a scalar
tensor = torch.tensor([[1, 2], [3, 4]])
scalar = torch.tensor(2)

# Element-wise multiplication with broadcasting
result = tensor * scalar
print(result)
This will output:
tensor([[2, 4],
        [6, 8]])
In this case, the scalar value 2 is broadcast to the shape of tensor before multiplication.
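Broadcasting is not limited to scalars. As a small sketch (using a hypothetical 1-D tensor named row, not from the example above), a tensor of shape (2,) can be multiplied with a (2, 2) tensor; its values are applied to every row:

# A 1-D tensor is broadcast across the rows of the 2-D tensor
tensor = torch.tensor([[1, 2], [3, 4]])
row = torch.tensor([10, 20])   # shape (2,), broadcast to (2, 2)

result = tensor * row
print(result)
# tensor([[10, 40],
#         [30, 80]])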
Be careful with broadcasting: it can lead to unexpected results if you're not keeping track of your tensors' shapes. When the tensors are not the same shape, always make sure their shapes are compatible for broadcasting.
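For instance (a hedged sketch with made-up shapes, not part of the example above), multiplying a tensor of shape (4,) by one of shape (4, 1) silently broadcasts to a (4, 4) result rather than raising an error or returning the (4,) element-wise product you might have expected. torch.broadcast_shapes() is a cheap way to check the resulting shape up front:

# Common pitfall: (4,) * (4, 1) broadcasts to a (4, 4) result
a = torch.tensor([1, 2, 3, 4])          # shape (4,)
b = torch.tensor([[1], [2], [3], [4]])  # shape (4, 1)

print((a * b).shape)                    # torch.Size([4, 4])

# Verify the broadcast shape before multiplying
print(torch.broadcast_shapes(a.shape, b.shape))  # torch.Size([4, 4])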