How to upscale an image in PyTorch?

How do you upscale an image in PyTorch using transforms, without defining the target height and width explicitly? The upscale factor comes from a command-line argument: ('--upscale_factor', type=int, required=True, help="super resolution upscale factor")



Solution 1:[1]

This might do the job:

# transforms.Resize is capitalized; a single int size resizes the smaller edge
transforms.Compose([transforms.Resize(image_size * scaling_factor)])
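A fuller sketch of this idea, assuming a PIL image and an upscale factor of 2 (the file name and variable names here are only illustrative):

from PIL import Image
from torchvision import transforms

upscale_factor = 2
img = Image.open("input.png")          # any PIL image

w, h = img.size                        # PIL reports (width, height)
upscale = transforms.Compose([
    transforms.Resize((h * upscale_factor, w * upscale_factor)),
    transforms.ToTensor(),
])
upscaled = upscale(img)                # tensor of shape (C, h*factor, w*factor)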

Solution 2:[2]

If I understand correctly, you want to upsample a tensor x by just specifying a factor f (instead of specifying the target width and height). You could try this:

from torch.nn.modules.upsampling import Upsample

# x should carry batch and channel dimensions, e.g. (N, C, H, W) for an image
m = Upsample(scale_factor=f, mode='nearest')
x_upsampled = m(x)

Note that Upsample supports several interpolation modes, e.g. mode='nearest' or mode='bilinear'.
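A quick usage sketch that fills in f and x with example values (the concrete shapes below are assumptions, not part of the answer):

import torch
from torch.nn import Upsample

f = 2
x = torch.rand(1, 3, 16, 16)           # (batch, channels, height, width)

m = Upsample(scale_factor=f, mode='nearest')
x_upsampled = m(x)
print(x_upsampled.shape)               # torch.Size([1, 3, 32, 32])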

Solution 3:[3]

Here is one interesting example:

import torch
import torch.nn as nn

input = torch.tensor([[1., 2.], [3., 4.]])
input = input[None]   # add a leading dim -> shape (1, 2, 2)
input = input[None]   # add another -> shape (1, 1, 2, 2), i.e. (N, C, H, W)
output = nn.functional.interpolate(input, scale_factor=2, mode='nearest')
print(output)

Out:

tensor([[[[1., 1., 2., 2.],
          [1., 1., 2., 2.],
          [3., 3., 4., 4.],
          [3., 3., 4., 4.]]]])
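
The same interpolate call also accepts smoother modes; here is a variant of the example above with mode='bilinear' (the align_corners=False setting is one common choice):

output = nn.functional.interpolate(
    input, scale_factor=2, mode='bilinear', align_corners=False
)
print(output)
# tensor([[[[1.0000, 1.2500, 1.7500, 2.0000],
#           [1.5000, 1.7500, 2.2500, 2.5000],
#           [2.5000, 2.7500, 3.2500, 3.5000],
#           [3.0000, 3.2500, 3.7500, 4.0000]]]])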

Solution 4:[4]

You can do:

image_tensor = transforms.functional.resize(
    image_tensor, size=(image_tensor.shape[1] * 2, image_tensor.shape[2] * 2)
)

or read out the dimensions beforehand with color, height, width = image_tensor.size() (for a CHW tensor), as in the sketch below.

See the torchvision documentation on Resize for reference as well.
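
Putting both variants together, a minimal sketch (the 3x64x48 tensor is just an assumed CHW example input):

import torch
from torchvision import transforms

image_tensor = torch.rand(3, 64, 48)   # assumed (C, H, W) example input

# Read out the current dimensions, then double both spatial sizes.
color, height, width = image_tensor.size()
image_tensor = transforms.functional.resize(
    image_tensor, size=(height * 2, width * 2)
)
print(image_tensor.shape)              # torch.Size([3, 128, 96])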

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution     Source
Solution 1   Khagendra
Solution 2
Solution 3   prosti
Solution 4