Can we use one optimizer for a GAN model?

I have seen lots of GAN tutorials, and all of them use two separate optimizers, one for the Generator and one for the Discriminator. Their code looks like this:

import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(100, 784)  # placeholder layer so the snippet runs

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(784, 1)  # placeholder layer so the snippet runs

    def forward(self, x):
        return self.net(x)

G = Generator()
D = Discriminator()

# one optimizer per network
optimizerG = torch.optim.Adam(G.parameters())
optimizerD = torch.optim.Adam(D.parameters())
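
Those tutorials then drive the two optimizers in an alternating loop, roughly like this (my own minimal sketch; the BCE loss, dataloader, and latent_dim are stand-ins I am assuming, not taken from any one tutorial):

criterion = nn.BCEWithLogitsLoss()
latent_dim = 100  # assumed noise size, matching the placeholder Generator above

for real in dataloader:  # `dataloader` is assumed to yield batches of real samples
    batch_size = real.size(0)
    ones = torch.ones(batch_size, 1)
    zeros = torch.zeros(batch_size, 1)
    z = torch.randn(batch_size, latent_dim)

    # Discriminator update: detach() keeps this backward pass out of G
    optimizerD.zero_grad()
    lossD = criterion(D(real), ones) + criterion(D(G(z).detach()), zeros)
    lossD.backward()
    optimizerD.step()  # updates only D.parameters()

    # Generator update
    optimizerG.zero_grad()
    lossG = criterion(D(G(z)), ones)
    lossG.backward()   # this also fills D's .grad buffers...
    optimizerG.step()  # ...but step() updates only G.parameters()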

But can we combine those two optimizers into one, as shown below? Is there any downside?

class GAN(nn.Module):
    def __init__(self):
        super().__init__()
        self.G = Generator()
        self.D = Discriminator()

    def forward(self, x):
        # not used directly; G and D are called separately during training
        pass

model = GAN()
optimizer = torch.optim.Adam(model.parameters())  # one optimizer for both networks
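
What worries me is the generator step: lossG.backward() also writes gradients into D, and a combined optimizer.step() would apply them to D as well. As far as I can tell, the single-optimizer version would need extra bookkeeping like this to behave the same (again my own sketch, reusing criterion, dataloader, and latent_dim from the loop above):

for real in dataloader:
    batch_size = real.size(0)
    z = torch.randn(batch_size, latent_dim)

    # Discriminator phase: detach() keeps gradients out of G, so with
    # set_to_none=True only D ends up with .grad tensors after backward()
    optimizer.zero_grad(set_to_none=True)
    lossD = criterion(model.D(real), torch.ones(batch_size, 1)) \
          + criterion(model.D(model.G(z).detach()), torch.zeros(batch_size, 1))
    lossD.backward()
    optimizer.step()  # skips G's parameters: their grads are None

    # Generator phase: backward() fills grads for both G and D
    optimizer.zero_grad(set_to_none=True)
    lossG = criterion(model.D(model.G(z)), torch.ones(batch_size, 1))
    lossG.backward()
    for p in model.D.parameters():
        p.grad = None  # drop D's leaked gradients so step() skips D
    optimizer.step()

If my sketch is right, the combined optimizer saves nothing and adds bookkeeping, and it also forces G and D to share a single learning rate and Adam hyper-parameters, which the two-optimizer version avoids. Is that the whole story, or is there more to it?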

