| Layers | Architecture of generator | Output size |
|--------|---------------------------|-------------|
| Input  | | (256 × 256 × 1) |
| D1 | Conv. (4 × 4, 64), LReLU | (128 × 128 × 64) |
| D2 | Conv. (4 × 4, 128), SwitchNorm, LReLU | (64 × 64 × 128) |
| D3 | Conv. (4 × 4, 256), SwitchNorm, LReLU | (32 × 32 × 256) |
| D4 | Conv. (4 × 4, 512), SwitchNorm, LReLU, dropout | (16 × 16 × 512) |
| D5 | Conv. (4 × 4, 512), SwitchNorm, LReLU, dropout | (8 × 8 × 512) |
| D6 | Conv. (4 × 4, 512), SwitchNorm, LReLU, dropout | (4 × 4 × 512) |
| D7 | Conv. (4 × 4, 512), SwitchNorm, LReLU, dropout | (2 × 2 × 512) |
| D8 | Conv. (4 × 4, 512), LReLU, dropout | (1 × 1 × 512) |
| U1 | Concatenate (D8, D7), DeConv. (4 × 4, 512), SwitchNorm, ReLU, dropout | (2 × 2 × 512) |
| U2 | Concatenate (U1, D6), DeConv. (4 × 4, 512), SwitchNorm, ReLU, dropout | (4 × 4 × 512) |
| U3 | Concatenate (U2, D5), DeConv. (4 × 4, 512), SwitchNorm, ReLU, dropout | (8 × 8 × 512) |
| U4 | Concatenate (U3, D4), DeConv. (4 × 4, 512), SwitchNorm, ReLU, dropout | (16 × 16 × 512) |
| U5 | Concatenate (U4, D3), DeConv. (4 × 4, 256), SwitchNorm, ReLU | (32 × 32 × 256) |
| U6 | Concatenate (U5, D2), DeConv. (4 × 4, 128), SwitchNorm, ReLU | (64 × 64 × 128) |
| U7 | Concatenate (U6, D1), DeConv. (4 × 4, 64), SwitchNorm, ReLU | (128 × 128 × 64) |
| Final | Upsample (4 × 4, 1), ZeroPad, Conv. (4 × 4, 1), tanh | (256 × 256 × 1) |
| Output | | (256 × 256 × 1) |
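The table can be sketched as a runnable PyTorch module. This is a hedged reconstruction, not the authors' code: it assumes stride-2 4 × 4 (de)convolutions for the halving/doubling of spatial size, LeakyReLU slope 0.2, dropout p = 0.5, and uses `nn.BatchNorm2d` as a stand-in for SwitchNorm (which `torch.nn` does not provide). Each skip concatenation is folded into the input of the following up-block, which realizes the same skip connections listed in the table.

```python
import torch
import torch.nn as nn

def down(cin, cout, norm=True, drop=False):
    """Encoder block: 4x4 stride-2 conv, optional norm, LReLU, optional dropout."""
    layers = [nn.Conv2d(cin, cout, 4, stride=2, padding=1)]
    if norm:
        layers.append(nn.BatchNorm2d(cout))  # stand-in for SwitchNorm (assumption)
    layers.append(nn.LeakyReLU(0.2))
    if drop:
        layers.append(nn.Dropout(0.5))       # p = 0.5 is an assumption
    return nn.Sequential(*layers)

def up(cin, cout, drop=False):
    """Decoder block: 4x4 stride-2 deconv, norm, ReLU, optional dropout."""
    layers = [nn.ConvTranspose2d(cin, cout, 4, stride=2, padding=1),
              nn.BatchNorm2d(cout),          # stand-in for SwitchNorm (assumption)
              nn.ReLU()]
    if drop:
        layers.append(nn.Dropout(0.5))
    return nn.Sequential(*layers)

class UNetGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder D1-D8 (output spatial size in comments, 256x256x1 input).
        self.d1 = down(1, 64, norm=False)               # 128 x 128 x 64
        self.d2 = down(64, 128)                         # 64 x 64 x 128
        self.d3 = down(128, 256)                        # 32 x 32 x 256
        self.d4 = down(256, 512, drop=True)             # 16 x 16 x 512
        self.d5 = down(512, 512, drop=True)             # 8 x 8 x 512
        self.d6 = down(512, 512, drop=True)             # 4 x 4 x 512
        self.d7 = down(512, 512, drop=True)             # 2 x 2 x 512
        self.d8 = down(512, 512, norm=False, drop=True) # 1 x 1 x 512
        # Decoder U1-U7; input channels double where a skip is concatenated.
        self.u1 = up(512, 512, drop=True)               # 2 x 2 x 512
        self.u2 = up(1024, 512, drop=True)              # 4 x 4 x 512
        self.u3 = up(1024, 512, drop=True)              # 8 x 8 x 512
        self.u4 = up(1024, 512, drop=True)              # 16 x 16 x 512
        self.u5 = up(1024, 256)                         # 32 x 32 x 256
        self.u6 = up(512, 128)                          # 64 x 64 x 128
        self.u7 = up(256, 64)                           # 128 x 128 x 64
        # Final: upsample, zero-pad, 4x4 conv to 1 channel, tanh -> 256 x 256 x 1.
        self.final = nn.Sequential(
            nn.Upsample(scale_factor=2),
            nn.ZeroPad2d((1, 0, 1, 0)),
            nn.Conv2d(128, 1, 4, padding=1),
            nn.Tanh())

    def forward(self, x):
        d1 = self.d1(x); d2 = self.d2(d1); d3 = self.d3(d2); d4 = self.d4(d3)
        d5 = self.d5(d4); d6 = self.d6(d5); d7 = self.d7(d6); d8 = self.d8(d7)
        u1 = self.u1(d8)                            # D7 skip joins at next block
        u2 = self.u2(torch.cat([u1, d7], dim=1))
        u3 = self.u3(torch.cat([u2, d6], dim=1))
        u4 = self.u4(torch.cat([u3, d5], dim=1))
        u5 = self.u5(torch.cat([u4, d4], dim=1))
        u6 = self.u6(torch.cat([u5, d3], dim=1))
        u7 = self.u7(torch.cat([u6, d2], dim=1))
        return self.final(torch.cat([u7, d1], dim=1))
```

A forward pass on a (1, 1, 256, 256) tensor returns a (1, 1, 256, 256) tensor in [-1, 1], matching the Input and Output rows of the table.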