How to insert dropout layers after activation layers in a pre-trained non-sequential model using the functional Keras API?

I am working on a modified ResNet and want to insert dropout after the activation layers. I tried the following, but because ResNet has skip connections and is not a linear stack of layers, it did not work:

import tensorflow as tf

def add_dropouts(model, probability=0.5):
    print("Adding Dropouts")

    updated_model = tf.keras.models.Sequential()
    for layer in model.layers:
        print("layer = ", layer)
        updated_model.add(layer)
        if isinstance(layer, tf.keras.layers.Activation):
            updated_model.add(tf.keras.layers.Dropout(probability))

    updated_model.summary()
    model.summary()

    return updated_model


base_model = tf.keras.applications.ResNet50V2(include_top=False, input_shape=input_img_shape, pooling='avg')

base_model = add_dropouts(base_model, probability = 0.5)

Then I tried my own version using the functional API, but this doesn't work either; it raises a ValueError saying the tensor doesn't have an output:

prev_layer = base_model.layers[0]
for layer in base_model.layers:
    # fails here: after the first iteration prev_layer is a tensor,
    # not a layer, so it has no .output attribute
    next_layer = layer(prev_layer.output)
    if isinstance(layer, tf.keras.layers.Activation):
        next_layer = tf.keras.layers.Dropout(0.5)(next_layer)
    prev_layer = next_layer

Does anyone know how to add dropout layers into ResNet or any other pretrained network?



Solution 1:[1]

So eventually I figured out how to do it, but it's very hacky. Go to:

C:\ProgramData\Anaconda3\envs\<your env name>\Lib\site-packages\tensorflow\python\keras\applications

Open resnet.py. (This also changes the ResNetV2 variants, because they are built on the original resnet.py.) Just Ctrl+F for 'activation', and wherever you see an activation layer (usually in the format x = Layer(...)(x), building the model one layer at a time), add x = layers.Dropout(prob)(x) right after it. Here is an example:


  if not preact:
    x = layers.BatchNormalization(
        axis=bn_axis, epsilon=1.001e-5, name='conv1_bn')(x)
    x = layers.Activation('relu', name='conv1_relu')(x)  # insert after each activation like this:
    x = layers.Dropout(prob)(x)  # added dropout

Do this for every match on 'activation'. Note that prob is not defined in the stock resnet.py, so either hardcode a value (e.g. 0.5) or thread a parameter through yourself; a second example inside a residual block is sketched below.
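For instance, inside a residual block the same edit would look roughly like this (layer names taken from the block1 function of TF 2.x's resnet.py; exact lines may differ across versions, so treat this as a sketch, not the literal file contents):

  x = layers.Conv2D(filters, 1, strides=stride, name=name + '_1_conv')(x)
  x = layers.BatchNormalization(
      axis=bn_axis, epsilon=1.001e-5, name=name + '_1_bn')(x)
  x = layers.Activation('relu', name=name + '_1_relu')(x)
  x = layers.Dropout(prob)(x)  # added dropout after the block's first activation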

Then you will see the dropout layers in your model summary.
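A minimal way to verify the patch (assuming you saved resnet.py and restarted Python so the patched module is re-imported) is to rebuild the model and count the Dropout layers:

import tensorflow as tf

# Rebuild the model from the patched resnet.py
base_model = tf.keras.applications.ResNet50V2(
    include_top=False, input_shape=(224, 224, 3), pooling='avg')

# Count the inserted dropout layers
dropouts = [l.name for l in base_model.layers
            if isinstance(l, tf.keras.layers.Dropout)]
print(len(dropouts), "dropout layers inserted")

base_model.summary()  # the Dropout layers now appear after each Activation

Keep in mind that layers added this way are only active during training; at inference time Keras disables dropout automatically.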

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

[1] Solution 1: beelzmon