model.predict yields more predictions than the number of test samples

I've created a multi-class image classifier using a CNN in Keras, and I use generators both to fit the model and to predict on 4 different classes of images. My test_generator has 394 examples (all four classes combined), but model.predict yields predictions of shape (6304, 4).

Here's the model summary:

Model: "sequential_2"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 IP (Conv2D)                 (None, 64, 64, 32)        320       
                                                                 
 Convolution0 (Conv2D)       (None, 64, 64, 64)        18496     
                                                                 
 PL0 (MaxPooling2D)          (None, 32, 32, 64)        0         
                                                                 
 Convolution1 (Conv2D)       (None, 32, 32, 128)       73856     
                                                                 
 PL1 (MaxPooling2D)          (None, 16, 16, 128)       0         
                                                                 
 Convolution2 (Conv2D)       (None, 16, 16, 256)       295168    
                                                                 
 PL2 (MaxPooling2D)          (None, 8, 8, 256)         0         
                                                                 
 FL (Flatten)                (None, 16384)             0         
                                                                 
 FC (Dense)                  (None, 128)               2097280   
                                                                 
 OP (Dense)                  (None, 4)                 516       
                                                                 
=================================================================
Total params: 2,485,636
Trainable params: 2,485,636
Non-trainable params: 0
_________________________________________________________________
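For reference, the summary above is consistent with a model along the following lines. This is a reconstruction, not the asker's actual code: the 3×3 kernels, 'same' padding, ReLU/softmax activations, and a 64×64×1 (grayscale) input are all inferred from the parameter counts (e.g. the first layer's 320 params = 3·3·1·32 weights + 32 biases).

```python
import tensorflow as tf
from tensorflow.keras import layers

# Reconstruction of the posted summary; kernel sizes, padding, input
# channels, and activations are inferred from the parameter counts.
model = tf.keras.Sequential([
    layers.Conv2D(32, 3, padding='same', activation='relu',
                  input_shape=(64, 64, 1), name='IP'),
    layers.Conv2D(64, 3, padding='same', activation='relu', name='Convolution0'),
    layers.MaxPooling2D(name='PL0'),
    layers.Conv2D(128, 3, padding='same', activation='relu', name='Convolution1'),
    layers.MaxPooling2D(name='PL1'),
    layers.Conv2D(256, 3, padding='same', activation='relu', name='Convolution2'),
    layers.MaxPooling2D(name='PL2'),
    layers.Flatten(name='FL'),                      # 8 * 8 * 256 = 16384
    layers.Dense(128, activation='relu', name='FC'),
    layers.Dense(4, activation='softmax', name='OP'),
])
print(model.count_params())  # 2485636, matching "Total params" above
```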

Here's how I created the test_generator: test_generator = core_imageDataGenerator(test_directory), and the result of len(test_generator.classes) is 394.

Here's how I made the predictions: predictions = model.predict(test_generator). The result of predictions.shape is (6304, 4), not (394, 4). What could be the reason for this? Am I doing something wrong? What are my options here? My next step is to create a classification report with a variety of metrics.
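For the classification-report step: once the prediction count matches the 394 samples, the metrics reduce to an argmax plus counting (sklearn.metrics.classification_report automates this). A minimal NumPy sketch, with random placeholder arrays standing in for model.predict(test_generator) and test_generator.classes:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_classes = 394, 4  # sizes taken from the question

# Placeholders: substitute model.predict(test_generator) and
# test_generator.classes from your own pipeline here.
predictions = rng.random((n_samples, n_classes))
true_classes = rng.integers(0, n_classes, size=n_samples)

pred_classes = predictions.argmax(axis=1)  # winning class per row

# Confusion matrix: rows = true class, columns = predicted class.
confusion = np.zeros((n_classes, n_classes), dtype=int)
for t, p in zip(true_classes, pred_classes):
    confusion[t, p] += 1

accuracy = np.trace(confusion) / n_samples  # diagonal = correct predictions
print(confusion)
print(f"accuracy: {accuracy:.3f}")
```

Note that this only lines up row-for-row if the test generator does not shuffle; otherwise predictions and test_generator.classes are in different orders.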



Solution 1:[1]

Look first at the input shape and the layer stack: the Flatten layer followed by the Dense layers is what fixes the shape of the prediction matrix.

The parameter counts of the convolution and dense layers follow directly from the filter shapes, e.g. 320 -> 18,496 (64 filters × (3·3·32 + 1) weights each), while the MaxPooling layers add no parameters.

The useful clue is the ratio between the two shapes, (6304, 4) and (394, 4): 6304 / 394 = 16, so each of your 394 test samples is producing 16 prediction rows. That typically means the tensors your generator yields carry an extra dimension that model.predict folds into the batch axis. You do not need many Dense layers to reproduce the effect; the small model below is enough.

[ Sample ]:

import tensorflow as tf

# Function: run prediction on a dataset
def predict_action(dataset):
    predictions = model.predict(dataset)
    return predictions

# DataSet: note the extra leading dimensions in the shapes below.
# from_tensor_slices() strips the first one, so the dataset holds a single
# element of shape (1, 5, 2), i.e. one batch containing one (5, 2) sample.
inputs = tf.constant(
    tf.cast(tf.random.uniform(shape=[5, 2], minval=5, maxval=10,
                              dtype=tf.int64), tf.float32),
    shape=(1, 1, 5, 2))
label = tf.constant([0.0], shape=(1, 1, 1))
dataset = tf.data.Dataset.from_tensor_slices((inputs, label))

# Model Initialize (16 units in the middle Dense layer, matching the
# printed summary below; the original post showed 128 here by mistake)
model = tf.keras.models.Sequential([
    tf.keras.layers.InputLayer(input_shape=(5, 2)),
    tf.keras.layers.Dense(4),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(16),
    tf.keras.layers.Dense(4),
])
model.summary()

# Optimizer
optimizer = tf.keras.optimizers.Nadam(
    learning_rate=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-07,
    name='Nadam')

# Loss Fn
lossfn = tf.keras.losses.MeanSquaredLogarithmicError(
    reduction=tf.keras.losses.Reduction.AUTO,
    name='mean_squared_logarithmic_error')

# Compile
model.compile(optimizer=optimizer, loss=lossfn, metrics=['accuracy'])

# Training and prediction: one (1, 5, 2) batch goes in, so
# predict_action() returns exactly one (1, 4) row
history = model.fit(dataset, epochs=1, validation_data=dataset)
print(predict_action(dataset))

[ Output ]:

 Layer (type)                Output Shape              Param #
=================================================================
 dense (Dense)               (None, 5, 4)              12

 flatten (Flatten)           (None, 20)                0

 dense_1 (Dense)             (None, 16)                336

 dense_2 (Dense)             (None, 4)                 68

=================================================================
Total params: 416
Trainable params: 416
Non-trainable params: 0
_________________________________________________________________
1/1 [==============================] - 1s 1s/step - loss: 0.2335 - accuracy: 0.0000e+00 - val_loss: 0.2291 - val_accuracy: 0.0000e+00
[[ 0.07219216 -2.9527428   1.5981569  -5.590222  ]]
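To mirror the question's numbers at toy scale (made-up shapes: 3 samples, each carrying a stray length-16 dimension, just as 394 × 16 = 6304): when a tf.data pipeline hands model.predict un-batched elements with an extra dimension, Keras reads that dimension as the batch axis, and the row count multiplies.

```python
import numpy as np
import tensorflow as tf

# 3 "samples" that each secretly carry an extra length-16 dimension
data = np.random.rand(3, 16, 8).astype("float32")
ds = tf.data.Dataset.from_tensor_slices(data)  # 3 elements, each (16, 8)

# The model expects flat (8,) samples, so each (16, 8) element is
# interpreted as a batch of 16 samples rather than one sample.
model = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(8,))])

preds = model.predict(ds)
print(preds.shape)  # (48, 4): 3 x 16 rows, not 3
```

So check the shape of a single element yielded by your test_generator; if it is not (batch_size, 64, 64, channels), that extra factor of 16 is the culprit.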

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Martijn Pieters