I'm writing my first CNN using a dataset of my own making consisting of 1400 training images and 600 testing images. Each image has a corresponding label. The softmax classifier should give a binary output of 0 or 1.

In the very first epoch of training the results are: loss: 386298720.0000 - accuracy: 0.9764 - val_loss: 0.0000e+00 - val_accuracy: 1.0000. And the succeeding epochs have loss: 0.0000e+00 - accuracy: 1.0000 - val_loss: 0.0000e+00 - val_accuracy: 1.0000.

These are obviously erroneous results. What could be causing this?

I've tried shuffling my dataset, but it didn't make a difference. I also changed the labels from the format [1,0] or [0,1] to simply [1] or [0]; in that case the loss was still 0 and validation accuracy was still 1. Changing the number of epochs doesn't make a difference, nor does changing the learning rate.
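For clarity, here is what the two label formats above look like; to_categorical (from tensorflow.keras.utils, assuming that is the Keras in use) produces the one-hot [1,0]/[0,1] form from plain integer labels:

from tensorflow.keras.utils import to_categorical
import numpy as np

labels = np.array([0, 1, 1, 0])   #the plain [0] / [1] format
one_hot = to_categorical(labels)  #the [1,0] / [0,1] one-hot format
print(one_hot)
#[[1. 0.]
# [0. 1.]
# [0. 1.]
# [1. 0.]]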

Here is my model:

#BUILDING THE MODEL

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense
from tensorflow.keras.utils import to_categorical

num_filters = 8
filter_size = 3
pool_size = 2

model = Sequential([
    #layers

    Conv2D(num_filters, filter_size, input_shape=(724,150,1)), #8 3x3 filters over 724x150 single-channel input
    Conv2D(num_filters, filter_size), #second convolutional layer
    MaxPooling2D(pool_size=pool_size), #downsamples 2d spatial data
    Dropout(0.5), #helps prevent overfitting
    Flatten(), #connects convolutional and dense layers
    Dense(64, activation='relu'), #fully connected layer
    Dense(2, activation='softmax'), #2-class output
])

#COMPILING THE MODEL
#need optimizer, loss function, metrics

model.compile(
    'adam',
    loss='categorical_crossentropy',
    metrics=['accuracy'],
)

model.fit(
    train,
    to_categorical(train_labels),
    epochs=10,
    validation_data=(test, to_categorical(test_labels)),
    )

There is 1 answer below.

Answer from Daniel Byrne:

Make extra sure you are not leaking your test data into your training data. Also, I think you only need one output, Dense(1, activation='sigmoid'), since you are looking for a binary answer.
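For the leakage point, one quick sanity check (a rough sketch, assuming train and test are NumPy arrays as passed to fit() in the question) is to hash every image and look for overlap between the two sets:

import numpy as np

#hash the raw bytes of each image and check whether any image appears in both sets
train_hashes = {hash(img.tobytes()) for img in train}
test_hashes = {hash(img.tobytes()) for img in test}
print(len(train_hashes & test_hashes), "images appear in both train and test")

For the single-output suggestion, a minimal sketch of the change: the final layer becomes Dense(1, activation='sigmoid'), the loss becomes binary_crossentropy to match, and the labels are passed as plain 0/1 integers without to_categorical. Everything else is kept as in the question's model:

model = Sequential([
    Conv2D(num_filters, filter_size, input_shape=(724,150,1)),
    Conv2D(num_filters, filter_size),
    MaxPooling2D(pool_size=pool_size),
    Dropout(0.5),
    Flatten(),
    Dense(64, activation='relu'),
    Dense(1, activation='sigmoid'), #single probability for the positive class
])

model.compile(
    optimizer='adam',
    loss='binary_crossentropy', #matches the single sigmoid output
    metrics=['accuracy'],
)

model.fit(
    train,
    train_labels, #plain 0/1 labels, no to_categorical
    epochs=10,
    validation_data=(test, test_labels),
)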