I was training a network and decided to add more data for training, but when I train my net with this new data, I get this error:
InvalidArgumentError Traceback (most recent call last)
File c:\Users\YATO.DESKTOP-PCAG66Q\Desktop\New folder (3)\braindl\train.py:84
81 print(model.input_shape)
82 print(model.output_shape)
---> 84 history=model.fit(train_img_datagen,
85 steps_per_epoch=steps_per_epoch,
86 epochs=100,
87 verbose=1,
88 validation_data=val_img_datagen,
89 validation_steps=val_steps_per_epoch,
90 )
92 model.save('tbrainseg_3d.hdf5')
File c:\Users\YATO.DESKTOP-PCAG66Q\anaconda3\Lib\site-packages\keras\src\utils\traceback_utils.py:70, in filter_traceback.<locals>.error_handler(*args, **kwargs)
67 filtered_tb = _process_traceback_frames(e.__traceback__)
68 # To get the full stack trace, call:
69 # tf.debugging.disable_traceback_filtering()
---> 70 raise e.with_traceback(filtered_tb) from None
71 finally:
72 del filtered_tb
File c:\Users\YATO.DESKTOP-PCAG66Q\anaconda3\Lib\site-packages\tensorflow\python\eager\execute.py:53, in quick_execute(op_name, num_outputs, inputs, attrs, ctx, name)
51 try:
52 ctx.ensure_initialized()
...
TypeError: generator yielded an element of shape (0,) where an element of shape (None, None, None, None, None) was expected.
[[{{node PyFunc}}]]
[[IteratorGetNext]] [Op:__inference_train_function_7443]
My data generator (my batch size is 2):
import os
import numpy as np

def load_img(img_dir, img_list):
    images = []
    for image_name in img_list:
        # Only load .npy files; anything else in the list is skipped
        if image_name.split('.')[1] == 'npy':
            image = np.load(os.path.join(img_dir, image_name))
            images.append(image)
    images = np.array(images)
    return images

def imageLoader(img_dir, img_list, mask_dir, mask_list, batch_size):
    L = len(img_list)
    # Keras needs the generator to be infinite, so we use while True
    while True:
        batch_start = 0
        batch_end = batch_size
        while batch_start < L:
            limit = min(batch_end, L)
            X = load_img(img_dir, img_list[batch_start:limit])
            Y = load_img(mask_dir, mask_list[batch_start:limit])
            yield (X, Y)  # a tuple of two numpy arrays with batch_size samples
            batch_start += batch_size
            batch_end += batch_size
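
For reference, this is roughly how the generator is wired into model.fit (a minimal sketch; train_img_dir, train_img_list, val_img_dir, val_img_list and the mask counterparts are placeholders for my actual variables):

batch_size = 2

# Hypothetical wiring; directory and file-list names are placeholders
train_img_datagen = imageLoader(train_img_dir, train_img_list,
                                train_mask_dir, train_mask_list, batch_size)
val_img_datagen = imageLoader(val_img_dir, val_img_list,
                              val_mask_dir, val_mask_list, batch_size)

steps_per_epoch = len(train_img_list) // batch_size
val_steps_per_epoch = len(val_img_list) // batch_size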
I should mention that before I added the new data, training worked fine.
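
To narrow it down, here is a quick sanity check of what the generator actually yields (a minimal sketch, using the same placeholder names as above):

# Pull every batch of one epoch from the generator and print its shape;
# the failure shows up as a batch of shape (0,) instead of 5-D
gen = imageLoader(train_img_dir, train_img_list,
                  train_mask_dir, train_mask_list, batch_size)
for step in range(len(train_img_list) // batch_size + 1):
    X, Y = next(gen)
    print(step, X.shape, Y.shape)

# List the files whose names fail load_img's extension check
# (split('.')[1] == 'npy'); a slice made up entirely of such files
# produces an empty batch of shape (0,), matching the error above.
# Note that 'image.nii.npy' fails too, since split('.')[1] is 'nii'.
skipped = [f for f in train_img_list
           if len(f.split('.')) < 2 or f.split('.')[1] != 'npy']
print('files load_img would skip:', skipped)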