I have a neural network that I want to evaluate, and a set of numerical values that I build like this:
import math

l = 100                                           # number of sample points
t = [(i - 1) / (l - 1) for i in range(1, l + 1)]  # evenly spaced parameter values in [0, 1]
bx = normalize([4 * math.pi * i * math.cos(4 * math.pi * i) for i in t])  # normalize() rescales the list (helper not shown)
by = normalize([4 * math.pi * i * math.sin(4 * math.pi * i) for i in t])
train = [[bx[i], by[i]] for i in range(len(bx))]  # target (x, y) point for each parameter value
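For reference, this is what the shapes look like when I turn those lists into arrays (quick check, assuming numpy is imported as np):

import numpy as np

print(np.array(t).shape)      # (100,) - one scalar parameter per point
print(np.array(train).shape)  # (100, 2) - the (x, y) coordinates for each point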
The problem is that the evaluation function only works sometimes. If I call it directly with results = loaded_model.evaluate(t, train), it returns the following error:
Detected at node 'Reshape' defined at (most recent call last):
Node: 'Reshape'
Input to reshape is a tensor with 512 values, but the requested shape has 16
[[{{node Reshape}}]] [Op:__inference_test_function_405174]
If, however, I evaluate just a single data point wrapped in a list, results = loaded_model.evaluate([t[0]], [train[0]]), it works perfectly fine.
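I assume that works because the single wrapped value ends up as a length-one array (checked with the same np as above):

print(np.array([t[0]]).shape)  # (1,) - a single sample, which the loaded model accepts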
But whenever I try to evaluate more than one point, even wrapped in a list (results = loaded_model.evaluate([t[0:3]], [train[0:3]]) or even results = loaded_model.evaluate([t], [train])), it again returns an error:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[57], line 31
29 print(t[0])
30 #pred = loaded_model.predict([t[]])
---> 31 results = loaded_model.evaluate([t], [train])
33 print(results)
File ~\anaconda3\envs\CVISEnv\lib\site-packages\keras\src\utils\traceback_utils.py:70, in error_handler(*args, **kwargs)
File ~\AppData\Local\Temp\__autograph_generated_filexnmit6hj.py:15, in outer_factory.<locals>.inner_factory.<locals>.tf__test_function(iterator)
13 try:
14 do_return = True
---> 15 retval_ = ag__.converted_call(ag__.ld(step_function), (ag__.ld(self), ag__.ld(iterator)), None, fscope)
16 except:
17 do_return = False
ValueError: in user code:
File "C:\Users\Portatil\anaconda3\envs\CVISEnv\lib\site-packages\keras\src\engine\training.py", line 1972, in test_function *
return step_function(self, iterator)
File "C:\Users\Portatil\anaconda3\envs\CVISEnv\lib\site-packages\keras\src\engine\training.py", line 1956, in step_function **
outputs = model.distribute_strategy.run(run_step, args=(data,))
File "C:\Users\Portatil\anaconda3\envs\CVISEnv\lib\site-packages\keras\src\engine\training.py", line 1944, in run_step **
outputs = model.test_step(data)
File "C:\Users\Portatil\anaconda3\envs\CVISEnv\lib\site-packages\keras\src\engine\training.py", line 1850, in test_step
y_pred = self(x, training=False)
File "C:\Users\Portatil\anaconda3\envs\CVISEnv\lib\site-packages\keras\src\utils\traceback_utils.py", line 70, in error_handler
raise e.with_traceback(filtered_tb) from None
ValueError: Exception encountered when calling layer 'model_284' (type Functional).
Could not find matching concrete function to call loaded from the SavedModel. Got:
Positional arguments (1 total):
* <tf.Tensor 'inputs:0' shape=(None, 100) dtype=float32>
Keyword arguments: {}
Expected these arguments to match one of the following 1 option(s):
Option 1:
Positional arguments (1 total):
* TensorSpec(shape=(None, 1), dtype=tf.float32, name='inputs')
Keyword arguments: {}
Call arguments received by layer 'model_284' (type Functional):
• inputs=tf.Tensor(shape=(None, 100), dtype=float32)
• training=False
• mask=None
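So, if I read the traceback correctly, the loaded model only accepts input of shape (None, 1), while my multi-point calls produce differently shaped arrays (same kind of shape check as above):

print(np.array([t[0:3]]).shape)  # (1, 3)   - one "sample" with three features
print(np.array([t]).shape)       # (1, 100) - matches the (None, 100) in the traceback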
I'm honestly stumped. One alternative I have come up with is predicting each data point individually and then computing the error metrics myself, but I'd rather use the library function, since it will likely be more efficient.
Thanks
As I said, I tinkered a bit with the shape of the input values. For now I'm evaluating point by point, but it feels a bit pointless to compute the RMSE myself when TensorFlow already provides this through evaluate.
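This is roughly what my point-by-point workaround looks like (just a sketch of what I mean, assuming numpy and the loaded_model from above, and that the model outputs one (x, y) pair per input value):

import numpy as np

preds = []
for ti in t:
    # predict one parameter value at a time, shaped (1, 1) to match the expected (None, 1) input
    p = loaded_model.predict(np.array([[ti]], dtype=np.float32), verbose=0)
    preds.append(p[0])

preds = np.array(preds)                      # (100, 2) predicted points
targets = np.array(train, dtype=np.float32)  # (100, 2) target points
rmse = np.sqrt(np.mean((preds - targets) ** 2))
print(rmse)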