Multi-task Neural Network Model


I'm working on a neural network that handles multiple tasks at once, built with the Keras Functional API. I'm trying to make it predict two outputs simultaneously (my code is below).

Although it trains and runs, its accuracy is not noticeably better than that of a regular single-output model.

I've tried tuning the number of neurons, the number of layers, and other hyperparameters with Keras Tuner, but no luck so far.

I'm thinking I may need a custom loss function that couples the two outputs, so that the two tasks can actually inform each other (a rough sketch of what I mean is at the end of this post).

Any tips on that? Or any other ideas for making the multi-task network outperform the single-output model? Thanks!

import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import tensorflow as tf
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Dropout, Subtract
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.metrics import Precision, Recall
from tensorflow.keras import regularizers
from sklearn.metrics import (r2_score, mean_absolute_error,
                             mean_squared_error, confusion_matrix)

# Shared input layer (input_data is the feature matrix, defined elsewhere)
input_layer = Input(shape=(input_data.shape[1],))

# Shared hidden layer
shared_hidden_layer_1 = Dense(100, activation='relu')(input_layer)

# Task-specific hidden layers - Task 1
task1_hidden_layer_1 = Dense(100, activation='relu')(shared_hidden_layer_1)
task1_hidden_layer_2 = Dense(100, activation='relu')(task1_hidden_layer_1)

# Task-specific hidden layers - Task 2
task2_hidden_layer_1 = Dense(100, activation='relu')(shared_hidden_layer_1)
task2_hidden_layer_2 = Dense(100, activation='relu')(task2_hidden_layer_1)

# Subtract the two task branches; both output heads below share this one fused tensor
subtracted = Subtract()([task2_hidden_layer_2, task1_hidden_layer_2])

# Define the Outputs
task1_output = Dense(5, activation="softmax", name='task1_output')(subtracted)
task2_output = Dense(4, activation="softmax", name='task2_output')(subtracted)

# Define the model with multiple outputs
model = Model(inputs=input_layer, outputs=[task1_output, task2_output])

# Define a specific learning rate
learning_rate = 5e-5

# Instantiate an optimizer with the desired learning rate
optimizer = Adam(learning_rate=learning_rate)



# Compile with a standard categorical cross-entropy loss for each output
model.compile(optimizer=optimizer,
              loss=['categorical_crossentropy', 'categorical_crossentropy'],
              metrics={'task1_output': [Precision(), Recall()],
                       'task2_output': [Precision(), Recall()]})




I was expecting the multi-task network to outperform the single-output one, but so far it hasn't.
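For concreteness, here's a rough sketch of the kind of coupled loss I have in mind. Everything here is a placeholder I made up, including the confidence-gap coupling term and its 0.1 weight; the idea is just to concatenate the two softmax heads into one output so a single loss function can see both predictions at once.

import tensorflow as tf
from tensorflow.keras.layers import Concatenate
from tensorflow.keras.losses import categorical_crossentropy
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

NUM_CLASSES_T1 = 5  # class count of task 1 (matches the 5-unit head above)
NUM_CLASSES_T2 = 4  # class count of task 2 (matches the 4-unit head above)

def joint_loss(y_true, y_pred):
    # Split the concatenated tensors back into per-task slices
    y_true_1 = y_true[:, :NUM_CLASSES_T1]
    y_true_2 = y_true[:, NUM_CLASSES_T1:]
    y_pred_1 = y_pred[:, :NUM_CLASSES_T1]
    y_pred_2 = y_pred[:, NUM_CLASSES_T1:]

    # Standard cross-entropy for each task
    ce1 = categorical_crossentropy(y_true_1, y_pred_1)
    ce2 = categorical_crossentropy(y_true_2, y_pred_2)

    # Made-up coupling term: penalize the gap between the two heads'
    # top confidences (purely illustrative, not a recommendation)
    coupling = tf.abs(tf.reduce_max(y_pred_1, axis=-1)
                      - tf.reduce_max(y_pred_2, axis=-1))
    return ce1 + ce2 + 0.1 * coupling

# One fused output so a single loss term sees both heads
joint_output = Concatenate(name='joint_output')([task1_output, task2_output])
joint_model = Model(inputs=input_layer, outputs=joint_output)
joint_model.compile(optimizer=Adam(learning_rate=5e-5), loss=joint_loss)
# Labels would then be np.concatenate([y_task1, y_task2], axis=1)

Is this the right direction, or is there a cleaner way in Keras to make one loss term depend on both outputs?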
