I am working on a screen-share program. It consists of a client that can act as either a sender or a receiver, and a server that distributes the frames to all clients connected to a specific socket.
While implementing this, I've run into stability issues. The program works correctly at first, but the longer the stream runs, the larger the delay between the sending client and the receiver becomes. On some occasions the receiver also fails with this error:
java.io.StreamCorruptedException: invalid stream header: 803FFFD9
at java.base/java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:987)
at java.base/java.io.ObjectInputStream.<init>(ObjectInputStream.java:414)
at ScreenShareBox.receiveVideo(ScreenShareBox.java:129)
at java.base/java.lang.Thread.run(Thread.java:1589)
//This is the client-sender code:
public void sendImageToServer(BufferedImage screenshot) {
    try {
        // Convert the BufferedImage to a JPEG byte array
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        ImageIO.write(screenshot, "jpeg", byteArrayOutputStream);
        byte[] imageBytes = byteArrayOutputStream.toByteArray();
        // Get the output stream from the socket
        OutputStream outputStream = socket.getOutputStream();
        // Create an ObjectOutputStream for sending the image bytes
        ObjectOutputStream objectOutputStream = new ObjectOutputStream(outputStream);
        // Write the image bytes to the ObjectOutputStream and flush the stream
        objectOutputStream.writeObject(imageBytes);
        objectOutputStream.flush();
        System.out.println("Frame sent");
    } catch (IOException e) {
        e.printStackTrace();
    }
}
//Frames are sent every 80 ms, and each send is dispatched through an ExecutorService: `sendImageExecutor.execute(() -> sendImageToServer(screenshot));`
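Roughly, the capture side is wired up like this (a simplified sketch, not my exact code: the Robot, the scheduled executor, and the single-threaded sending executor here are paraphrased):

import java.awt.AWTException;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class CaptureLoop {
    private final ScheduledExecutorService captureExecutor = Executors.newSingleThreadScheduledExecutor();
    private final ExecutorService sendImageExecutor = Executors.newSingleThreadExecutor();
    private final Rectangle screenRect = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
    private final Robot robot;

    public CaptureLoop() throws AWTException {
        robot = new Robot();
    }

    public void start() {
        // Grab a screenshot every 80 ms and hand it off to the sending executor
        captureExecutor.scheduleAtFixedRate(() -> {
            BufferedImage screenshot = robot.createScreenCapture(screenRect);
            sendImageExecutor.execute(() -> sendImageToServer(screenshot));
        }, 0, 80, TimeUnit.MILLISECONDS);
    }

    private void sendImageToServer(BufferedImage screenshot) {
        // same method as shown above
    }
}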
//This is the server code (receive and broadcast to clients):
public void receiveVideo(Socket socket, int roomId) {
    try {
        // Set up the input stream to receive video frames
        InputStream inputStream = socket.getInputStream();
        ObjectInputStream objectInputStream = new ObjectInputStream(inputStream);
        while (true) {
            // Read the image bytes from the ObjectInputStream
            byte[] imageBytes = (byte[]) objectInputStream.readObject();
            System.out.println("Frame Received");
            // Retrieve the list of clients in the video chat room
            List<UserWithSocket> roomClients = videoChatRoomClients.get(roomId);
            // Broadcast the received video frame to all clients in the room
            for (UserWithSocket client : roomClients) {
                // Use a thread pool to broadcast frames to the clients concurrently
                videoThreadPool.execute(() -> broadcastVideo(client, imageBytes));
            }
        }
    } catch (IOException | ClassNotFoundException e) {
        e.printStackTrace();
    }
}

private void broadcastVideo(UserWithSocket user, byte[] imageBytes) {
    try {
        // Get the output stream from the user's socket
        OutputStream outputStream = user.getUserSocket().getOutputStream();
        // Write the video frame bytes to the output stream and flush
        outputStream.write(imageBytes);
        outputStream.flush();
        System.out.println("Frame Broadcasted");
    } catch (IOException e) {
        e.printStackTrace();
    }
}
//This is the client-receiver code:
private void receiveVideo() {
    try {
        while (running) {
            // Set up the input stream to receive video frames
            InputStream inputStream = socket.getInputStream();
            ObjectInputStream objectInputStream = new ObjectInputStream(inputStream);
            // Read the byte array representing the video frame
            byte[] imageBytes = (byte[]) objectInputStream.readObject();
            System.out.println("Frame Received");
            // Convert the byte array back into a BufferedImage
            ByteArrayInputStream byteArrayInputStream = new ByteArrayInputStream(imageBytes);
            BufferedImage receivedImage = ImageIO.read(byteArrayInputStream);
            // Convert the BufferedImage to a JavaFX Image
            Image fxImage = SwingFXUtils.toFXImage(receivedImage, null);
            // Update the JavaFX ImageView with the received frame
            imageView.setImage(fxImage);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
If you need to see more of my code, I'll gladly share it. I would greatly appreciate any insights, suggestions, or guidance. Thanks in advance.
I tried changing the format to PNG, but the delay was even worse. I also added an ExecutorService on the server, but it didn't change anything. I considered using UDP (a DatagramSocket), but that would require sending many fragments that need to be reassembled, so I don't really know if it's worth it. Choosing a larger interval between frames seems to help, but above 100 ms it's just not watchable.
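For reference, this is roughly the kind of fragmentation I would have to implement on the sending side if I switched to UDP (just a sketch with a made-up chunk size and header layout; the receiver would also need matching reassembly and frame-drop logic):

import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.SocketException;
import java.nio.ByteBuffer;

public class UdpFrameSender {
    private static final int CHUNK_SIZE = 60_000;  // payload per datagram, below the ~65,507-byte UDP limit
    private static final int HEADER_SIZE = 12;     // frameId (4) + chunkIndex (4) + totalChunks (4)

    private final DatagramSocket socket;
    private final InetAddress serverAddress;
    private final int serverPort;
    private int frameId = 0;

    public UdpFrameSender(InetAddress serverAddress, int serverPort) throws SocketException {
        this.socket = new DatagramSocket();
        this.serverAddress = serverAddress;
        this.serverPort = serverPort;
    }

    // Split one JPEG frame into numbered datagrams so the receiver can reassemble them
    public void sendFrame(byte[] imageBytes) throws IOException {
        int totalChunks = (imageBytes.length + CHUNK_SIZE - 1) / CHUNK_SIZE;
        for (int i = 0; i < totalChunks; i++) {
            int offset = i * CHUNK_SIZE;
            int length = Math.min(CHUNK_SIZE, imageBytes.length - offset);
            ByteBuffer buffer = ByteBuffer.allocate(HEADER_SIZE + length);
            buffer.putInt(frameId).putInt(i).putInt(totalChunks);
            buffer.put(imageBytes, offset, length);
            DatagramPacket packet = new DatagramPacket(buffer.array(), buffer.position(), serverAddress, serverPort);
            socket.send(packet);
        }
        frameId++;
    }
}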