I have a Socket.IO server which listens for events:
const { Server } = require('socket.io');
const io = new Server(3000); // Socket.IO can listen on a port directly

io.on('connection', (socket) => {
  socket.on('myEvent', (data) => {
    socket.emit('eventReceived', { status: 1 });
  });
});
I know that Node.js and Socket.IO don't use multiple threads, so I wonder how to effectively handle many clients sending a myEvent at the same time.
I've read a few things about clusters. Would it be as easy as just adding the following code in front of my project?
const cluster = require('cluster');
const os = require('os');
const socketIO = require('socket.io');

if (cluster.isMaster) {
  const numCPUs = os.cpus().length;
  console.log(`Master ${process.pid} is running`);
  // Fork workers
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
  cluster.on('exit', (worker, code, signal) => {
    console.log(`Worker ${worker.process.pid} died`);
  });
} else {
  // my socket code
}
I've also read a few things about Redis. How would it be useful in such a case? Is it effective on a Raspberry Pi? Would it be possible to create a new thread for each user?
Thanks for any help :)
To my knowledge, clustering your Node.js process is an effective way of spreading the workload across several CPU cores, and your snippet has the right structure. With Socket.IO, though, it isn't quite enough on its own: each worker is a separate process with its own connections, so an emit from one worker never reaches clients connected to another, and the HTTP long-polling transport additionally needs sticky sessions so that every request from a given client lands on the same worker. That is exactly where Redis usually comes in: the Redis adapter relays events between the Socket.IO processes. On a single Raspberry Pi you can skip Redis and use the lighter in-process cluster adapter instead. As for one thread per user: that isn't how Node.js scales; the single-threaded event loop already multiplexes thousands of concurrent sockets as long as your handlers don't block, and clustering covers the CPU-bound part. I don't have much hands-on experience with Redis myself, but it mainly becomes necessary once your workers spread across more than one machine.
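Here is a minimal sketch of such a clustered setup, based on the official @socket.io/sticky and @socket.io/cluster-adapter packages (assumptions on my side: both packages are installed, and port 3000 is just a placeholder):

const cluster = require('cluster');
const http = require('http');
const os = require('os');
const { Server } = require('socket.io');
const { setupMaster, setupWorker } = require('@socket.io/sticky');
const { createAdapter, setupPrimary } = require('@socket.io/cluster-adapter');

if (cluster.isMaster) {
  console.log(`Master ${process.pid} is running`);

  const httpServer = http.createServer();
  // Sticky sessions: route every request from a given client to the same worker
  setupMaster(httpServer, { loadBalancingMethod: 'least-connection' });
  // Let the master relay adapter packets between the workers
  setupPrimary();
  httpServer.listen(3000);

  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  cluster.on('exit', (worker) => {
    console.log(`Worker ${worker.process.pid} died, forking a new one`);
    cluster.fork();
  });
} else {
  const httpServer = http.createServer();
  const io = new Server(httpServer);
  // Broadcasts from this worker now also reach clients on the other workers
  io.adapter(createAdapter());
  // Register this worker with the sticky-session logic in the master
  setupWorker(io);

  io.on('connection', (socket) => {
    socket.on('myEvent', (data) => {
      socket.emit('eventReceived', { status: 1 });
    });
  });
}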
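If you ever spread workers across more than one machine (where the in-process cluster adapter can't reach), the Redis adapter does the same relaying through a Redis server. A hedged sketch, assuming @socket.io/redis-adapter and redis are installed and a Redis server is reachable at the default local port:

const { createServer } = require('http');
const { Server } = require('socket.io');
const { createClient } = require('redis');
const { createAdapter } = require('@socket.io/redis-adapter');

async function main() {
  // redis://localhost:6379 is an assumption; point this at your Redis server
  const pubClient = createClient({ url: 'redis://localhost:6379' });
  const subClient = pubClient.duplicate();
  await Promise.all([pubClient.connect(), subClient.connect()]);

  const httpServer = createServer();
  const io = new Server(httpServer);
  // Every broadcast is published through Redis, so any Socket.IO process
  // subscribed to the same Redis server forwards it to its own clients
  io.adapter(createAdapter(pubClient, subClient));

  httpServer.listen(3000);
}

main();

Redis itself should run fine on a Raspberry Pi for small workloads, but as long as everything lives on one Pi, the cluster adapter above saves you that extra moving part.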