Maybe I'm asking a stupid question, but I really don't understand.
I have two scripts: a socket server and a socket client.
socket server:
this.io = new Server(HTTPServer, {
    wsEngine: eiows.Server,
    cors: {
        origin: process.env.SOCKET_ORIGIN,
        methods: ["GET", "POST"],
        credentials: true,
        transports: ['websocket', 'polling']
    },
    allowEIO3: true,
    parser
});
socket client:
const { io } = require("socket.io-client");
this.socket = io(process.env.SOCKET_SERVER, {
    withCredentials: true,
});
Both scripts run on the local network and are not reachable from outside via any domain.
The socket server is attached to an HTTP server on port 3000; the socket client, however, does not need an HTTP server to open a connection.
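For context, attaching Socket.IO to an HTTP server on port 3000 typically looks like the sketch below (a minimal configuration sketch; eiows, the custom parser, and allowEIO3 from my setup are omitted here). Note that in the Socket.IO Server options, transports is a top-level option, not part of cors:

```javascript
const http = require("http");
const { Server } = require("socket.io");

// Plain Node HTTP server; Socket.IO attaches to it and shares its port.
const httpServer = http.createServer();

const io = new Server(httpServer, {
    cors: {
        origin: process.env.SOCKET_ORIGIN,
        methods: ["GET", "POST"],
        credentials: true,
    },
    // `transports` belongs at the top level of the Server options.
    transports: ["websocket", "polling"],
});

httpServer.listen(3000);
```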
When I run the client in debug mode with DEBUG=engine,socket.io* node, I get this:
socket.io-client:url parse http://127.0.0.1:3000 +0ms
socket.io-client new io instance for http://127.0.0.1:3000 +0ms
socket.io-client:manager readyState closed +0ms
socket.io-client:manager opening http://127.0.0.1:3000 +0ms
socket.io-client:manager connect attempt will timeout after 20000 +3ms
socket.io-client:manager readyState opening +0ms
socket.io-client:manager disconnect +0ms
socket.io-client:manager closed due to forced close +0ms
socket.io-client:manager cleanup +0ms
What should I specify as the origin on the server so that the client can connect successfully?
UPD: the problem was in the parser — when the server uses a custom parser, the client must be configured with the same one.
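If a custom parser is the culprit, the usual cause is that Socket.IO requires the same parser on both the server and the client; otherwise packets cannot be decoded and the connection is torn down. A minimal sketch, assuming something like socket.io-msgpack-parser (the actual parser used in my setup is not shown above):

```javascript
// Shared custom parser — the server and the client must use the same one,
// or the handshake packets cannot be decoded and the manager closes.
const parser = require("socket.io-msgpack-parser");

// Server side: pass the parser in the Server options.
const { Server } = require("socket.io");
const io = new Server(httpServer, { parser });

// Client side: pass the *same* parser in the client options.
const { io: ioc } = require("socket.io-client");
const socket = ioc(process.env.SOCKET_SERVER, {
    withCredentials: true,
    parser, // must match the server's parser
});
```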