Chunked data transmission not working properly

I'm using content-type:chunked to transfer data in chunks, separated by \r\n. However, the client seems to receive the data all at once rather than in the expected chunks. I'm unsure why this is happening and would appreciate any guidance.

Here are some additional details about my setup:

The server is sending the data correctly with properly formatted chunk headers. I've checked the implementation of my client to ensure it is compatible with chunked data transfer. There are no apparent issues with my network connection, but there may be firewalls or proxies that could potentially interfere with chunked transfer encoding.

Any ideas on what might be causing this issue or how to resolve it would be greatly appreciated. Thank you in advance!

The code for the client to read chunked data is

fetch('https://xxxxxxxxxxxxx')
    .then((response) => {
      const stream = response.body.getReader();

      return new ReadableStream({
        async pull(controller) {
          try {
            const { done, value } = await stream.read();
            if (done) {
              controller.close();
            } else {
              controller.enqueue(value);
            }
          } catch (error) {
            controller.error(error);
          }
        },
      });
    })
    .then(async (stream) => {
      const reader = stream.getReader();
      while (true) {
        const { done, value } = await reader.read();
        if (done) {
          break;
        }
      }
    })
    .catch((error) => console.error(error));

An example of the data returned by the server is

{"x":1}\r\n{"x":1}\r\n{"x":1}\r\n{"x":1}\r\n

Answer from Heiko Theißen:

You do not produce chunks just by inserting \r\n into the response payload. To make the server use chunked encoding, send each piece with a separate res.write call. But even then, if the chunks arrive in quick succession, the consuming client may still combine several of them into a single stream.read() result.

The following server-side code makes your client receive three chunks of 9 bytes each.

const http = require("http");

// Each res.write sends one chunk; res.end sends the last chunk and terminates the body.
http.createServer(function(req, res) {
  res.write(`{"x":1}\r\n`);
  setTimeout(function() {
    res.write(`{"x":1}\r\n`);
  }, 100);
  setTimeout(function() {
    res.end(`{"x":1}\r\n`);
  }, 200);
}).listen(8080); // port chosen arbitrarily for this example

But reducing the timeout to 0 gives one chunk of 27 bytes on the client, even though the server still uses chunked encoding.

To summarize, your client should not rely on the chunk boundaries and should not, for example, assume that every chunk on its own is well-formed JSON: a single stream.read() could just as well return {"x":1}\r\n{"x, with the rest of the second object arriving in the next read.

Perhaps what you really want is to split the response at newlines (\r\n). The following client-side code does this with two nested loops: the outer loop appends each received chunk to the end of a buffer, and the inner loop consumes complete lines from the beginning of the buffer. The separation into chunks is therefore independent of the separation into lines.

fetch(...).then(async function(response) {
  const stream = response.body.pipeThrough(new TextDecoderStream()).getReader();
  let buffer = "";
  for (;;) {
    // Outer loop: append each received chunk to the buffer
    const { value, done } = await stream.read();
    if (done) break;
    buffer += value;
    for (;;) {
      // Inner loop: consume complete lines from the front of the buffer
      const offset = buffer.indexOf("\r\n");
      if (offset < 0) break;
      console.log(buffer.substring(0, offset));
      buffer = buffer.slice(offset + 2);
    }
  }
});
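
The same line splitting can also be packaged as a TransformStream so it composes with pipeThrough. This is just an optional sketch: the lineSplitter name and the trailing-line handling in flush are my own additions, not part of the answer above.

function lineSplitter() {
  let buffer = "";
  return new TransformStream({
    transform(chunk, controller) {
      buffer += chunk;
      let offset;
      // Emit every complete \r\n-terminated line as its own string
      while ((offset = buffer.indexOf("\r\n")) >= 0) {
        controller.enqueue(buffer.substring(0, offset));
        buffer = buffer.slice(offset + 2);
      }
    },
    flush(controller) {
      // Emit any trailing data that was not terminated by \r\n
      if (buffer) controller.enqueue(buffer);
    }
  });
}

// Usage:
// response.body.pipeThrough(new TextDecoderStream()).pipeThrough(lineSplitter())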

But as gre_gor's answer shows, such a splitting of the response can more easily be achieved with server-sent events. Note that these are split at double newlines (that is, at empty lines).

Answer from gre_gor:

Content-Type: chunked is not a thing. You probably mean Transfer-Encoding: chunked, but on the wire that encoding requires each chunk to be preceded by a line giving the chunk's length in hexadecimal.
That encoding is meant for sending a large amount of data whose size is not known in advance, while letting the client consume the data chunk by chunk.
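
For illustration only (you normally never write this framing by hand; Node adds it automatically when you call res.write), the body of a response carrying two of the question's 9-byte payloads with Transfer-Encoding: chunked looks roughly like this on the wire:

9\r\n            <- chunk size in hexadecimal (9 bytes)
{"x":1}\r\n      <- chunk data (the \r\n here is part of the 9-byte payload)
\r\n             <- end of this chunk
9\r\n
{"x":1}\r\n
\r\n
0\r\n            <- zero-length chunk marks the end of the body
\r\n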

You could make it work with that, but there is a better way to do what you want: Server-sent events, which have native support in browsers.

Server:

app.get("/sse", async (req, res) => {
    res.writeHead(200, {
        "Content-Type": "text/event-stream",
        "Connection": "keep-alive",
        "Cache-Control": "no-cache"
    });
    for (let i=0; i<10; i++) {
        let data = {
            i,
            time: (new Date()).toString(),
        };
        res.write("data: "+JSON.stringify(data)+"\n\n");
        await sleep(1000);
    }
    res.end();
});

Client:

const events = new EventSource("/sse");
events.onmessage = event => {
    const data = JSON.parse(event.data);
    console.log(data);
};
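
One caveat not shown above: when the server calls res.end(), the browser's EventSource automatically tries to reconnect and replay the stream. A common way to stop that is to have the server emit a final named event before ending and to close the connection on the client when it arrives. A minimal sketch, where the "done" event name is my own choice (the server would need to send an event: done line plus a data line before res.end()):

// Close the connection when the (hypothetical) "done" event arrives,
// otherwise EventSource keeps reconnecting after the server ends the response.
events.addEventListener("done", () => events.close());

// Optional: log connection errors (also fired when the connection drops)
events.onerror = (err) => console.error("EventSource error:", err);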