
Comments (6)

MattiasBuelens commented on June 9, 2024

The problem is backpressure.

The promises returned by writer.ready, writer.write(), writer.close() or writer.closed will only resolve once backpressure is relieved. However, by default, new TransformStream() sets the high water mark to 1 chunk for its writable side and 0 chunks for its readable side. This means that, if nobody is reading from the readable side, then the writable side will apply backpressure as soon as it has 1 chunk in its queue. Which is exactly what you see: chunk "a" is written, but then await writer.ready blocks on the next iteration.

If you uncomment line 4, then new Response(readable).text() will be actively pulling from the readable, and no chunks will remain in the writable's queue. Thus, await writer.ready resolves immediately and your loop can keep chugging along nicely.
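
For readers without the original snippet at hand, here is a rough approximation of the pattern under discussion (not the exact code from the issue; the "line 4" and "line 15" comments mark where the consumer sits in the two variants):

(async () => {
  let { readable, writable } = new TransformStream();
  // new Response(readable).text().then(console.log); // "line 4": consumer set up first
  let writer = writable.getWriter();
  let encoder = new TextEncoder();
  for (const data of 'abcdef') {
    console.log({ data });
    await writer.ready;
    // Awaiting write() here pends forever once the queue is full and nobody reads:
    await writer.write(encoder.encode(data));
  }
  await writer.close();
  // console.log(await new Response(readable).text()); // "line 15": consumer set up too late
})();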

One way to solve this would be to give the readable side an unbounded queue size:

let { readable, writable } = new TransformStream({}, {}, { highWaterMark: Infinity });

However, this is a really bad idea. You're not really "streaming" anything: first you're writing all chunks into a queue, and afterwards you're reading them all out again. Effectively, it would behave the same if you first push()ed all chunks into an array, and then looped through the array. 🤷‍♂️
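
Roughly, the unbounded-queue version behaves like this illustrative sketch:

const encoder = new TextEncoder();
// Buffer every chunk up front...
const chunks = [];
for (const data of 'abcdef') {
  chunks.push(encoder.encode(data));
}
// ...then consume them all afterwards; nothing flows incrementally.
for (const chunk of chunks) {
  // process chunk
}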

The best solution is still to set up the consumer as soon as possible, so that it can relieve backpressure as needed. So that means keeping new Response(readable).text() at line 4 instead of line 15. 😁

Alternatively, move your writer code into a separate async function that runs "on the side" of your consumer:

(async () => {
  try {
    let { readable, writable } = new TransformStream();
    writeData(writable).catch(() => {});
    console.log(await new Response(readable).text());
  } catch (e) {
    console.error(e);
  }
})();

async function writeData(writable) {
  let writer = writable.getWriter();
  try {
    let encoder = new TextEncoder();
    for (const data of 'abcdef') {
      console.log({ data }); // {"data": "a"}
      await writer.ready;
      // Purposefully don't await write()
      // See https://streams.spec.whatwg.org/#example-manual-write-with-backpressure
      // and https://streams.spec.whatwg.org/#example-manual-write-dont-await
      writer.write(encoder.encode(data)).catch(() => {});
    }
    await writer.close();
  } catch (e) {
    await writer.abort(e);
  }
}


MattiasBuelens commented on June 9, 2024

I just don't see how something as simple as the initial code block I posted is known to hang and then exit, and still be WAI.

There's tons of ways you can cause a program to hang with a single line of code. For example:

await new Promise((resolve) => { /* never actually call resolve() */});
console.log("never reached");

Streams apply backpressure if they have too many chunks in their queue. And yes, by default, they have a fairly low high water mark, so you'll hit that limit almost immediately. It's up to the developer to ensure that chunks can "flow" steadily through the streams and don't end up "stuck" in a queue somewhere.
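
To make that concrete, here is a small illustration (not from the original issue) of how quickly the default limit is reached:

const { readable, writable } = new TransformStream();
const writer = writable.getWriter();
console.log(writer.desiredSize); // 1: room for one chunk (the default high water mark)
writer.write(new TextEncoder().encode('a')).catch(() => {});
console.log(writer.desiredSize); // 0: the queue is full, so writer.ready is now pending
// Start reading; this relieves backpressure and lets writer.ready resolve again.
readable.getReader().read().then(({ value }) => console.log(value));
writer.ready.then(() => console.log('backpressure relieved'));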

@MattiasBuelens It looks like txiki.js is using web-streams. When line 4 is uncommented and line 15 is commented out, we still get only {data: "a"}, so there's a bug in this repository for that case.

I haven't heard of txiki.js before... and I don't have time to build it from source. 😅 It works fine in Node.js if you load the polyfill, see this live example.

Looking at their source code, they seem to be using whatwg-fetch as their fetch() implementation. That project does not support streaming fetch, see JakeChampion/fetch#198 (comment). So new Response(readable) will not work in txiki.js; the Response constructor will just fall back to calling toString() on its input, and you'll end up with the string "[object ReadableStream]" as your response body.

If you need streaming fetch support in txiki.js, you should talk to them instead. 😉 Or, you should use a TextDecoderStream (or a polyfill for that) to read the chunks back into text.
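
For example, a rough sketch of that approach (assuming a TextDecoderStream implementation is available, and that this runs inside an async function):

let text = '';
await readable
  .pipeThrough(new TextDecoderStream())
  .pipeTo(new WritableStream({
    write(chunk) {
      text += chunk;
    },
  }));
console.log(text); // the fully decoded body, e.g. "abcdef"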


guest271314 commented on June 9, 2024

Thanks.

On the specification side, I am kind of surprised it is specification-compliant to have a hanging Promise that is never going to be fulfilled. Now that I think about it, I encountered this when creating a WebSocketStream: I used pipeThrough() without await, else no data would be read:

var {
  readable,
  writable
} = await wss.opened;
var now;
var writer = writable.getWriter();
var abortable = new AbortController();
var controller;
var {
  signal
} = abortable;
writer.closed.then(() => console.log('writer closed')).catch(() => console.log('writer closed error'));
let minutes = 0;
readable.pipeThrough(new TextDecoderStream()).pipeTo(
  new WritableStream({
    async start(c) {
      console.log(c);
      (async () => {
        while (!c.signal.aborted) {
          try {
            await new Promise((resolve, reject) => {
              c.signal.onabort = reject;
              setTimeout(resolve, 60000);
            });
            minutes++;
          } catch {} finally {
            console.log(minutes);
          }
        }
        console.log(c.signal.aborted);
      })();
      return controller = c;
    },
    write(v) {
      console.log(v);
    },
    close() {
      console.log('Socket closed');
    },
    abort(reason) {
      console.log({
        reason
      });
    }
  }), {
    signal
  }).then(() => console.log('pipeThrough, pipeTo Promise')).catch(() => console.log('caught'));


var encoder = new TextEncoder();
var enc = (text) => encoder.encode(text);

On the application side, I think I was looking for the highWaterMark: Infinity pattern for what I was working on a few days ago that prompted me to revisit this: converting a Node.js stream.Readable to a WHATWG ReadableStream. I wound up using this pattern:

let controller;

const webReadable = new ReadableStream({
  start(c) {
    return (controller = c);
  },
});
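
A rough sketch of how the captured controller might then be fed from the Node.js side (nodeReadable is a placeholder for the actual stream.Readable; note that enqueuing on every 'data' event ignores controller.desiredSize, which is where an unbounded highWaterMark, or pausing and resuming the source, comes in):

nodeReadable.on('data', (chunk) => controller.enqueue(chunk));
nodeReadable.on('end', () => controller.close());
nodeReadable.on('error', (err) => controller.error(err));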

I'm going to re-read what you wrote a few more times.

I just don't see how something as simple as the initial code block I posted is known to hang and then exit, and still be WAI.


guest271314 commented on June 9, 2024

@MattiasBuelens It looks like txiki.js is using web-streams. When line 4 is uncommented and line 15 is commented out, we still get only {data: "a"}, so there's a bug in this repository for that case.


guest271314 commented on June 9, 2024

txiki.js is pretty easy to build from source. Yeah, fetch() polyfills don't really work.

I just find the hanging Promise being WAI problematic. Setting highWaterMark to Infinity is what I was looking for, for the case I was working on.

Feel free to close this if you think it's not relevant here.


MattiasBuelens commented on June 9, 2024

Setting highWaterMark to Infinity is what I was looking for, for the case I was working on.

I'm happy I could help you find a solution! Have a nice day. 😊
