A readable stream can be created:

- from other Node.js APIs that support streams (`zlib`, `fs`, HTTP request/response, etc.)
- with `Readable.from()`, which accepts a string, a `Buffer`, or an async generator (`async function *`); see the sketch after the code below
- using `readable.push`:
```js
const stream = require('node:stream');

const readable = new stream.Readable({ read() {} });
readable.push('First');
readable.push('Second');
readable.push(null); // null means end of stream
```
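For the `Readable.from()` case, a minimal sketch (the generator here is only an illustration):

```js
const { Readable } = require('node:stream');

// Readable.from() turns any iterable or async iterable into a readable stream.
async function* generate() {
  yield 'First';
  yield 'Second';
}

const readable = Readable.from(generate());
readable.on('data', (chunk) => console.log(chunk)); // 'First', then 'Second'
```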
Warning: don't mix these approaches. Stick to the one that suits your codebase.
A readable stream can be consumed with:

- `readable.on('data')`: reads chunks one by one (an event-based sketch follows the pipeline example below)
- `readable.on('readable')`: several chunks may be available at once; `readable.read()` returns `null` once the stream has ended
- `readable.pipe(writable)`
- `pipeline()` (for chaining several streams):
```js
const { pipeline } = require('node:stream/promises');
const fs = require('node:fs');
const zlib = require('node:zlib');

async function run() {
  const ac = new AbortController();
  const signal = ac.signal;
  setImmediate(() => ac.abort());
  await pipeline(
    fs.createReadStream('archive.tar'),
    zlib.createGzip(),
    fs.createWriteStream('archive.tar.gz'),
    { signal },
  );
}

run().catch(console.error); // AbortError
```
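For the event-based approaches mentioned above, a minimal self-contained sketch (using `Readable.from()` only to produce a few chunks):

```js
const { Readable } = require('node:stream');

// 'data' delivers chunks one by one as soon as they are available (flowing mode).
const flowing = Readable.from(['First', 'Second']);
flowing.on('data', (chunk) => console.log('data:', chunk));

// 'readable' signals that data can be pulled with read(); several chunks may be
// buffered at once, and read() returns null when nothing is left to read.
const paused = Readable.from(['First', 'Second']);
paused.on('readable', () => {
  let chunk;
  while ((chunk = paused.read()) !== null) {
    console.log('readable:', chunk);
  }
});
```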
A readable stream is also an `AsyncIterable`, so it can be consumed with `for await...of`:
```js
for await (const chunk of readable) {
  console.log({ chunk });
}
```
- back-pressure can be managed manually with `writable.cork()` and `readable.pause()` (see the back-pressure sketch below)
- streams support `AbortController` and `AbortSignal` (see the `addAbortSignal` sketch below)
- streams support object mode, configured with the `readableObjectMode` and `writableObjectMode` options (see the object-mode sketch below)
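A minimal sketch of manual back-pressure handling (the file names are placeholders): pause the readable when `writable.write()` reports a full buffer, and resume it on `'drain'`:

```js
const fs = require('node:fs');

const source = fs.createReadStream('input.txt');   // placeholder path
const target = fs.createWriteStream('output.txt'); // placeholder path

source.on('data', (chunk) => {
  const canWriteMore = target.write(chunk);
  if (!canWriteMore) source.pause(); // back-pressure: the writable buffer is full
});

target.on('drain', () => source.resume()); // buffer flushed, keep reading
source.on('end', () => target.end());
```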
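A sketch of attaching an `AbortSignal` to an existing stream, assuming a Node.js version that provides `stream.addAbortSignal()`:

```js
const { addAbortSignal, Readable } = require('node:stream');

const ac = new AbortController();
const readable = addAbortSignal(ac.signal, Readable.from(['First', 'Second']));

readable.on('data', (chunk) => console.log(chunk));
readable.on('error', (err) => console.error(err.name)); // AbortError after abort()

ac.abort(); // destroys the stream with an AbortError
```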
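A sketch of object mode, using a hypothetical transform that accepts JSON strings and emits parsed objects; `writableObjectMode` and `readableObjectMode` configure the two sides independently:

```js
const { Transform } = require('node:stream');

const parseJson = new Transform({
  writableObjectMode: false, // input side: strings / Buffers
  readableObjectMode: true,  // output side: arbitrary JavaScript objects
  transform(chunk, encoding, callback) {
    callback(null, JSON.parse(chunk.toString()));
  },
});

parseJson.on('data', (obj) => console.log(obj)); // { name: 'stream' }
parseJson.write('{"name":"stream"}');
parseJson.end();
```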
https://github.com/HowProgrammingWorks/Streams/tree/master/JavaScript