Create a single zip file incrementally? #760

Open
spinlud opened this issue Apr 26, 2024 · 2 comments

Comments

spinlud commented Apr 26, 2024

I would like to build a single zip file incrementally. What I have tried:

import {PassThrough} from 'stream';
import archiver from 'archiver';

(async () => {
	const stream = new PassThrough(); // Pass this to some upload function
	const archive = archiver('zip', {
		zlib: { level: 9 }
	});

	archive.pipe(stream);

	// Assume this data can't fit in memory all at once, so it should be added incrementally
	for (let i = 0; i < 100; i++) {
		const textData = `data-${i}`;
		// The problem is here: the data is not appended to the same entry.
		// Using a different file name for each append results in multiple
		// files inside the zip instead of a single file.
		archive.append(Buffer.from(`${textData}\n`, 'utf8'), { name: `myfile.txt` });
	}

	await archive.finalize();
	await upload; // assume 'upload' is the promise returned by the upload function consuming 'stream'
})();

Using archive.append(Buffer) repeatedly with the same file name doesn't seem to work properly: only the first appended buffer is included, and the rest is discarded. Using a different file name in each append operation would result in multiple files inside the archive, instead of a single file.

Another question: does the data flow from the archive stream to the destination stream only when we call archive.finalize()? If that is the case, then I would still be loading all the data in memory. Ideally the data would flow from the archive stream to the destination stream as soon as it is available.

Any solution for this?
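
For reference, a quick way to check when data starts flowing is to count the bytes arriving on the destination stream before finalize() is called. A minimal sketch, using an arbitrary 100 ms delay to give archiver time to process the entry; a non-zero counter would mean data flows before finalize():

import {PassThrough} from 'stream';
import archiver from 'archiver';

(async () => {
	const stream = new PassThrough();
	let bytesBeforeFinalize = 0;
	stream.on('data', (chunk) => { bytesBeforeFinalize += chunk.length; });

	const archive = archiver('zip', {
		zlib: { level: 9 }
	});
	archive.pipe(stream);
	archive.append(Buffer.from('hello\n', 'utf8'), { name: 'myfile.txt' });

	// Arbitrary delay to let archiver process the queued entry
	await new Promise((resolve) => setTimeout(resolve, 100));
	console.log(`bytes received before finalize: ${bytesBeforeFinalize}`);

	await archive.finalize();
})();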

spinlud commented Apr 26, 2024

Appending a Stream instead of a Buffer seems to be a possible solution. Any thoughts?

import {PassThrough} from 'stream';
import archiver from 'archiver';

(async () => {
	const inputStream = new PassThrough();
	const outputStream = new PassThrough(); // Pass this to some upload function
	const archive = archiver('zip', {
		zlib: { level: 9 }
	});

	archive.pipe(outputStream);
	archive.append(inputStream, { name: `myfile.txt` });

	// Assume this data can't fit in memory all at once, so it should be added incrementally
	for (let i = 0; i < 100; i++) {
		const textData = `data-${i}\n`;
		inputStream.write(textData, 'utf8');
	}

	inputStream.end();
	await archive.finalize();
})();
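
One refinement, since the data can't fit in memory all at once: inputStream.write() returns false once the stream's internal buffer is full, so the producer should wait for the 'drain' event before writing more. A sketch using events.once; the upload consumer of outputStream is still omitted here, so in a real run something must read from it:

import {PassThrough} from 'stream';
import {once} from 'events';
import archiver from 'archiver';

(async () => {
	const inputStream = new PassThrough();
	const outputStream = new PassThrough(); // Pass this to some upload function
	const archive = archiver('zip', {
		zlib: { level: 9 }
	});

	archive.pipe(outputStream);
	archive.append(inputStream, { name: 'myfile.txt' });

	for (let i = 0; i < 100; i++) {
		const canContinue = inputStream.write(`data-${i}\n`, 'utf8');
		// write() returns false when the internal buffer is full;
		// wait for 'drain' so data never accumulates in memory
		if (!canContinue) {
			await once(inputStream, 'drain');
		}
	}

	inputStream.end();
	await archive.finalize();
})();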

ftokarev commented
Hello @spinlud, have you found a solution? I'm facing a similar problem.
