I'm looking for ways to speed up adding entries to a zip archive in a web app using DotNetZip.
The problem is that the response times out before all files have been added to the archive. Each archive can contain 200–500 entries, each between 1 and 4 MB in size. Even a sample of 100 files is too slow.
Compression is not necessary since the files are all MP3s, which are already compressed.
I currently have this code:
private readonly IDictionary<string, Stream> _files = new Dictionary<string, Stream>();

public bool CreateZip(ref MemoryStream result, bool compress = true)
{
    using (ZipFile zip = new ZipFile())
    {
        zip.UseZip64WhenSaving = Zip64Option.AsNecessary;

        // MP3s are already compressed, so deflate is skipped when compress == false
        if (!compress)
            zip.CompressionLevel = CompressionLevel.None;

        if (_files.Any())
        {
            foreach (var file in _files)
            {
                // file.Key is the entry name; file.Value is the stream read from RavenFS
                ZipEntry entry = zip.AddEntry(file.Key, file.Value);
                entry.Comment = file.Key;
            }
        }

        zip.Save(result);
        result.Seek(0, SeekOrigin.Begin);
        result.Flush();
    }
    return true;
}
The files are first loaded into a dictionary as streams (downloaded from RavenFS/RavenFileStorage) and then added to the archive; the finished zip is itself saved to a MemoryStream.
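For context, this is roughly how the method gets called. It's a simplified sketch assuming an MVC controller; _fileStorage.OpenRead is a placeholder for the actual RavenFS download call, and the names are illustrative, not my real code:

// Simplified sketch of the calling side; _fileStorage.OpenRead stands in
// for the actual RavenFS download call.
public ActionResult Download(IEnumerable<string> fileNames)
{
    foreach (var name in fileNames)
    {
        // every stream stays open and buffered until CreateZip saves the archive
        _files.Add(name, _fileStorage.OpenRead(name));
    }

    var result = new MemoryStream();
    CreateZip(ref result, compress: false);

    // the complete archive sits in memory before the response starts
    return File(result, "application/zip", "archive.zip");
}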
I thought about adding the entries in parallel, but as far as I can tell the library doesn't support adding entries to a ZipFile from multiple threads.
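Just to be clear about what I mean by threading, this is the kind of thing I had in mind (a sketch only, not something I'm running; as far as I can tell ZipFile isn't safe to mutate from multiple threads like this):

// Sketch only: naive parallel adds into the same ZipFile instance
// (needs System.Threading.Tasks). DotNetZip's ZipFile doesn't appear to be
// thread-safe for concurrent AddEntry calls, so I haven't gone down this route.
using (ZipFile zip = new ZipFile())
{
    zip.CompressionLevel = CompressionLevel.None;

    Parallel.ForEach(_files, file =>
    {
        ZipEntry entry = zip.AddEntry(file.Key, file.Value);
        entry.Comment = file.Key;
    });

    zip.Save(result);
}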
Am I using the library correctly, or is there a faster way to do this?