Using Data Compression in .NET 2.0
Subject:   Slight improvement
Date:   2009-08-17 14:45:17
From:   Nano2k
You can get rid of the inefficiency of the ExtractBytesFromStream method by providing the length of the original (that is, uncompressed) array of bytes.

The method is simple. Just before starting to compress, write the length of the original array as the first four bytes of the resulting (that is, compressed) array.

The trick is NOT to compress those 4 bytes.
See below:

using System;
using System.IO;
using System.IO.Compression;

public static byte[] CompressData(byte[] data) {
    using (MemoryStream ms = new MemoryStream(data.Length)) {
        //prepare a 4-byte buffer containing the length of the original (uncompressed) data buffer
        byte[] buf = BitConverter.GetBytes(data.Length);
        //...and write it to the resulting (compressed) stream
        ms.Write(buf, 0, buf.Length);

        //now, go on with compression; remember, the first 4 bytes are uncompressed,
        //so we'll be able to directly access them to learn the length of the uncompressed buffer
        using (GZipStream gz = new GZipStream(ms, CompressionMode.Compress, true)) {
            gz.Write(data, 0, data.Length);
        }
        //the GZipStream must be closed before reading, so its internal buffers are flushed to ms
        return ms.ToArray();
    }
}
public static byte[] DecompressData(byte[] data) {
    //retrieve the length of the original (uncompressed) buffer
    int length = BitConverter.ToInt32(data, 0);

    //initialize the input stream starting with the 5th byte, that is,
    //skip the first 4 bytes as they are already "consumed"
    using (MemoryStream ms = new MemoryStream(data, 4, data.Length - 4)) {
        //because we know the size of the resulting buffer, we can safely allocate just as much space as needed
        //this means no need for supplementary memory allocations / relocations => speed + memory optimization
        byte[] decompressed = new byte[length];
        using (GZipStream gz = new GZipStream(ms, CompressionMode.Decompress)) {
            //Read may return fewer bytes than requested, so loop until the buffer is full
            int offset = 0;
            while (offset < length) {
                int read = gz.Read(decompressed, offset, length - offset);
                if (read == 0) break;
                offset += read;
            }
        }
        //that's all, folks!
        return decompressed;
    }
}
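A quick round-trip check might look like the following (the sample string and the `Demo` class name are my own, for illustration; it assumes the two methods above are accessible as static members):

```csharp
using System;
using System.Text;

class Demo {
    static void Main() {
        //any byte[] works; repetitive text compresses well
        byte[] original = Encoding.UTF8.GetBytes("hello, hello, hello - compressible text");

        byte[] compressed = CompressData(original);
        //the first 4 bytes of the compressed buffer hold the uncompressed length
        Console.WriteLine(BitConverter.ToInt32(compressed, 0) == original.Length);

        byte[] restored = DecompressData(compressed);
        Console.WriteLine(restored.Length == original.Length);
    }
}
```

Both lines should print `True`: the length prefix is readable without decompressing, and the restored buffer matches the original exactly.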