This is a good question. The obvious answer is that the error handling mechanism you have set up handles the problem the same as any other thrown exception. However, I think you are implying that the compression filter might break while compressing a perfectly valid page, in which case the filter should simply send the uncompressed page.
The best way to handle this is to have the compression filter be smart enough to catch any errors the compression code might throw, which the filter presented here does not do, and simply send the uncompressed content. I'll roll those changes into the filter, but for most practical uses this issue isn't a problem. The compression code uses the GZIP classes provided by the standard Java API. Unless you run out of memory on the server (which would probably cause all sorts of other problems), it is a fairly safe assumption that no exceptions will be thrown when trying to compress content. It's not as if the GZIP algorithm cares what type of content, or how much content, it is attempting to compress.
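To sketch the idea, here is a minimal fallback pattern using the standard `java.util.zip.GZIPOutputStream`. The class and method names are hypothetical (not from the filter discussed above); in a real filter this logic would sit behind the response wrapper, deciding whether to write the compressed or the original bytes to the client.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPOutputStream;

public class SafeCompress {

    // Try to GZIP the response body; if compression throws for any
    // reason, fall back to returning the original, uncompressed bytes.
    // A filter using this would also need to skip the
    // "Content-Encoding: gzip" header when the fallback is taken.
    static byte[] compressOrPassThrough(byte[] content) {
        try {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            GZIPOutputStream gzip = new GZIPOutputStream(buf);
            gzip.write(content);
            gzip.finish();
            gzip.close();
            return buf.toByteArray();
        } catch (IOException e) {
            // Compression failed: send the page uncompressed.
            return content;
        }
    }
}
```

The key point is that the catch block returns the untouched content rather than propagating the exception, so the client still gets a valid (just larger) response.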