Mark re-recommended compressing pages today, which is always a useful idea. Sneaky (The server Aqcom wallows on) has mod_gzip installed, which automagically compresses all static pages before sending them. Great, but not so cool for the dynamic PHP pages that the rest of the site works on.

My solution to this was to kill two birds with one stone. The first problem is that, since Aquarionics caches every page the first time it's loaded, the cache directory can get quite large – especially when something like Google comes in and looks at (and therefore generates) all the pages on the site. The first step, then, was to modify my cache generation system as follows:

The cache-save system – which writes the contents of the output buffer to a file – just used @gzopen@ and @gzwrite@ instead of the standard PHP file-write code, and the cache-load decompressed it.
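A minimal sketch of that change, under my own assumptions rather than Epistula's actual code ($cachename and the sample content are hypothetical):

```php
<?php
// Sketch: write the page to a gzip-compressed cache file with
// gzopen/gzwrite instead of fopen/fwrite, then read it back out.
$cachename = tempnam(sys_get_temp_dir(), 'aq') . '.gz';  // hypothetical cache path
$content   = "<html><body>Hello, cached world</body></html>";

// Cache-save: compress the output buffer's contents to disk.
$fp = gzopen($cachename, 'w9');   // '9' = maximum compression
gzwrite($fp, $content);
gzclose($fp);

// Cache-load: decompress the file for clients without gzip support.
$page = '';
$fp = gzopen($cachename, 'r');
while (!gzeof($fp)) {
    $page .= gzread($fp, 4096);
}
gzclose($fp);
echo $page;
```

The win is that the cache directory now holds compressed files, so Google spidering every page costs far less disk.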

Next, testing for compression support and sending the compressed version if possible. The first part was simple: PHP gives you an array, $HTTP_SERVER_VARS, containing all the headers the client sends, so I just needed to check the HTTP_ACCEPT_ENCODING variable for the string “gzip”:
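Something along these lines (the function name is my own; on PHP 4.1+ you would read $_SERVER rather than the old $HTTP_SERVER_VARS array):

```php
<?php
// Does this client advertise gzip support in Accept-Encoding?
// $server_vars stands in for $HTTP_SERVER_VARS / $_SERVER.
function client_accepts_gzip($server_vars) {
    $accept = isset($server_vars['HTTP_ACCEPT_ENCODING'])
            ? $server_vars['HTTP_ACCEPT_ENCODING']
            : '';
    return strpos($accept, 'gzip') !== false;
}

// Example: a browser sending "Accept-Encoding: gzip, deflate" passes.
var_dump(client_accepts_gzip(array('HTTP_ACCEPT_ENCODING' => 'gzip, deflate')));
```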

and then the complicated bit: sending the right version. If compression was on, all I needed to do was send the right headers and pass through the contents of the cache. If not, I would decompress the cache and send that. Note that while there is a function to directly output the contents of a gzip file (gzpassthru), I tend to avoid it because I want to send a Content-Length header too, and I can't get that if I only know the length after the output. gzpassthru returns the number of bytes it output, but header("Content-Length: ".gzpassthru($cachename)) was abandoned for readability (and because headers have to go out before any of the body, it wouldn't work anyway). Anyway, the code:
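A hedged sketch of that send step, not Epistula's actual code ($cachename, $gzip_ok, and the function name are assumptions carried over from above):

```php
<?php
// Serve the already-gzipped cache file verbatim when the client
// accepts gzip; otherwise decompress it first so the length is
// known before any output goes out.
function send_cached_page($cachename, $gzip_ok) {
    if ($gzip_ok) {
        // The file on disk is already compressed: label it and
        // take Content-Length straight from the compressed size.
        header('Content-Encoding: gzip');
        header('Content-Length: ' . filesize($cachename));
        readfile($cachename);
    } else {
        // Decompress into a string first – this is why gzpassthru()
        // is avoided: we need the length before we start echoing.
        $page = '';
        $fp = gzopen($cachename, 'r');
        while (!gzeof($fp)) {
            $page .= gzread($fp, 4096);
        }
        gzclose($fp);
        header('Content-Length: ' . strlen($page));
        echo $page;
    }
}
```

Either branch sends a correct Content-Length, and the gzip branch never pays the decompression cost at all.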

(I’ve left the etag code in for context.) So now all pages should be gzipped any time they are accessed after the first (the first time, it generates the content, echoes it, then compresses it and writes it to the cache). The full code for all this is at the top and bottom of Epistula.php