Last night a friend linked me to an offer for VPSes.

On the page detailing the offer, a 100MB test file was linked. As I was interested in the offer, I tried the link.

I reached a download speed of 2.2MB/s. Given my home connection can only reach a download speed of 500kB/s on the best of days, I was rather surprised.

The cause for this was quite simple: the test file had a content-type of text/plain, and gzip compression was enabled for the file transfer.

Firefox, at least, measures the rate at which the file is downloaded post-decompression -- not the rate at which bytes are transferred.

This means that if a file compresses to half its size, Firefox will show it downloading twice as fast as the bytes are actually being transferred. I personally think this is less than ideal (and that Firefox should really just show the actual transfer rate).

As you can see, this makes speed test files served over HTTP a potentially misleading benchmark.
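
To see the two numbers side by side, here's a rough sketch in Python (the URL is a placeholder, not the host's actual test file): it requests a file with gzip enabled, times the transfer, and reports both the on-the-wire rate and the post-decompression rate that Firefox apparently shows.

```python
import gzip
import time
import urllib.request

# Placeholder URL -- substitute any gzip-compressible test file
URL = "https://example.com/100MB-test.txt"

# Ask for a gzipped transfer, the way a browser would
req = urllib.request.Request(URL, headers={"Accept-Encoding": "gzip"})

start = time.monotonic()
with urllib.request.urlopen(req) as resp:
    raw = resp.read()  # urllib does not decompress for us
    gzipped = resp.headers.get("Content-Encoding") == "gzip"
elapsed = time.monotonic() - start

data = gzip.decompress(raw) if gzipped else raw

print(f"bytes on the wire:    {len(raw):,}")
print(f"bytes after inflate:  {len(data):,}")
print(f"wire speed:           {len(raw) / elapsed / 1e6:.2f} MB/s")
print(f"'browser' speed:      {len(data) / elapsed / 1e6:.2f} MB/s")
```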

Here are some example 10MB (10,000,000B) files to demonstrate the effect. Each contains 1000B of random data, followed by 1000B of zeroes, repeated 5,000 times.
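
For reference, a file with that layout can be generated along these lines (the output filename and the use of gzip as a stand-in for the server's compressor are my own choices); the zero-filled halves compress to almost nothing, so the whole file shrinks to roughly half its size:

```python
import gzip
import os

# 5,000 blocks of (1,000 random bytes + 1,000 zero bytes) = 10,000,000 bytes
data = b"".join(os.urandom(1_000) + bytes(1_000) for _ in range(5_000))

with open("speedtest-10MB.bin", "wb") as f:  # filename is arbitrary
    f.write(data)

ratio = len(gzip.compress(data)) / len(data)
print(f"size: {len(data):,} B, gzip ratio: {ratio:.0%}")  # roughly 50%
```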

There are a few ways to avoid this (as a user):

  1. Disable gzip in your client (see the sketch after this list)
  2. Use an HTTP client (e.g. wget) that doesn't request gzip by default

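For the first option, explicitly asking for an identity encoding keeps the server from compressing the response, so the measured rate is the real wire throughput (the URL is again a placeholder):

```python
import time
import urllib.request

URL = "https://example.com/100MB-test.txt"  # placeholder

# Accept-Encoding: identity tells the server not to compress the response
req = urllib.request.Request(URL, headers={"Accept-Encoding": "identity"})

start = time.monotonic()
with urllib.request.urlopen(req) as resp:
    size = len(resp.read())
elapsed = time.monotonic() - start

print(f"{size:,} B in {elapsed:.1f} s = {size / elapsed / 1e6:.2f} MB/s")
```
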
And, to ensure you don't accidentally cause this as a host:

  1. Disable gzip on the directory containing the test files
  2. Use 100% pseudorandom files (see the quick check after this list)
  3. Use a binary filetype that won't be gzipped by your httpd, e.g. .bin

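As a quick sanity check on the pseudorandom-file option: random bytes gain essentially nothing from gzip, so even a compressed transfer can't inflate the apparent rate.

```python
import gzip
import os

data = os.urandom(10_000_000)  # 10MB of pseudorandom bytes
ratio = len(gzip.compress(data)) / len(data)
print(f"gzip ratio for random data: {ratio:.2%}")  # ~100%: no savings to fake a faster download
```
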
Ultimately, this was just a simple mistake on the host's part, and I've made them aware of the issue. Their speeds, even without compression, are stellar. As are their ping times. But I'm certainly not going to take speed test files at face value in future.