People keep posting articles about protecting web servers with zip bombs. The idea is that if there's a malicious bot scraping your website, you can host a heavily compressed payload somewhere so that when the bot tries to decompress it, it runs out of memory and crashes. Everybody writing these articles always creates their payloads in the same way:
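```sh
# Roughly the one-liner these posts share: 10 GB of zeroes, gzipped down to ~10 MB.
dd if=/dev/zero bs=1G count=10 | gzip -c > 10GB.gz
```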
This command uses dd to create a 10 gigabyte file, and then uses gzip to compress that file to around 10 megabytes. This approach has always slightly annoyed me because it's obviously the least efficient way to create a zip bomb; like, what if the bot has more than 10 gigabytes of memory? Are you just going to create a bigger payload? How big are you willing to go? When you use this command and others like it, you're basically just hoping that your adversary has less memory than your payload expands to, which seems like a bad assumption to make.
The zip bombs that this command generates are also just not very good. Deflate, the data format that powers gzip and zlib, has a maximum compression ratio of 1032x, meaning that a 1 megabyte payload can only expand to a maximum of around 1 gigabyte. Luckily, HTTP allows us to apply multiple content encodings to our data, so we could create a 1 megabyte payload that expands to a 1 gigabyte payload, which expands again into a 1 terabyte file. At the extreme end, we could even create a 1.5 kilobyte payload that expands 34 times to create a googol (10^100) byte file.
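To make the stacking concrete, here's a rough sketch using Python's standard library (the sizes are scaled way down, and nothing here is tuned for the maximal ratios above; it just shows the mechanism, not how you'd build the googol-byte payload). Each call to gzip.compress corresponds to one "gzip" entry in the Content-Encoding header, and the client has to undo them one layer at a time:

```python
import gzip

# Scaled-down illustration: 64 MiB of zeroes wrapped in two gzip layers.
plaintext = b"\x00" * (64 * 1024 * 1024)

inner = gzip.compress(plaintext)  # first layer: roughly 1000x smaller
outer = gzip.compress(inner)      # second layer: the first layer's output is
                                  # itself repetitive, so it compresses again

print(f"on the wire:       {len(outer):,} bytes")
print(f"after one decode:  {len(inner):,} bytes")
print(f"after two decodes: {len(plaintext):,} bytes")

# Serve `outer` with the header:
#   Content-Encoding: gzip, gzip
```

A client that honors the header decodes the listed encodings in reverse order, so it peels off the outer layer first and finds another gzip stream waiting underneath.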