One exception to this that I know of is a dummy file. Take a 400 MB dummy file (containing all zeros) and compress it once at maximum compression with any compression application and you'll get around 100 KB to 200 KB. Now compress that compressed file and you should get a file only a few KB in size. I noticed this was possible when I looked at the compressed file in a hex editor and saw that it, too, contains long runs of repetitive characters (IIRC, null characters?).

You can't compress a compressed file any more; all you really do is add header/error-correction records, which makes the whole thing bigger.
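You can reproduce the effect in a few lines; here's a minimal sketch in Python using zlib (exact sizes vary by tool and settings; RAR or 7-Zip at max compression get closer to the 100-200 KB figure above, while DEFLATE tops out around 1032:1):

```python
import zlib

# 400 MB of zero bytes, like the dummy file described above
data = b"\x00" * (400 * 1024 * 1024)

once = zlib.compress(data, level=9)
twice = zlib.compress(once, level=9)

print(f"pass 1: {len(once):,} bytes")   # hundreds of KB with zlib
print(f"pass 2: {len(twice):,} bytes")  # a few KB -- pass 1's output is itself repetitive
```

The second pass works precisely because of the ratio cap: the first pass's output is a long run of near-identical back-reference tokens, which is itself highly compressible. Truly random-looking compressed data (the normal case) has no such structure left.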
That's where SFV comes in, i.e., you can tell instantly which of the zips is bad. But when you have 99 RARs of a solid archive (where everything is compressed as one continuous stream, so damage in one volume can ruin everything after it), sometimes you can only tell that the whole thing is bad, and it can be a pain in the ass.
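SFV itself is nothing exotic: a plain-text list of filenames and their CRC-32 checksums, one per line. A rough sketch in Python (the file and directory names are just placeholders):

```python
import os
import zlib

def crc32_of(path):
    """CRC-32 of a file, streamed in 64 KB chunks."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

def write_sfv(paths, sfv_path):
    """Write an SFV file: one 'name CRC32HEX' line per file."""
    with open(sfv_path, "w") as out:
        for p in paths:
            out.write(f"{os.path.basename(p)} {crc32_of(p):08X}\n")

def check_sfv(sfv_path, directory="."):
    """Report per-file pass/fail, so a single bad volume is spotted instantly."""
    with open(sfv_path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(";"):  # ';' starts an SFV comment line
                continue
            name, expected = line.rsplit(" ", 1)
            actual = crc32_of(os.path.join(directory, name))
            print(name, "OK" if actual == int(expected, 16) else "BAD")
```

Because each volume is checksummed independently, check_sfv pinpoints exactly which RAR to re-download, instead of leaving you to extract the whole solid set and hope.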