This issue was reported before, but no one in that thread seemed to realize how serious it is, and the thread is now closed to replies. There are three reasons I can think of why this is a very bad thing:
*The footer added to each image includes the time PHP took to serve the page. That number is different on every request, so every copy of an image that people download will have slightly different bytes and therefore a different hash. This makes automated duplicate searching within your own collection much more difficult, and any site people later upload to that does dupe checking won't be able to detect duplicate images that came from Minitokyo.
*The footer's presence means every image is being served through PHP, which is unnecessary and places a large extra processing burden on the server. For example, one image I saved says "43.982sec. (44% PHP, 56% SQL)", which is a considerable amount of processing time for a single image.
*The images can't be cached because they're dynamically generated, so the browser re-downloads the whole image when you right-click and choose Save As. That means Minitokyo spends double the bandwidth on every saved image.
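To illustrate the first point, here is a minimal sketch (in Python, with made-up placeholder bytes and footer strings, since I don't know Minitokyo's actual internals): whether the footer is drawn into the pixels or appended to the file, the serialized bytes differ between downloads, so byte-level hashes no longer match.

```python
import hashlib

# Hypothetical: the same image bytes plus two different timing footers.
# The exact footer text and image bytes here are invented for illustration.
image = b"\xff\xd8 original JPEG data \xff\xd9"
copy_a = image + b"0.412sec. (38% PHP, 62% SQL)"
copy_b = image + b"0.957sec. (41% PHP, 59% SQL)"

hash_a = hashlib.md5(copy_a).hexdigest()
hash_b = hashlib.md5(copy_b).hexdigest()

# The hashes differ, so a dupe checker sees two distinct files
# even though the underlying picture is identical.
print(hash_a == hash_b)
```

Any hash-based dupe checker (MD5 here, but the same holds for SHA-1 or anything else byte-exact) will treat each download as a unique file.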
I may be wrong about the last two points because I don't know how Minitokyo is set up technically (maybe it's retrieving the images from a database rather than a flat directory), but the first could still cause significant problems if it isn't fixed soon and lots of these corrupted images get out into the wild, so to speak.
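On the caching point, here is a rough sketch of why dynamic serving tends to double the bandwidth: a static file server answers a conditional GET with 304 Not Modified and sends no body, while a script that emits no cache validators forces a full re-download every time. (This is a simplified model, not Minitokyo's actual code.)

```python
# Simplified model of HTTP conditional requests: a server that sends an
# ETag can answer a revisit with 304 and an empty body.
def respond(request_headers, etag, body):
    if request_headers.get("If-None-Match") == etag:
        return 304, b""  # client already has it; send nothing
    return 200, body     # full download

image = b"...image bytes..."
etag = '"abc123"'  # hypothetical validator a static server would send

# First visit: full download. Revisit with the validator: 304, zero bytes.
status, payload = respond({}, etag, image)
status2, payload2 = respond({"If-None-Match": etag}, etag, image)
print(status, len(payload))
print(status2, len(payload2))
```

A PHP script that just streams the image and sets no ETag or Last-Modified header never gets to take the 304 path, so every view and every Save As is a full transfer.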