So standards-conformant web sites are mother-and-apple-pie kinda stuff. Everyone knows they're a good thing. The W3C has even been so helpful as to provide a way to validate anything you might have online.
Problem is, a lot of your stuff isn't online. Once you know it's valid, *then* it'll go online. Sure, the W3C validator will accept file uploads, but if you're doing any kind of dynamic content generation, you have to save the rendered output and upload that. Every time.
So yesterday (or was it Tuesday?) I hacked up a validator proxy. It's a TurboGears app, though it really only uses the CherryPy layer underneath. It pulls down a specified URL, which can be a LAN-only URL, shoves the markup up to the W3C validator, and returns a slightly-rewritten version of the validator's response. For an encore, I hacked up the W3C validation bookmarklet to teach it to use my proxy.
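The core of the idea fits in a few lines. Here's a minimal sketch in plain Python (no CherryPy plumbing), assuming the validator's classic `check` endpoint and its `fragment` POST field for submitting raw markup; the function names and the link-rewriting rule are mine for illustration, not the actual code:

```python
# Sketch of a validator proxy: fetch a (possibly LAN-only) page,
# submit its markup to the W3C validator, and rewrite the report's
# relative links so they still resolve when served from the proxy.
# Assumes the old validator.w3.org/check endpoint and its "fragment"
# parameter; adjust for whatever validator API you're actually using.
import urllib.request
import urllib.parse

VALIDATOR = "http://validator.w3.org/check"


def rewrite_links(html, base="http://validator.w3.org"):
    """Make the validator's relative hrefs/srcs absolute, so its CSS
    and images load even though the page is served from our proxy."""
    return (html.replace('href="./', 'href="%s/' % base)
                .replace('src="./', 'src="%s/' % base))


def validate(url):
    """Fetch url (LAN-only is fine, since *we* do the fetching),
    POST the markup to the validator, and return the rewritten report."""
    page = urllib.request.urlopen(url).read()
    data = urllib.parse.urlencode(
        {"fragment": page, "output": "html"}).encode()
    report = urllib.request.urlopen(VALIDATOR, data).read()
    return rewrite_links(report.decode("utf-8", "replace"))
```

The proxied-fetch step is the whole trick: the validator never needs to reach your machine, because the proxy hands it the rendered markup directly. The matching bookmarklet change is just pointing the bookmarklet's URL at the proxy instead of at validator.w3.org.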
Validation happens a lot more when it's easy. And it's so validating...