In short: Yes. Go with the first example… The hash function can lose entropy if fed back into itself without re-adding the original data (I can’t seem to find a reference now, I’ll keep looking).
And for the record, I am in support of hashing multiple times.
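To illustrate the difference, here is a minimal sketch (my own illustration, not code from the question) contrasting the degenerate form, where each round hashes only the previous digest, with the preferred form that mixes the original password and salt back in each round:

```python
import hashlib

def stretch_naive(password: bytes, rounds: int) -> bytes:
    # Degenerate form: each round hashes only the previous digest.
    # Iterating a hash on its own output can drift toward a smaller
    # set of reachable values, losing entropy over many rounds.
    h = hashlib.sha256(password).digest()
    for _ in range(rounds):
        h = hashlib.sha256(h).digest()
    return h

def stretch_mixed(password: bytes, salt: bytes, rounds: int) -> bytes:
    # Preferred form: feed the original password and salt back in on
    # every round, so each iteration retains the full input entropy.
    h = hashlib.sha256(salt + password).digest()
    for _ in range(rounds):
        h = hashlib.sha256(h + password + salt).digest()
    return h
```

In practice you would use a vetted construction (PBKDF2, bcrypt, scrypt) rather than rolling this by hand; the sketch only shows why the first example in the question is the right shape.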
A hash that takes 500 ms to generate is not too slow for your server (considering that hashing is typically done on only a small fraction of requests). However, a hash that takes that long will significantly increase the time it takes to generate a rainbow table…
Yes, it does expose a DoS vulnerability, but it also prevents brute-force attacks (or at least makes them prohibitively slow). There is absolutely a tradeoff, but to some the benefits exceed the risks…
A reference (more of an overview) for the entire process: Key Strengthening
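Rather than hand-rolling the stretching loop, the standard route is PBKDF2, which Python exposes directly. A minimal sketch (function names and the iteration count are my own choices for illustration):

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 600_000):
    # A fresh random salt per password defeats precomputed tables.
    salt = os.urandom(16)
    dk = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, iterations)
    return salt, dk

def verify_password(password: str, salt: bytes, expected: bytes,
                    iterations: int = 600_000) -> bool:
    dk = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, iterations)
    # Constant-time comparison avoids leaking match length via timing.
    return hmac.compare_digest(dk, expected)
```

Store the salt and iteration count next to the derived key so the scheme can be verified (and strengthened) later.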
As for the degenerating collisions, the only source I could find so far is this discussion…
And some more discussion on the topic:
And a few more links:
- PBKDF2 on Wikipedia
- PBKDF2 Standard
- An email thread that’s applicable
- Just Hashing Is Far From Enough Blog Post
There are tons of results. If you want more, Google “hash stretching”.
… There’s tons of good information out there…