I'd say your first option should be to enable moderation, and to make it very clear to posters that all posts are moderated, so spammers know their links are never going to go live.
However, some of these spammers hire cheap labour from countries where English is not everyone's first language, so the people actually doing the posting may not understand such a notice and will submit anyway (they may not even understand why they're doing it, so the logic is lost on them).
Another option to consider is to rename your web form and make the link to it less robot-friendly (use JavaScript to generate the link if absolutely necessary). Also add the form to your robots.txt as an excluded page (or put it in an excluded directory).
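For example, a minimal robots.txt entry along these lines keeps well-behaved crawlers away from the form (the /forms/ path is just a placeholder for wherever your form actually lives):

```
User-agent: *
Disallow: /forms/
```

Bear in mind robots.txt only deters crawlers that choose to honour it; a determined spambot will ignore it, so treat this as one layer rather than a fix.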
Something as simple as a checkbox on the submission page saying "I verify that I am not a spam bot" (with the field named something obscure) would probably stop most robots, since a bot blindly filling in fields has no reason to tick it.
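A rough sketch of the server-side check, in Python; the field name here is made up, so pick your own obscure one and change it now and then:

```python
def passes_human_check(form: dict) -> bool:
    """Accept the submission only if the obscurely named 'not a spam bot'
    checkbox was ticked. 'xq_confirm_17' is a hypothetical field name."""
    # Browsers submit "on" for a ticked checkbox unless a value attribute is set.
    return form.get("xq_confirm_17") == "on"

# Example: the submitted fields as a plain dict
if not passes_human_check({"name": "Bob", "message": "hi"}):
    print("Dropping submission: checkbox not ticked")
```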
Another possibility, if you're seeing lots of email headers in the messages and the script sends email, is that they're actually exploiting (or probing for) an email header injection vulnerability in it.
If that's what's happening, the fix is easy: disallow newline characters in any field that ends up in an email header (which you should be doing already; I use a standard email function across my application that guarantees this can't happen anywhere).
Since such attempts are easy to detect, you can spot them and simply refuse to add those submissions to your database.
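A minimal sketch of that kind of check in Python (the function names are mine; the idea is simply to reject any header-bound field containing CR or LF before the message is built or the submission is stored):

```python
import re
from email.message import EmailMessage

HEADER_BREAK = re.compile(r"[\r\n]")

def is_injection_attempt(value: str) -> bool:
    """Newlines in a header-bound field almost always mean someone is
    trying to smuggle in extra headers (Bcc:, To:, and so on)."""
    return bool(HEADER_BREAK.search(value))

def build_message(sender: str, subject: str, body: str) -> EmailMessage:
    """Build the notification email, refusing any header field that
    contains newlines so the caller can drop (and log) the submission
    instead of storing or sending it."""
    for name, value in (("From", sender), ("Subject", subject)):
        if is_injection_attempt(value):
            raise ValueError(f"Header injection attempt in {name} field")
    msg = EmailMessage()
    msg["From"] = sender
    msg["Subject"] = subject
    msg.set_content(body)  # the body may contain newlines; headers may not
    return msg
```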
What appears to happen is that a human guides the robot initially, building some kind of database of vulnerable URLs, and later a spambot just spams them en masse.
If you can block them by any other means, consider doing so. If the problem is really serious, try dumping the entire request data to a log file and look for any headers you might be able to use to block their requests.
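If your form runs behind a WSGI-capable server, a sketch along these lines would capture each request (the log filename is just a placeholder):

```python
import io
from datetime import datetime, timezone

def log_requests(app, logfile="form_requests.log"):
    """WSGI middleware that appends each request's headers and body to a
    log file, so you can hunt for a User-Agent, Referer, or other header
    that reliably identifies the spambot."""
    def wrapper(environ, start_response):
        length = int(environ.get("CONTENT_LENGTH") or 0)
        body = environ["wsgi.input"].read(length) if length else b""
        with open(logfile, "a") as fh:
            fh.write(f"--- {datetime.now(timezone.utc).isoformat()} ---\n")
            for key in sorted(environ):
                if key.startswith("HTTP_") or key in ("REMOTE_ADDR", "REQUEST_METHOD", "QUERY_STRING"):
                    fh.write(f"{key}: {environ[key]}\n")
            fh.write(body.decode("utf-8", errors="replace") + "\n")
        environ["wsgi.input"] = io.BytesIO(body)  # hand the body back to the app
        return app(environ, start_response)
    return wrapper
```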
Mark