The Internet Storm Center has always advocated spending time monitoring and reviewing your logs, whether they come from your personal systems or the ones at the office. Logs should be full of useful information, so becoming accustomed to normal events helps you quickly identify the outliers that can mean something worth investigating is going on. One key point to note: you, not the factory defaults, have to make the information in the logs actually useful.
When I first set up a blog and looked through what it logged, I found little of help or use in determining the threats my little spot on the internet faced. That was the default, out-of-the-box setup, and millions of blogs are configured to tell their owners little or no information about the bad stuff*. A quick bit of searching pulled up a number of plugins to log the details of all activity - IP address, action, user agent, etc. - and neatly drop them into a database. As the site uses a well-known blog engine that's had more than its fair share of troubles, it receives a healthy dose of attention from the automated denizens of the Internet. With the plugins it's reasonably easy to split these net denizens into good automatons and downright bad ones. The good-ish connections are from search engine bots, all doing relatively benign indexing of the site; then we have the evil flying monkey bots: spammers and automated attacks.
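As a rough sketch of that good/bad split, the user-agent field alone goes a long way. The bot names below are real, well-known crawler strings, but the list and the `classify` helper are illustrative, not from any particular plugin:

```python
# Split logged visitors into known search-engine bots and everything else,
# based on user-agent substrings. The bot list is illustrative, not exhaustive.
KNOWN_GOOD_BOTS = ("Googlebot", "bingbot", "Slurp", "DuckDuckBot")

def classify(user_agent: str) -> str:
    """Label a user-agent string as a known search bot or 'other'."""
    if any(bot in user_agent for bot in KNOWN_GOOD_BOTS):
        return "search-bot"
    return "other"

print(classify("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
print(classify("sqlmap/1.0"))
```

Anything that falls into the "other" bucket but behaves like an automaton is a candidate for a closer look.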
With the right information now being logged, automated attacks are very easy to spot, especially the password-guessing attacks on default accounts. This immediately provides a list of IP addresses known to be used with malicious intent. The spam entries have to be treated as bad IP addresses too: even though many consider spam just an annoyance, these systems clearly aren't playing nicely and are most likely being directed to post messages without their owners' consent.
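Turning those logged entries into a block list can be as simple as counting failures per source address. The log format and the `login-failed` action name below are hypothetical stand-ins for whatever your logging plugin actually records:

```python
import re
from collections import Counter

# Hypothetical log lines in the shape described above: IP, action, account,
# user agent. Adjust the regex to match your own plugin's schema.
LOG = """\
203.0.113.5 login-failed admin "Mozilla/5.0"
203.0.113.5 login-failed admin "Mozilla/5.0"
203.0.113.5 login-failed admin "Mozilla/5.0"
198.51.100.7 login-ok editor "Mozilla/5.0"
"""

failed = Counter()
for line in LOG.splitlines():
    m = re.match(r"(\S+) login-failed", line)
    if m:
        failed[m.group(1)] += 1

# Repeated failures against accounts like 'admin' go straight on the bad list.
THRESHOLD = 3
bad_ips = sorted(ip for ip, n in failed.items() if n >= THRESHOLD)
print(bad_ips)  # ['203.0.113.5']
```

The threshold is a judgment call; password-guessing bots usually make it an easy one by hammering the same account hundreds of times.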
These two lists of IP addresses can be added to one big block list in the site's .htaccess file, or further broken down into sub-types and groups for more detailed profiling if you're that way inclined - two examples of proactive action.
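A minimal .htaccess block list might look like the following. The addresses are placeholders, and note this is the Apache 2.2-era syntax; Apache 2.4 replaced it with the `Require` directives:

```apache
# Deny known-bad IP addresses at the web server (Apache 2.0/2.2 syntax).
# On Apache 2.4, use "Require all granted" with "Require not ip ..." instead.
Order Allow,Deny
Allow from all
Deny from 203.0.113.5
Deny from 198.51.100.0/24
```

Keeping the list in one place makes it easy to regenerate from your logs on a schedule.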
Logs can be a rich source of data for many reasons, but only if configured, tweaked and then parsed. If you still don't think it's worth the time to read logs and pull valuable data out of them, read any of the technical details on the major breaches in the press. Password guessing, command injection and SQL injection attacks are incredibly easy to spot - if the right data is logged and you're looking for it.
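As a sketch of how easy "easy to spot" can be, a handful of patterns matched against decoded request lines will surface the obvious SQL injection and directory-traversal probes. The patterns and the `looks_malicious` helper are illustrative; real detection needs more care around false positives and encoding tricks:

```python
import re
from urllib.parse import unquote_plus

# Crude, illustrative signatures for common injection and traversal probes.
SUSPICIOUS = re.compile(r"union\s+select|\.\./|/etc/passwd|;\s*cat\s", re.IGNORECASE)

def looks_malicious(request_line: str) -> bool:
    """Decode a logged request line and check it against the pattern list."""
    return bool(SUSPICIOUS.search(unquote_plus(request_line)))

print(looks_malicious("GET /index.php?id=1 HTTP/1.1"))                           # False
print(looks_malicious("GET /index.php?id=1+UNION+SELECT+user,pass-- HTTP/1.1"))  # True
print(looks_malicious("GET /page?file=../../etc/passwd HTTP/1.1"))               # True
```

Run something like this over a day's access log and the attack traffic tends to jump out immediately.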
Example of what can be logged: http://httpd.apache.org/docs/2.0/mod/mod_log_config.html
Example of some of the more common search bot strings: http://user-agent-string.info/list-of-ua/bots
*Now if you have permission to read the access log - the log which shows details of every connection to your site - that's a different story, but whether you get that access is normally up to the hosting provider.
Chris Mohan --- Internet Storm Center Handler on Duty