Friday, 20 July 2012

Monitoring HTTP server logs on the command line.

I was after a quick and cheap method of monitoring our server logs. Various cloud solutions were mentioned, so I chucked together this quick shell script to get the data I want out of the logs while we investigate different, fuller options.

This script displays the pages that are server-erroring the most, ordered by number of errors.

grep " 500 " 2012-07-*.txt | cut -f 7 -d" " | cut -f 1 -d"?" | sed  's_/[0-9]*$_/{ID}_' | sort | uniq -c | sort -nr | sed  "s/\(.*api.*\)/`echo -e "\033"`[1m\1`echo -e "\033"`[0m/"



How it works:
grep " 500 " 2012-07-*.txt # Search for " 500 " in all the log files for July, 2012.
| cut -f 7 -d" "           # Take only field 7, where the URL lives.
| cut -f 1 -d"?"           # Strip the query string (everything after the ?).
| sed 's_/[0-9]*$_/{ID}_'  # Replace trailing numeric IDs with {ID}.
| sort | uniq -c           # Count occurrences of each URL.
| sort -nr                 # Sort by count, highest first.
| sed "s/\(.*api.*\)/`echo -e "\033"`[1m\1`echo -e "\033"`[0m/"
                           # Highlight (ANSI bold) each line that contains the string "api".
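To see the pipeline in action, here's a self-contained run against a few made-up log lines. The file name, IPs and paths below are invented for illustration; the real logs are assumed to be in combined-log-style format with the request path in field 7.

```shell
# Create a small sample log file (hypothetical data, for illustration only).
cat > 2012-07-01.txt <<'EOF'
1.2.3.4 - - [01/Jul/2012:10:00:00 +0000] "GET /api/users/42?page=1 HTTP/1.1" 500 123
1.2.3.4 - - [01/Jul/2012:10:00:01 +0000] "GET /api/users/97 HTTP/1.1" 500 234
1.2.3.4 - - [01/Jul/2012:10:00:02 +0000] "GET /home HTTP/1.1" 200 345
EOF

# The pipeline from above, minus the highlighting step:
grep " 500 " 2012-07-*.txt | cut -f 7 -d" " | cut -f 1 -d"?" \
  | sed 's_/[0-9]*$_/{ID}_' | sort | uniq -c | sort -nr
```

The two 500s on /api/users/42 and /api/users/97 collapse into a single /api/users/{ID} bucket with a count of 2, while the successful request to /home is filtered out.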

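A small aside on that last sed: the embedded `echo -e` command substitutions are just a way of getting a literal escape character into the expression. If your shell is bash, `$'...'` quoting does the same thing a little more readably; this is a sketch of an equivalent, not a change in behaviour.

```shell
# ANSI bold on/off sequences via bash's $'...' quoting (assumes bash and a
# terminal that understands ANSI escapes).
bold=$'\033[1m'
reset=$'\033[0m'

# Equivalent highlighting step, fed some sample counted output:
printf '%s\n' '  12 /api/users/{ID}' '   3 /home' \
  | sed "s/\(.*api.*\)/${bold}\1${reset}/"
```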