
We want to check whether a URL is down. Sometimes, however, the environment is down for maintenance for 3-4 hours, and we don't want to keep sending emails during that time.

I have written a shell script that checks the URLs; it runs every 30 minutes via a cron job and incorporates the requirements below. The actual requirements are:

1. Check if the URL is up. If it is down, send an email.
2. When cron runs the script again: if step 1 sent an email, send another email asking whether the environment is under maintenance.
3. When cron runs the script again: if the URL is still down, don't do anything.
4. Keep checking the URL; while it responds, don't do anything. If it goes down again, follow steps 1-3.

The script works. Could you please review it and suggest a nicer way to write it? I'm learning shell scripting and don't know all the available options.

#!/bin/bash
#Checking urls from urls.txt
MAddr="[email protected]"
TIME=`date +%d-%m-%Y_%H.%M.%S`
SCRIPT_LOC=/user/inf/ete4/eteabp4/eid_scripts/jsing002
for url in `awk '{print $1}' $SCRIPT_LOC/urls.txt`
do
    /usr/bin/wget -t 0 --spider --no-check-certificate $url > wget.output  2>&1
    HTTPCode=`(/usr/bin/wget -t 0 --spider --no-check-certificate $url) 2>&1 | grep HTTP| tail -1|cut -c 41-43`
        ENV=`(grep $url $SCRIPT_LOC/urls.txt | awk '{print $2}')`
        echo $HTTPCode
        E1=`/bin/grep -ise  'refused' -ise 'failed'  wget.output`
        if [ "$E1" != "" ] || [ $HTTPCode -ge 500 ]
            then
                    status="DOWN"
                    echo "Step 1"
                    echo "${ENV}""_DOWN"

                    if [ -f "${ENV}""_DOWN" ];
                        then
                            echo "step 2"
                            echo "Please check if $ENV in Maintanance window.The check for $url has failed twice.Next The next failure email will be sent if preceding test was SUCCESSFUL" | /bin/mail -s "Is $ENV in Maintanance Window ?" $MAddr
                            mv "${ENV}""_DOWN" "${ENV}""_DOWN""_2"
                            echo "Step 3"
                        elif [ -f "${ENV}""_DOWN""_2" ];
                            then
                                echo "this is elif statement"

                        else
                            echo "E1 is empty. Site is down"
                            echo "Site is down. $url is not accessible" | /bin/mail -s "$ENV is $status" $MAddr
                            touch  "${ENV}""_DOWN"
                        fi

            else    
                        if [ $HTTPCode -eq 200 ]
                            then
                                status="UP"
                                echo $status
                                rm "${ENV}""_DOWN""_2"
                        fi
        fi
done


Content of urls.txt:
http://mer01bmrim:30270/rim/web         E2E-RIMLITE4
http://mer01csmap:18001/console         ABP_WL-E2E1
http://mer02sitap:18051/console         ABP_WL-E2E2
http://mer03sitap:18101/console         ABP_WL_E2E3

1 Answer

  • Quoting: Be in the habit of always double-quoting your variables when you use them, e.g. "$url" instead of $url. Otherwise, nasty vulnerabilities could happen if a variable's value contains spaces or shell metacharacters. URLs, especially, often contain special characters such as & and ?.
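    As a quick demonstration of the point above (the URL here is made up for illustration): unquoted, `?` is treated as a glob pattern and `&` puts the command in the background, so only a quoted expansion keeps the URL intact.

    ```shell
    # Hedged demo: a URL containing '?' and '&' survives only when quoted.
    url='http://host/path?x=1&y=2'
    target="$url"               # always expand as "$url", never bare $url
    printf '%s\n' "$target"
    ```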
  • Structure: When processing input with multiple columns, one row at a time, the idiom to use is…

    while read url env ; do
        # Do stuff here
    done < "$SCRIPT_LOC/urls.txt"
    

    If the columns are delimited by something other than whitespace…

    while IFS=: read user pwhash uid gid gecos homedir shell ; do
        # Do stuff here
    done < /etc/passwd
    
  • Status of wget: You run wget twice for each URL. Instead of trying to get the HTTP status code, consider using just the exit status of wget to indicate success or failure.

    if wget -q -t 0 --spider --no-check-certificate "$url" ; then
        # Handle success
    else
        # Handle failure
    fi
    

    I've used the `-q` (`--quiet`) flag here. Also, note that `-t 0` means an infinite number of *retries* (`--tries=0`), which I think is a bad idea.

  • HTTP status interpretation: HTTP status codes other than 200 (e.g. 2xx or 3xx) could also indicate some kind of success.
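    One way to implement that (a minimal sketch; `http_ok` is a hypothetical helper, not part of the original script) is to classify any 2xx or 3xx code as "up":

    ```shell
    # Treat success (2xx) and redirect (3xx) status codes as "up";
    # everything else counts as "down".
    http_ok() {
        case $1 in
            2??|3??) return 0 ;;
            *)       return 1 ;;
        esac
    }
    ```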
  • Filesystem littering: You litter the current directory with temporary files wget.output and "${ENV}_DOWN" and "${ENV}_DOWN_2". Perhaps you could append all the state to a single log file.
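    For example, instead of `touch`-ing and renaming marker files, you could keep a per-environment failure count in one state file. This is only a sketch under that assumption; the file path and helper names are made up:

    ```shell
    # One state file holding "ENV COUNT" lines replaces the ENV_DOWN /
    # ENV_DOWN_2 marker files.
    STATE_FILE=/tmp/urlcheck.state

    get_fail_count() {
        grep "^$1 " "$STATE_FILE" 2>/dev/null | awk '{print $2}'
    }

    set_fail_count() {
        grep -v "^$1 " "$STATE_FILE" 2>/dev/null > "$STATE_FILE.tmp"
        echo "$1 $2" >> "$STATE_FILE.tmp"
        mv "$STATE_FILE.tmp" "$STATE_FILE"
    }
    ```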
  • SCRIPT_LOC could probably be computed using $(dirname "$0").
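    That is, something like the following (a sketch; the `cd … && pwd` wrapper just turns the result into an absolute path):

    ```shell
    # Derive the script's own directory at run time instead of
    # hard-coding an absolute path.
    SCRIPT_LOC=$(cd "$(dirname "$0")" && pwd)
    ```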
    
@200_success Thank you for replying. I'm going to adopt the quoting, `--quiet`, and filesystem-littering suggestions. As for the structure section: I'm using the second column to indicate the environment represented by each URL. That's why I use awk to get the URL, and then the second-column value in the email to show which environment is down. Is this a good way to implement it? –  user2950074 Nov 4 '13 at 15:56