Monitor your website’s uptime using a bash script and a cronjob

Maintaining a handful of websites can be stressful; not knowing when one of those sites goes down is even worse.

There are services such as SiteUptime that will do this for you out of the box; however, there are efficient, free ways of doing it yourself.

Prerequisites:
1. You have permission to create and run bash (shell) scripts on your server.
2. You have permission to create and schedule cronjobs on your server.
3. You have permission to send emails from your server.
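
If you’re unsure about the last prerequisite, you can verify it with a quick test message (this assumes the mail command, typically provided by the mailx package, is installed; swap in your own address):

echo "This is a test" | mail -s "Test alert" you@email.com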

Ok, let’s get started.

1. Create a file on your server called monitorsites.sh (the location doesn’t matter, though I recommend keeping it outside of any publicly accessible folders).
2. In monitorsites.sh, paste the following:


#!/bin/bash

SITESFILE=sites.txt #list the sites you want to monitor in this file, one URL per line
EMAILS="you@email.com,someoneelse@email.com" #list of email addresses to receive alerts (comma separated)

while read -r site; do
    if [ -n "$site" ]; then

        #request only the response headers; a healthy site answers with "200 OK"
        CURL=$(curl -s --head "$site")

        if echo "$CURL" | grep -q "200 OK"
        then
            echo "The HTTP server on $site is up!"
        else
            MESSAGE="This is an alert that your site $site has failed to respond 200 OK."
            SUBJECT="$site (http) Failed"
            echo "$SUBJECT"

            #send the alert to every address in the comma-separated list
            for EMAIL in $(echo "$EMAILS" | tr "," " "); do
                echo "$MESSAGE" | mail -s "$SUBJECT" "$EMAIL"
                echo "Alert sent to $EMAIL"
            done
        fi
    fi
done < "$SITESFILE"

3. In the monitorsites.sh file, change the EMAILS variable to a comma-separated list of the email addresses you wish to send the notification to. For example: EMAILS="someone@gmail.com,anotherone@gmail.com"

4. Create a file named sites.txt in the same folder as monitorsites.sh. In this file, list the complete URLs you wish to monitor, one per line.


http://www.google.com
http://www.axertion.com
http://someotherwebsite.com

Make sure your domain names are exactly right, and double-check whether your URLs use the www or non-www form. A URL that redirects from one form to the other responds with a 301 rather than 200 OK, and the script will report it as down.
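
You can check each URL quickly with curl (example.com below is a placeholder):

curl -s --head http://www.example.com | head -n 1
curl -s --head http://example.com | head -n 1

The URL that prints HTTP/1.1 200 OK on the first line is the one the script should monitor.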

5. Test the script. Log in to your server using SSH and run the script with the following command:
bash monitorsites.sh
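
Alternatively, make the script executable once and run it directly:

chmod +x monitorsites.sh
./monitorsites.sh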

6. If done correctly, you will see something like this:


The HTTP server on http://www.google.com is up!
The HTTP server on http://www.axertion.com is up!
http://someotherwebsite.com (http) Failed
Alert sent to you@email.com
Alert sent to someoneelse@email.com

7. To have this script run automatically (say, every 5 minutes), you can set up a cronjob. You can read more on how to do this by reading HowTo: Add Jobs To cron Under Linux or UNIX.
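
As a minimal sketch, open your crontab with crontab -e and add an entry like the one below. The /path/to portion is a placeholder for wherever you saved the script, and the cd matters because the script looks for sites.txt relative to the directory it runs from:

*/5 * * * * cd /path/to && /bin/bash monitorsites.sh >/dev/null 2>&1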

Once set up, your cronjob will run at the interval you chose, and you will receive an email if one of your sites doesn’t respond.

Keep in mind that if your sites are hosted on the same server this script runs on, and the cause of the downtime is that server going down, you may never receive the alert email. The script will, however, catch a failed Apache (httpd) process, since it does not rely on that process to run.

10 Comments

  1. Steev

    Greetings,

    Scripts looks fine, but what if I want to see what error is showing when loading a site pointed in the sites.txt?

    Thanks

    • Axel (Author)

This isn’t a catch-all solution for logging errors. This script only checks whether the site is responding correctly and returning a 200 OK response.

  2. Steev

Another thing: how can I receive e-mails only if 403, 404, 500, or 503 errors persist?

    Thanks again

    • Axel (Author)

      You would have to change the script quite drastically to do that.

      You would first have to store the errors in a log, then trigger the email once a certain count has been reached. A bit more complex than the script provided.

      This script was meant to be simple and to the point: check whether the site is responding at certain intervals.
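
      As a rough sketch of that idea (the counter file, threshold, and URL below are placeholders), you could capture the numeric status code with curl’s -w option and only send mail after several consecutive failures:

      #!/bin/bash
      site="http://example.com" #placeholder URL
      COUNTFILE=/tmp/monitor_failcount
      THRESHOLD=3 #alert after this many consecutive failures

      #-o /dev/null discards the body; -w prints only the status code
      CODE=$(curl -s -o /dev/null -w "%{http_code}" "$site")

      case "$CODE" in
          403|404|500|503)
              COUNT=$(( $(cat "$COUNTFILE" 2>/dev/null || echo 0) + 1 ))
              echo "$COUNT" > "$COUNTFILE"
              if [ "$COUNT" -ge "$THRESHOLD" ]; then
                  echo "$site returned $CODE on $COUNT consecutive checks" | mail -s "$site persistent $CODE" you@email.com
              fi
              ;;
          *)
              echo 0 > "$COUNTFILE" #reset the counter on any other response
              ;;
      esac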

  3. KC

    We just added a few nested IFs for the other conditions and echo the HTTP code and a timestamp if it failed :)


  4. Anil

    great

  5. The script didn’t work for me because the ISP rejected curl --head requests.

    To fix, this line:
    > CURL=$(curl -s --head $site)
    became
    > CURL=$(curl -s $site)

    and
    > if echo $CURL | grep "200 OK" > /dev/null
    became
    > if echo $CURL | grep "some text that was on my page" > /dev/null

    If you still want to monitor multiple sites, use some text that is guaranteed to appear on your pages but also won’t appear on a failed page. Perhaps "" will work for you but I’ve not tested.

  6. … that post got filtered,

    “Perhaps <body> will work for you…”

  7. Thank you for the bash script,
    it’s very useful.

  8. Hi,

    Thanks for the post. Does this work for secured (https) sites too? I did not get a response when I provided an https site.

    Thanks,
    zen.
