Unix & Linux Stack Exchange is a question and answer site for users of Linux, FreeBSD and other Un*x-like operating systems.

First, I understand this is probably something better suited for cron. However, I don't have access to cron (shop rule around here), so I am using the next best scheduling option at my disposal.

This is the sequence:

I schedule a job to start using "at"

at 5:05 am tomorrow -f /opt/ecommerce/backup/analysis/Data/Scripts/DoDaily.sh

The script executes on time and runs fine up to the point where it should launch another shell script. I might mention, this works perfectly when I start the first script from a normal command prompt.
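Since at normally mails a job's output rather than showing it, one way to see what actually happens is to capture everything the job prints. A minimal sketch, with a made-up log path under /tmp (the DoDaily.sh path is from my script, commented out here so the sketch stands alone):

```shell
# Hedged sketch: wrap the at job so stdout and stderr land in a log file
# instead of the at spool mail. The log path and echo markers are
# assumptions for illustration.
LOG=/tmp/DoDaily.log
: > "$LOG"                       # truncate any previous run
{
  echo "started: $(date)"
  # /opt/ecommerce/backup/analysis/Data/Scripts/DoDaily.sh
  echo "finished: $(date)"
} >> "$LOG" 2>&1
cat "$LOG"
```

If the "finished" marker never appears, the log shows exactly where the job stopped.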

It might be that this is a limitation of using at as a scheduler (it will not launch other shell scripts within the shell started with at).

The second-to-last line (below) is the script that I am trying to call, but it seems to be ignored. Nothing shows up in my spool output saying it had some sort of error. I have tried running it by:

  1. prefixing it with nohup
  2. calling the script with a fully qualified path

If this sort of thing is not possible, I am OK with that too. If I can't figure something out, I can usually find an answer using this resource. There are some examples using at, but not for this specific scenario. I did check the man page for at.

Keep in mind -- This is not the "Pretty" final product. I usually troubleshoot to ensure things work as I "think" they should and then surround with error handling and additional comments.
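One difference worth ruling out: at jobs typically run under /bin/sh with the (possibly stripped-down) environment captured when the job was submitted, not an interactive login environment. A quick comparison sketch, assuming writable files under /tmp (the file names are my own):

```shell
# Hedged sketch: capture the environment the job actually sees, then
# diff it against an interactive shell's environment. In real use the
# first line would go inside the at job; here both run locally so the
# sketch is self-contained.
env | sort > /tmp/env.job
env | sort > /tmp/env.interactive
diff /tmp/env.job /tmp/env.interactive || true
```

Any lines the diff reports (PATH is the usual suspect) are candidates for why a script behaves differently under at.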


#!/bin/bash
##Clear out the data folder
##
##
cd /opt/ecommerce/backup/analysis/Data 
find . -maxdepth 1 -type f -exec rm {} \;
cd /opt/ecommerce/backup/analysis/Data/Scripts 
##
./DoABunch.sh BlaBla 1 1017531
./DoABunch.sh BlaBla 1 1020055
##
##
## Copy the edi data to the temp BCfiles folder on the QA interior
scp /opt/ecommerce/backup/analysis/Data/*.zip eXXXXX@xlqxxxxx:/tmp/BCfiles/.
##
## Create a daily folder and put all of the stuff in the daily folder for test artifacts
ssh exxxxx@xlqxxxx 'mkdir -p /opt/ecommerce/backup/analysis/Data/$(date '+%d-%b-%Y')'
ssh exxxxx@xlqxxxx 'chmod -R 2777 /opt/ecommerce/backup/analysis/Data/$(date '+%d-%b-%Y')'

##
## Copy everything gathered to the folder just created
scp /opt/ecommerce/backup/analysis/Data/*.zip e22013@xlqxxxxx:/opt/ecommerce/backup/analysis/Data/$(date '+%d-%b-%Y')/.
scp /opt/ecommerce/backup/analysis/Data/*.txt e22013@xlqxxxxx:/opt/ecommerce/backup/analysis/Data/$(date '+%d-%b-%Y')/.
ssh exxxxx@xlqxxxxx 'chmod -R 2777 /opt/ecommerce/backup/analysis/Data/$(date '+%d-%b-%Y')'
##
nohup /opt/ecommerce/backup/analysis/Data/Scripts/DoFuelDaily.sh &
##
exit;
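A note on the quoting in the ssh lines above: the inner '+%d-%b-%Y' actually closes and reopens the outer single quotes, so $(date ...) reaches the remote shell unexpanded and is evaluated there, while the scp lines evaluate it locally. A sketch of the double-quoted form, which expands the date once on the local side so both sides agree on the folder name (using /tmp as a stand-in for the remote path):

```shell
# Hedged sketch: with double quotes, $(date ...) expands on the local
# machine before the command is sent, so local scp targets and remote
# mkdir use the same folder name. /tmp/Data stands in for the real path.
folder="/tmp/Data/$(date '+%d-%b-%Y')"
mkdir -p "$folder"
test -d "$folder" && echo "created $folder"
```

In the real script this would be `ssh user@host "mkdir -p /opt/.../Data/$(date '+%d-%b-%Y')"` with double quotes around the remote command.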
Seems to me that this would be a strong case for requesting an exemption to your "shop rule". Your proposed workaround would be a maintenance nightmare. (But I hope you know that.) –  roaima Feb 23 at 15:49
Your reference to the "second to the last line" - are you referring to the nohup /opt/ecommerce/backup/analysis/Data/Scripts/DoFuelDaily.sh & ? –  roaima Feb 23 at 15:51
@wurtel the nohup will trap and ignore (almost) all signals. I'd suggest it's good practice when the controlling script/program exits while the backgrounded job is intended to continue running. –  roaima Feb 23 at 15:55
You're using at to run the script in the background anyway! Run it in "the foreground" i.e. no nohup and no &. Let at take care of waiting for it etc. –  wurtel Feb 23 at 15:58
You could try exec instead of nohup for the final script call if you're not doing anything after that. Or make another call to at (with "now" as the time). –  MattBianco Feb 23 at 16:05
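The exec suggestion can be sketched with a throwaway command standing in for DoFuelDaily.sh: exec replaces the running shell with the final command, so nothing after it executes and there is no backgrounding for at to lose track of.

```shell
# Hedged sketch: 'echo "final script runs"' stands in for DoFuelDaily.sh.
# exec replaces the shell, so the never-reached line does not run.
out=$(sh -c 'echo before; exec echo "final script runs"; echo never-reached')
printf '%s\n' "$out"
```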
