Unix & Linux Stack Exchange is a question and answer site for users of Linux, FreeBSD and other Un*x-like operating systems. It's 100% free, no registration required.

Suppose that I have three (or more) bash scripts: script1.sh, script2.sh, and script3.sh. I would like to call all three of these scripts and run them in parallel. One way to do this is to just execute the following commands:

nohup bash script1.sh &
nohup bash script2.sh &
nohup bash script3.sh &

(In general, the scripts may take several hours or days to finish, so I would like to use nohup so that they continue running even if my console closes.)

But, is there any way to execute those three commands in parallel with a single call?

I was thinking something like

nohup bash script{1..3}.sh &

but this appears to execute script1.sh, script2.sh, and script3.sh in sequence, not in parallel.

What does "single call" mean? –  jw013 Nov 21 '14 at 22:10
What is the use case? Do you have a million scripts to start? –  l0b0 Nov 21 '14 at 22:10
@jw013 I mean, something like a single short line command. If I have 100 scripts to start, I would like to be able to type something short (like nohup bash script{1..100}.sh & or for i in {1..100}; do nohup bash script{1..100} &; done), rather than typing nohup bash script*.sh & 100 different times. –  Andrew Nov 21 '14 at 22:11
In case the scripts have useful output: You can start them within screen, too (or tmux), in order to solve the console problem but keep access to the output (and input). –  Hauke Laging Nov 21 '14 at 22:12
There is nothing that prevents you from typing all 3 of those commands in the same line. nohup ... & nohup ... & nohup ... &. If you mean instead that you want to run all of the scripts without typing each script name individually, a simple loop will do it. –  jw013 Nov 21 '14 at 22:12

6 Answers

for ((i=1; i<=100; i++)); do nohup bash "script${i}.sh" & done
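For long-running jobs it is often worth giving each script its own log file as well. A self-contained sketch of the same loop pattern (the toy scripts and log-file names here are illustrative, not from the answer):

```shell
#!/usr/bin/env bash
# Create three toy scripts in a temp dir, launch them in the background
# with a per-script log file, and wait for all of them to finish.
tmpdir=$(mktemp -d)
for i in 1 2 3; do
    printf 'echo "script %s done"\n' "$i" > "$tmpdir/script$i.sh"
done
for ((i = 1; i <= 3; i++)); do
    bash "$tmpdir/script$i.sh" > "$tmpdir/script$i.log" 2>&1 &
done
wait                      # block until every background job has exited
result=$(cat "$tmpdir"/script*.log)
echo "$result"
```

In the real scenario you would prefix each invocation with nohup (or run the whole loop under nohup or screen) so the jobs survive the terminal closing.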
You're missing a close parenthesis. –  David Conrad Nov 22 '14 at 7:13
@HaukeLaging What if the script names are different? –  pythonlearner Jul 31 at 15:04

A single line solution:

$ nohup bash script1.sh & nohup bash script2.sh & nohup bash script3.sh &

Less facetiously, just use a wrapper script:

$ cat script.sh
#!/usr/bin/env bash
script1.sh &
script2.sh &
script3.sh &
$ nohup script.sh &

Or loop over them:

for script in dir/*.sh
do
    nohup bash "$script" &
done

A better way is to use GNU Parallel. It is simple to use and lets you control how many jobs run in parallel, with finer control over the jobs themselves.

In the command below, script{1..3}.sh is expanded by the shell, ls lists the matching files, and parallel reads their names as arguments for bash. Here -j0 tells parallel to run as many jobs simultaneously as possible; by default it runs one job per CPU core.

$ parallel -j0 bash :::: <(ls script{1..3}.sh)

And you can also try using

$ parallel -j0 bash ::: script{1..3}.sh

If the second form produces an error, the --tollef compatibility option is probably set in /etc/parallel/config; delete it and everything will work fine.

See the GNU Parallel man page for more options.
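For jobs that are heavy on CPU or I/O you usually want a cap rather than -j0. A hedged sketch of capping concurrency with -j (it assumes GNU parallel is installed and skips gracefully when it is not; the toy scripts are illustrative):

```shell
#!/usr/bin/env bash
# Run three toy scripts with at most 2 concurrent jobs (-j2).
command -v parallel >/dev/null 2>&1 || { echo "GNU parallel not installed"; exit 0; }
tmpdir=$(mktemp -d)
cd "$tmpdir" || exit 1
for i in 1 2 3; do
    printf 'echo "done %s"\n' "$i" > "script$i.sh"
done
parallel -j2 bash ::: script{1..3}.sh > out.txt
sort out.txt
```

With -j2, parallel starts script1.sh and script2.sh immediately and holds script3.sh until a slot frees up.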

And if you are running the jobs from a remote machine, it is better to use screen so that the session is not lost to network problems. nohup is not strictly necessary here: recent versions of bash ship with huponexit off, which prevents the parent shell from sending SIGHUP to its children when it exits. If huponexit happens to be set, unset it with

$ shopt -u huponexit
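You can inspect the current setting before deciding whether anything needs to change; shopt -q queries an option's state via its exit status:

```shell
#!/usr/bin/env bash
# Report whether huponexit is currently enabled in this bash instance.
# (In non-interactive shells it is off by default.)
if shopt -q huponexit; then
    setting="on"
else
    setting="off"
fi
echo "huponexit is $setting"
```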
If you are going to use bash as the shell parallel -j0 bash :::: <(ls script{1..3}.sh) can be reduced to parallel -j0 bash :::: script{1..3}.sh, no? –  1_CR Nov 22 '14 at 3:41
    
No, it's not. When '::::' is used, parallel treats the argument as a file containing the arguments to run on (here, the script names), not as arguments given directly on the command line. We are using process substitution to feed the script names through a file descriptor. –  Kannan Mohan Nov 22 '14 at 4:10
    
Er.. in that case why not parallel -j0 bash ::: script{1..3}.sh? –  1_CR Nov 22 '14 at 4:13
    
It's bash ::: script{1..3}.sh being passed to parallel, not ::: script{1..3}.sh. So the shell first expands this to parallel bash ::: script1.sh script2.sh script3.sh, and parallel then invokes bash script1.sh, bash script2.sh, and bash script3.sh in parallel. I tried it. –  1_CR Nov 22 '14 at 4:24
    
I think you are confused. The answer I posted uses :::: and not :::; the two have different uses. Read the parallel man page to learn more about them. –  Kannan Mohan Nov 22 '14 at 4:44

If you're looking to save yourself some typing effort:

eval "nohup bash "script{1..3}.sh" &"

Or on second thoughts, maybe not


We can also use xargs to run multiple scripts in parallel.

$ ls script{1..5}.sh|xargs -n 1 -P 0 bash

Here each script is passed to bash as a separate argument. -P 0 means that as many processes as possible run in parallel. It is also safer than using bash's default job-control feature (&).
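One caveat: piping the output of ls breaks on file names that contain spaces. A NUL-delimited variant avoids that; a self-contained sketch (the toy scripts and their names are made up for the demo):

```shell
#!/usr/bin/env bash
# Create two toy scripts whose names contain spaces, then run them in
# parallel via NUL-delimited xargs so the names survive word splitting.
tmpdir=$(mktemp -d)
cd "$tmpdir" || exit 1
for i in 1 2; do
    printf 'echo "ran %s"\n' "$i" > "my script $i.sh"
done
printf '%s\0' ./*.sh | xargs -0 -n 1 -P 2 bash > out.txt
result=$(sort out.txt)
echo "$result"
```

printf '%s\0' terminates each file name with a NUL byte, and xargs -0 splits on exactly those bytes, so spaces and newlines inside names are harmless.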


I suggest a much simpler utility I just wrote. It's currently called par, but it will soon be renamed to either parl or pll; I haven't decided yet.

https://github.com/k-bx/par

The API is as simple as:

par "script1.sh" "script2.sh" "script3.sh"
