Unix & Linux Stack Exchange is a question and answer site for users of Linux, FreeBSD and other Un*x-like operating systems.

I have a bash script which calls the same perl script several times in serial. The bash script collects the overall results, while the perl script gathers the results of my simulations for the given attributes.

The bash script looks as follows:

mkdir ./results/csv     && \
../perlscripts/v2csv.pl -v -F reach results/Heterogeneous*.vec > ./results/csv/reach.csv
../perlscripts/v2csv.pl -v -F roundTrip results/Heterogeneous*.vec > ./results/csv/RT.csv
../perlscripts/v2csv.pl -v -F downlink results/Heterogeneous*.vec > ./results/csv/DL.csv
../perlscripts/v2csv.pl -v -F clusters results/Heterogeneous*.vec > ./results/csv/clusters.csv

Collecting the results by calling the perl script one invocation at a time takes really long, so I am looking for a way to run the different variations of the perl script from within the bash script in parallel. Is there a way to achieve this in bash?

Just to clarify, I don't want the commands which call the perl script to depend on each other in any way. I want all of them to start at the same point in time, as if I had 4 separate bash terminals each executing one of these commands.

Similar: http://stackoverflow.com/questions/15644991/running-several-scripts-in-parallel-bash-script


3 Answers

If you have GNU parallel installed, you could make a script containing just the commands, e.g.:

../perlscripts/v2csv.pl -v -F reach results/Heterogeneous*.vec > ./results/csv/reach.csv
../perlscripts/v2csv.pl -v -F roundTrip results/Heterogeneous*.vec > ./results/csv/RT.csv
../perlscripts/v2csv.pl -v -F downlink results/Heterogeneous*.vec > ./results/csv/DL.csv
../perlscripts/v2csv.pl -v -F clusters results/Heterogeneous*.vec > ./results/csv/clusters.csv

and then run them in parallel (`::::` tells parallel to read its input lines from the file, and with no command given each line is run as a command):

mkdir ./results/csv && parallel :::: myscript.sh

Alternatively, invoke the command directly, using {} (the default replacement string):

mkdir ./results/csv && parallel ../perlscripts/v2csv.pl -v -F {} \
results/Heterogeneous*.vec '>' ./results/csv/{}.csv ::: reach roundTrip downlink clusters

would run the following commands in parallel:

../perlscripts/v2csv.pl -v -F reach results/Heterogeneous*.vec > ./results/csv/reach.csv
../perlscripts/v2csv.pl -v -F roundTrip results/Heterogeneous*.vec > ./results/csv/roundTrip.csv
../perlscripts/v2csv.pl -v -F downlink results/Heterogeneous*.vec > ./results/csv/downlink.csv
../perlscripts/v2csv.pl -v -F clusters results/Heterogeneous*.vec > ./results/csv/clusters.csv
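If GNU parallel isn't installed, `xargs -P` gives a similar fan-out. A minimal self-contained sketch, with `echo` standing in for `v2csv.pl` and a temporary directory in place of `./results/csv` (both are stand-ins, not the original commands):

```shell
#!/bin/sh
# Fan out one job per attribute, up to 4 at a time, via xargs -P.
# 'echo' is a stand-in for ../perlscripts/v2csv.pl here.
dir=$(mktemp -d)
printf '%s\n' reach roundTrip downlink clusters |
  xargs -P 4 -I {} sh -c 'echo "processed {}" > "$1/{}.csv"' _ "$dir"
ls "$dir"                 # four .csv files now exist
cat "$dir/reach.csv"      # -> processed reach
```

`-I {}` substitutes each input line into the command, and `-P 4` runs up to four of them concurrently.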
Append `&` to each command to run it in the background:

../perlscripts/v2csv.pl -v -F reach results/Heterogeneous*.vec > ./results/csv/reach.csv &
../perlscripts/v2csv.pl -v -F roundTrip results/Heterogeneous*.vec > ./results/csv/RT.csv &
../perlscripts/v2csv.pl -v -F downlink results/Heterogeneous*.vec > ./results/csv/DL.csv &
../perlscripts/v2csv.pl -v -F clusters results/Heterogeneous*.vec > ./results/csv/clusters.csv &
wait

The & puts each command in the background; wait blocks until they have all finished, if you need that.
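The same pattern in a self-contained sketch (with `sleep`/`echo` brace groups as stand-ins for the perl invocations; the directory and file names are illustrative):

```shell
#!/bin/bash
# Start two stand-in jobs in the background, then wait for each
# one and collect its exit status.
dir=$(mktemp -d)
{ sleep 0.2; echo reach    > "$dir/reach.csv";    } &
pid1=$!
{ sleep 0.1; echo clusters > "$dir/clusters.csv"; } &
pid2=$!
wait "$pid1"; status1=$?
wait "$pid2"; status2=$?
echo "exit statuses: $status1 $status2"   # -> exit statuses: 0 0
```

Saving `$!` (the PID of the most recent background job) and passing it to `wait` also lets you recover each job's exit status, which a bare `wait` does not give you.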

what would happen if there was no wait? – cross Mar 17 at 19:51

@cross without the wait, the script wouldn't wait for all the scripts to finish before it itself ended. – roaima Mar 17 at 19:53

I'm not sure. I think the script would just end and the processes would keep running in the background. If that is what you want, precede each command with nohup to be sure. – Robert Jacobs Mar 17 at 19:54

You can try the following syntax:

mkdir ./results/csv && (script0 & script1 &)

This will run the scripts in the background, not waiting for them to finish. The parentheses introduce a subshell group (so that no script will be run if the mkdir command fails) and the & requests background execution (returning control to the outer shell right away).
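A self-contained sketch of that gating (with `echo` commands as stand-ins for the real scripts; the temporary directory and `sleep` are only there to make the example observable):

```shell
#!/bin/sh
# The subshell group (...) only runs if mkdir succeeds; both
# commands inside it start in the background immediately, and
# the outer shell regains control right away.
dir=$(mktemp -d)/csv
mkdir "$dir" && (echo one > "$dir/a.txt" & echo two > "$dir/b.txt" &)
sleep 1     # crude: give the detached jobs time to finish
cat "$dir/a.txt"    # -> one
```

If `mkdir` fails (e.g. the directory already exists), the `&&` short-circuits and neither background job is started.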

how does it differ from the solution given by Robert Jacobs? Asking just for understanding purposes. – cross Mar 17 at 19:50

It's hardly different from the other answer... there was no answer posted when I started typing mine. – dhag Mar 17 at 19:51

but you don't have the wait and he does not use parentheses – cross Mar 17 at 20:13

That's right; the parentheses are for grouping, so that none of the scripts run if mkdir fails (one could avoid spawning a sub-shell here, though it's unlikely to be a performance killer if your scripts take long enough that you care about running them in parallel). Without the call to wait, this script will return almost instantly, letting the scripts run in the background. This is a matter of preference: if you want to be blocked until all jobs have completed, use wait; otherwise you can use jobs to see if your scripts are still running. – dhag Mar 17 at 20:44
