
Good day,

I am writing a relatively simple Bash script that runs an svn up command, captures the console output, and then does some post-processing on the text.

For example:

#!/bin/bash
# A script to alter SVN logs a bit

# Update and get output
echo "Waiting for update command to complete..."
TEST_TEXT=$(svn up --set-depth infinity)
echo "Done"

# Count number of lines in output and report it
NUM_LINES=$(echo $TEST_TEXT | grep -c '.*')
echo "Number of lines in output log: $NUM_LINES"

# Print out only lines containing Makefile
echo $TEST_TEXT | grep Makefile

This works as expected (i.e. as commented in the code above), but I am concerned about what would happen if I ran this on a very large repository. Is there a limit on the maximum buffer size Bash can use to hold the output of a console command?

I have looked for similar questions, but nothing quite like what I'm searching for. I've read up on how certain scripts need to use xargs in cases of large intermediate buffers, and I'm wondering if something similar applies here with respect to capturing console output.

e.g.:

# Might fail if we have a LOT of results
find -iname *.cpp | rm

# Shouldn't fail, regardless of number of results
find -iname *.cpp | xargs rm
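(Edit: as a commenter points out, rm doesn't read file names from stdin, and the unquoted *.cpp can be expanded by the shell before find ever sees it. A safer form of the second command, demonstrated here in a scratch directory so nothing real is deleted:)

```shell
# Scratch directory so the rm is harmless to run.
demo=$(mktemp -d)
touch "$demo/a.cpp" "$demo/b.cpp" "$demo/keep.h"

# Quote the pattern, and use -print0/-0 so filenames with
# spaces or newlines survive the pipe.
find "$demo" -iname '*.cpp' -print0 | xargs -0 rm -f

ls "$demo"        # only keep.h should remain
rm -rf "$demo"
```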

Thank you.

possible duplicate of What is the maximum size of an environment variable value? – dogbane May 22 '12 at 16:57

Why don't you use wc -l instead of grep -c '.*'? Also, you should quote your variables: echo "$TEST_TEXT" in both places. – Dennis Williamson May 22 '12 at 17:27

find -iname *.cpp | rm won't work because rm doesn't take files from stdin, but why do you think it "Might fail if we have a LOT of results"? – Kevin May 22 '12 at 17:32

Using

var=$(hexdump /dev/urandom | tee out)

bash didn't complain; I killed it at a bit over 1G and 23.5M lines. You don't need to worry as long as your output fits in your system's memory.
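For a bounded, reproducible version of the same experiment (the 10 MB figure is arbitrary), something like this runs to completion rather than needing to be killed:

```shell
# Capture ~10 MB of data into a shell variable, then confirm
# bash is holding all of it by checking the variable's length.
var=$(head -c 10000000 /dev/zero | tr '\0' 'x')
echo "${#var}"
```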


I see no reason not to use a temporary file here.

tmp_file=$(mktemp XXXXX)

svn up --set-depth=infinity > "$tmp_file"
echo "Done"

# Count number of lines in output and report it
NUM_LINES=$(wc -l < "$tmp_file")
echo "Number of lines in output log: $NUM_LINES"

# Print out only lines containing Makefile
grep Makefile "$tmp_file"

rm "$tmp_file"
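If the script can die partway through, the rm at the end never runs. A trap-based variant guarantees cleanup on any exit path (the printf below is only a stand-in for the svn call, which I can't run here):

```shell
#!/bin/bash
tmp_file=$(mktemp) || exit 1
trap 'rm -f "$tmp_file"' EXIT    # removes the temp file on any exit, error or not

# Stand-in for: svn up --set-depth=infinity > "$tmp_file"
printf 'Updating Makefile\nA  src/main.cpp\n' > "$tmp_file"

# Redirecting into wc avoids it printing the filename after the count.
NUM_LINES=$(wc -l < "$tmp_file")
echo "Number of lines in output log: $NUM_LINES"

grep Makefile "$tmp_file"
```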
    
Fair enough. +1 I can't write to the filesystem (ro), which prevents me from doing this. However I didn't specify this in TFQ, and it's valid given the details I provided. Thanks! – Dogbert May 22 '12 at 19:31
