[GRLUG] Argument list too long
Michael Mol
mikemol at gmail.com
Tue May 15 21:36:11 EDT 2007
On 5/15/07, Bob Kline <bob.kline at gmail.com> wrote:
> I find it common today to run a command
> like tar on a large number of files, only
> to have something come back like "argument
> list too long." The only command that does
> not do this is "ls".
>
> GNU used to have a policy of making every
> command relatively wide open. The size
> of things like argument lists would be limited
> only by the bit length of a processor word.
> And with today's 64-bit processors, the real
> limit would be the hard drive.
AFAIK, it's not a problem with the individual tools; it's a problem
with bash's maximum command-line size. The buffer can only hold so
many bytes. (I think it's 2K.)
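To make the failure mode concrete, it looks roughly like this (the
paths and directory here are made up for illustration):

    $ tar cf backup.tar /var/spool/news/*
    bash: /bin/tar: Argument list too long

The glob expands to more bytes of arguments than the limit allows, so
the command never even starts.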
You might try finding the buffer size in the bash source and
enlarging it. But be warned: the POSIX spec for xargs hints at
potential problems with that approach:
(quote)
The requirement that xargs never produces command lines such that
invocation of utility is within 2048 bytes of hitting the POSIX exec
{ARG_MAX} limitations is intended to guarantee that the invoked
utility has room to modify its environment variables and command line
arguments and still be able to invoke another utility. Note that the
minimum {ARG_MAX} allowed by the System Interfaces volume of IEEE Std
1003.1-2001 is 4096 bytes and the minimum value allowed by this volume
of IEEE Std 1003.1-2001 is 2048 bytes; therefore, the 2048 bytes
difference seems reasonable. Note, however, that xargs may never be
able to invoke a utility if the environment passed in to xargs comes
close to using {ARG_MAX} bytes.
(/quote)
http://www.opengroup.org/onlinepubs/000095399/utilities/xargs.html
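If you're curious what limit your own system reports, getconf can
show it. The number below is just an example value; it varies from
system to system:

    $ getconf ARG_MAX
    131072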
According to the same document, the -s option to xargs should help in
your situation: xargs packs as many lines of input as will fit into
each command it executes, so no single invocation blows past the
limit.
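Something along these lines ought to work. The directory is
hypothetical, and I'm using tar's append mode (r) so that each batch
xargs runs adds to the archive instead of overwriting it:

    $ find /path/to/files -type f -print | xargs tar rf backup.tar

If your file names can contain spaces or newlines, the GNU
-print0/-0 variants are safer:

    $ find /path/to/files -type f -print0 | xargs -0 tar rf backup.tar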
--
:wq