Bash command line and input limit
Is there some sort of character limit imposed in bash (or other shells) for how long an input can be? If so, what is that character limit?
That is, is it possible to write a command in bash that is too long for the command line to execute? If there is no hard limit, is there a suggested one?
Solution 1:[1]
The limit on the length of a command line is not imposed by the shell but by the operating system. It is usually on the order of a hundred kilobytes to a few megabytes. POSIX calls this limit ARG_MAX, and on POSIX-conformant systems you can query it with
$ getconf ARG_MAX # Get argument limit in bytes
E.g. on Cygwin this is 32000, and on the different BSDs and Linux systems I use it is anywhere from 131072 to 2621440.
If you need to process a list of files exceeding this limit, you might want to look at the xargs utility, which calls a program repeatedly with subsets of arguments that do not exceed ARG_MAX.
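A rough sketch of that pattern (the *.log glob and the use of rm here are only illustrative):

# Delete more files than a single command line could hold.
# printf is a shell builtin, so expanding *.log is not subject to ARG_MAX;
# xargs -0 then invokes rm in batches that fit within the limit.
printf '%s\0' *.log | xargs -0 rm --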
To answer your specific question: yes, it is possible to attempt to run a command with too long an argument list. The shell will fail with an error along the lines of "Argument list too long".
Note that the input to a program (as read on stdin or any other file descriptor) is not limited in this way (only by available program resources). So if your shell script reads a string into a variable, you are not restricted by ARG_MAX. The restriction also does not apply to shell builtins.
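For example (a rough sketch; the 10 MB figure is arbitrary and well above a typical ARG_MAX):

# Build a ~10 MB string and store it in a shell variable: no exec involved, so no ARG_MAX limit.
big=$(head -c 10000000 /dev/zero | tr '\0' 'x')
echo "${#big}"                # fine: echo is a builtin
/bin/echo "$big" > /dev/null  # fails: external command, "Argument list too long"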
Solution 2:[2]
Ok, Denizens. I have accepted the command-line length limits as gospel for quite some time. So, what to do with one's assumptions? Naturally: check them.
I have a Fedora 22 machine at my disposal (meaning: Linux with bash 4). I created a directory containing 500,000 files, each with an 18-character name, so the fully expanded command line is about 9,500,000 characters long. The files were created thus:
seq 1 500000 | while read -r digit; do
    # each generated name is 12 + 6 = 18 characters long
    touch "$(printf 'abigfilename%06d' "$digit")"
done
And we note:
$ getconf ARG_MAX
2097152
Note, however, that I can do this:
$ echo * > /dev/null
But this fails:
$ /bin/echo * > /dev/null
bash: /bin/echo: Argument list too long
I can run a for loop:
$ for f in *; do :; done
which is handled entirely by the shell itself, without exec.
Careful reading of the documentation for ARG_MAX states: "Maximum length of argument to the exec functions." This means that without calling exec, there is no ARG_MAX limitation, which explains why shell builtins are not restricted by ARG_MAX.
And indeed, I can ls my directory if my argument list is 109,948 files long, or about 2,089,000 characters (give or take). Once I add one more file with an 18-character name, though, I get an "Argument list too long" error. So ARG_MAX is working as advertised: the exec fails when the argument list exceeds ARG_MAX characters, including, it should be noted, the environment data.
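As a side check (assuming GNU xargs; --show-limits is a GNU extension not present in every xargs implementation), you can see how much of that space the environment already consumes:

$ xargs --show-limits < /dev/null   # reports ARG_MAX, the size of your environment, and the space left for arguments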
Solution 3:[3]
There is an input buffer limit of something like 1024 characters; read will simply hang mid-paste or mid-input. To work around this, use the -e option.
http://linuxcommand.org/lc3_man_pages/readh.html
-e use Readline to obtain the line in an interactive shell
Change your read to read -e and the annoying input hang goes away.
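A minimal sketch (the prompt string and variable name are only illustrative):

# Without -e, pasting a very long line here can hang mid-input;
# with -e, bash uses Readline and the paste is accepted normally.
read -e -p "Paste your value: " value
echo "Read ${#value} characters"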
Solution 4:[4]
In the old days, tcsh had a limit of 1024 characters per command line, which made things difficult if you had a very long $PATH. I was forced to rebuild a private version of tcsh with the buffer size increased so that users could have long $PATH settings. That was two decades ago, and it is when I gave up on tcsh and switched to zsh, which did not have that limitation. Now I just use plain old bash because it is good enough.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution   | Source       |
|------------|--------------|
| Solution 1 |              |
| Solution 2 | phuclv       |
| Solution 3 | Paul Kenjora |
| Solution 4 | Mark Lakata  |