I am writing a very simple script that calls another script, and I need to propagate the parameters from my current script to the script I am executing.
For instance, my script is named foo.sh and it calls bar.sh.
foo.sh:
bar $1 $2 $3 $4
How can I do this without explicitly specifying each parameter?
Use "$@"
instead of plain $@
if you actually wish your parameters to be passed the same.
Observe:
$ cat no_quotes.sh
#!/bin/bash
echo_args.sh $@
$ cat quotes.sh
#!/bin/bash
echo_args.sh "$@"
$ cat echo_args.sh
#!/bin/bash
echo Received: $1
echo Received: $2
echo Received: $3
echo Received: $4
$ ./no_quotes.sh first second
Received: first
Received: second
Received:
Received:
$ ./no_quotes.sh "one quoted arg"
Received: one
Received: quoted
Received: arg
Received:
$ ./quotes.sh first second
Received: first
Received: second
Received:
Received:
$ ./quotes.sh "one quoted arg"
Received: one quoted arg
Received:
Received:
Received:
For bash and other Bourne-like shells:
java com.myserver.Program "$@"
$argv:q
will work in some csh variants.
exec java com.myserver.Program "$@"
This causes bash to exec into java rather than waiting around for it to complete, so you use one less process slot. Also, if the parent process (which ran your script) is watching it via the PID and expecting it to be the 'java' process, some unusual things could break if you don't use exec; the exec makes java inherit the same PID.
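If you want to see the PID behavior for yourself, here is a tiny sketch; sleep simply stands in for the java command used in this answer:
#!/bin/sh
echo "wrapper pid: $$"
# After exec, `ps -p` on that PID shows sleep, not this script:
exec sleep 30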
args=("$@")
and expand each element as a separate shell “word” (akin to "$@"
) with "${args[@]}"
.
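A minimal sketch of that pattern (args and child.sh are just placeholder names):
#!/bin/bash
# Save the positional parameters before anything overwrites them
# (e.g. a later `set --` or a function call with its own arguments).
args=("$@")
# Each element expands as its own word, spaces and all.
./child.sh "${args[@]}"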
"$@"
, like will it fail if you have escaped spaces in an argument, or null characters, or other special characters?
Use "$@"
(works for all POSIX compatibles).
[...] , bash features the "$@" variable, which expands to all command-line parameters separated by spaces.
From Bash by example.
Note the distinction from $*. I believe there is a historical progression here: $* did not work as designed, so $@ was invented to replace it; but the quoting rules being what they are, the double quotes around it are still required (or it will revert to the broken $* semantics).
echo "$@"
as ./script.sh a "b c" d
then you just get a b c d
instead of a "b c" d
, which is very much different.
echo
receives three arguments: "a" "b c" "d"
(then the shell joins them together as part of its string expansion). But if you'd used for i in "$@"; do echo $i; done
You'd have gotten a⏎b c⏎d
.
I realize this has been well answered, but here's a comparison between "$@", $@, "$*" and $*.
Contents of test script:
# cat ./test.sh
#!/usr/bin/env bash
echo "================================="
echo "Quoted DOLLAR-AT"
for ARG in "$@"; do
echo $ARG
done
echo "================================="
echo "NOT Quoted DOLLAR-AT"
for ARG in $@; do
echo $ARG
done
echo "================================="
echo "Quoted DOLLAR-STAR"
for ARG in "$*"; do
echo $ARG
done
echo "================================="
echo "NOT Quoted DOLLAR-STAR"
for ARG in $*; do
echo $ARG
done
echo "================================="
Now, run the test script with various arguments:
# ./test.sh "arg with space one" "arg2" arg3
=================================
Quoted DOLLAR-AT
arg with space one
arg2
arg3
=================================
NOT Quoted DOLLAR-AT
arg
with
space
one
arg2
arg3
=================================
Quoted DOLLAR-STAR
arg with space one arg2 arg3
=================================
NOT Quoted DOLLAR-STAR
arg
with
space
one
arg2
arg3
=================================
A lot of answers here recommend $@ or $*, with and without quotes, but none seems to explain what these really do and why you should do it that way. So let me steal this excellent summary from this answer:
+--------+---------------------------+
| Syntax | Effective result |
+--------+---------------------------+
| $* | $1 $2 $3 ... ${N} |
+--------+---------------------------+
| $@ | $1 $2 $3 ... ${N} |
+--------+---------------------------+
| "$*" | "$1c$2c$3c...c${N}" |
+--------+---------------------------+
| "$@" | "$1" "$2" "$3" ... "${N}" |
+--------+---------------------------+
Notice that the quotes make all the difference; without them, both behave identically.
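To make the c in the "$*" row concrete: it is the first character of IFS (a space by default). A quick sketch, assuming bash:
#!/bin/bash
set -- "arg with space one" arg2 arg3   # fake three positional parameters
IFS=,
echo "$*"    # arg with space one,arg2,arg3  (joined with the first char of IFS)
echo "$@"    # arg with space one arg2 arg3  (three separate words; echo adds the spaces)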
For my purposes, I needed to pass parameters from one script to another as-is, and for that the best option is:
# file: parent.sh
# we have some params passed to parent.sh
# which we will like to pass on to child.sh as-is
./child.sh $*
Notice there are no quotes; $@ should work just as well in the situation above.
A handy way to see what actually gets passed is env --debug: e.g. put env --debug echo "$*" inside a function and try executing it with different arguments.
$* passes them not as-is; maybe it depends on how you define "as-is". Example: if you call ./parent.sh 'a b' c, then in the parent script $1 evaluates to a b, but in the child script $1 evaluates to only a. So this is not what I expect; I expect both scripts to "see" the same arguments, and that only works with ./child.sh "$@".
#!/usr/bin/env bash
while [ "$1" != "" ]; do
echo "Received: ${1}" && shift;
done;
Just thought this might be a bit more useful when trying to test how args come into your script.
This loop stops at the first empty argument: passing "" or '' as an argument ends it early, and if there were no args at all it is silent. I tried to fix this, but it needs a loop and a counter based on $#. I just added this at the end: echo "End of args or received quoted null"
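A sketch of a variant that avoids that problem: test the argument count instead of the value of $1, so an empty "" argument no longer ends the loop early.
#!/usr/bin/env bash
# Loop while any arguments remain, even empty ones.
while [ "$#" -gt 0 ]; do
    echo "Received: ${1}"
    shift
done
echo "End of args"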
If you include $@ in a quoted string together with other characters, the behavior is very odd when there are multiple arguments: only the first argument is included inside the quotes.
Example:
#!/bin/bash
set -x
bash -c "true foo $@"
Yields:
$ bash test.sh bar baz
+ bash -c 'true foo bar' baz
But assigning to a different variable first:
#!/bin/bash
set -x
args="$@"
bash -c "true foo $args"
Yields:
$ bash test.sh bar baz
+ args='bar baz'
+ bash -c 'true foo bar baz'
"$@"
in bash. It also helps illustrate the key difference between $@
and $*
, and why they're both useful. From the bash(1)
man page Special Parameters section: "*
— When the expansion occurs within double quotes, it expands to a single word with the value of each parameter […] That is, "$*"
is equivalent to "$1c$2c..."
, where c
is [$IFS
]." And indeed, using $*
instead of $@
in your first example would've netted output identical to the second version.
"$@"
. Again from the man page: "@
— When the expansion occurs within double quotes, each parameter expands to a separate word. That is, "$@"
is equivalent to "$1"
"$2"
… If the double-quoted expansion occurs within a word, the expansion of the first parameter is joined with the beginning part of the original word, and the expansion of the last parameter is joined with the last part of the original word." ...And indeed, if your code had been bash -c "true foo $@ bar baz"
, then running it as test.sh one two
would net bash -c 'true foo one' 'two bar baz'
.
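A minimal sketch putting both man-page rules side by side (set -- just fakes three positional parameters):
#!/bin/bash
set -- one two three
printf '<%s>\n' "pre $@ post"   # <pre one> <two> <three post>
printf '<%s>\n' "pre $* post"   # <pre one two three post>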
I hardly ever use $*; I seem to forget that it exists..
$@ was just gaining traction when I first started shell scripting, and I still have to remind myself it's there. It was common to see "$*" used in scripts... then the author would realize it was smashing all of their arguments together, so they'd try all manner of complex nonsense with word-splitting "$*", or [re]assembling an arg list by looping over shift to pull them down one by one... just using $@ solves it. (It helps that bash uses the same mnemonic to access array members, too: ${var[*]} for them all as one word, ${var[@]} for a list of words.)
That said, the example uses bash -c in a way which makes absolutely no sense.
My SUN Unix has a lot of limitations; even "$@" was not interpreted as desired. My workaround is ${@}. For example:
#!/bin/ksh
find ./ -type f | xargs grep "${@}"
By the way, I had to write this particular ksh script because my Unix also does not support grep -r.
Sometimes you want to pass all your arguments, but with each one preceded by a flag (e.g. --flag):
$ bar --flag "$1" --flag "$2" --flag "$3"
You can do this in the following way:
$ bar $(printf -- ' --flag "%s"' "$@")
Note: to avoid extra field splitting, you must quote the %s and the $@, and to avoid ending up with a single string, you cannot quote printf's command substitution.
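As that note hints, the printf trick still relies on word splitting, so an argument that itself contains spaces gets split again (see also the comment further down about spaces). A sketch of an array-based alternative that keeps each argument intact, with bar and --flag taken from the example above:
#!/bin/bash
# Build the flagged argument list in an array; each element stays one word.
flagged=()
for arg in "$@"; do
    flagged+=(--flag "$arg")
done
bar "${flagged[@]}"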
bar "$@"
will be equivalent to bar "$1" "$2" "$3" "$4"
Notice that the quotation marks are important!
"$@"
, $@
, "$*"
or $*
will each behave slightly different regarding escaping and concatenation as described in this stackoverflow answer.
One closely related use case is passing all given arguments inside a single argument, like this:
bash -c "bar \"$1\" \"$2\" \"$3\" \"$4\""
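Embedding escaped quotes like that breaks as soon as an argument itself contains a double quote. A more robust sketch is to hand the arguments to the inner shell as its own positional parameters (the _ only fills the inner shell's $0):
bash -c 'bar "$@"' _ "$@"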
I use a variation of @kvantour's answer to achieve this:
bash -c "bar $(printf -- '"%s" ' "$@")"
Works fine, except if you have spaces or escaped characters. I can't find a way to capture the arguments in this case and send them to an ssh invocation inside the script.
This could be useful, but it is so ugly:
_command_opts=$( echo "$@" | awk -F\- 'BEGIN { OFS=" -" } { for (i=2;i<=NF;i++) { gsub(/^[a-z] /,"&@",$i) ; gsub(/ $/,"",$i );gsub (/$/,"@",$i) }; print $0 }' | tr '@' \' )
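For the ssh case mentioned above, one sketch (assuming bash, which provides printf %q; user@host and remotecmd are placeholders) is to re-quote each argument so it survives the extra round of parsing done by the remote shell:
#!/bin/bash
# %q quotes every argument so the remote shell splits the string back
# into the original words; the trailing space separates the arguments.
ssh user@host "remotecmd $(printf '%q ' "$@")"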
"${array[@]}"
is the right way for passing any array in bash. I want to provide a full cheat sheet: how to prepare arguments, bypass and process them.
pre.sh
-> foo.sh
-> bar.sh
.
#!/bin/bash
# file: pre.sh
args=("--a=b c" "--e=f g")
args+=("--q=w e" "--a=s \"'d'\"")
./foo.sh "${args[@]}"

#!/bin/bash
# file: foo.sh
./bar.sh "$@"

#!/bin/bash
# file: bar.sh
echo $1
echo $2
echo $3
echo $4
result:
--a=b c
--e=f g
--q=w e
--a=s "'d'"
It would be useful to show 'arg with spaces' for the three examples, too. I was surprised by the results; hopefully you can explain them.
./foo.sh "arg with spaces" and ./foo.sh 'arg with spaces' are 100% identical, so I don't see how the suggestion of adding it to the given examples would be of any help.