In a given shell, normally I'd set a variable or variables and then run a command. Recently I learned about the concept of prepending a variable definition to a command:
FOO=bar somecommand someargs
This works... kind of. It doesn't work when you're changing an LC_* variable (which seems to affect the command itself, but not its arguments — for example, '[a-z]' char ranges) or when piping output to another command, like this:
FOO=bar somecommand someargs | somecommand2 # somecommand2 is unaware of FOO
I can prepend somecommand2 with "FOO=bar" as well, which works, but that adds unwanted duplication, and it doesn't help with arguments that are interpreted depending on the variable (for example, '[a-z]').
So, what's a good way to do this on a single line?
I'm thinking something on the order of:
FOO=bar (somecommand someargs | somecommand2) # Doesn't actually work
I got lots of good answers! The goal is to keep this a one-liner, preferably without using "export". The method using a call to Bash was best overall, though the parenthetical version with "export" in it was a little more compact. The method of using redirection rather than a pipe is interesting as well.
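The pipe limitation is easy to reproduce with stand-in commands (sh -c 'echo …' plays the role of somecommand and somecommand2 here, since they are placeholders in the question):

```shell
# The prefix assignment applies only to the first simple command in
# the pipeline:
FOO=bar sh -c 'echo "first sees: $FOO"' | cat

# FOO is NOT in the environment of commands further down the pipe:
FOO=bar true | sh -c 'echo "second sees: ${FOO:-nothing}"'
```

The first line prints "first sees: bar" while the second prints "second sees: nothing", which is exactly the problem the question describes.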
(T=$(date) echo $T)
will work
FOO=bar bash -c 'somecommand someargs | somecommand2'
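A quick sanity check of this approach, with toy commands standing in for somecommand and somecommand2 — both sides of the pipe see FOO because the whole pipeline runs inside the child shell:

```shell
# Single quotes matter: they keep $FOO from being expanded by the
# calling shell; the child bash expands it instead, where FOO=bar.
FOO=bar bash -c 'echo "$FOO" | tr a-z A-Z'
```

This prints BAR, confirming that the value reaches the right-hand side of the pipe as well.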
How about exporting the variable, but only inside the subshell?
(export FOO=bar && somecommand someargs | somecommand2)
Keith has a point; to execute the commands unconditionally, do this:
(export FOO=bar; somecommand someargs | somecommand2)
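A runnable sketch of the subshell trick (echo and cat are stand-ins for the real commands), which also shows that the export does not leak back into the calling shell:

```shell
unset FOO
# The parentheses run everything in a subshell, so the export is
# visible to the whole pipeline but dies with the subshell:
(export FOO=bar; echo "inside: $FOO" | cat)
echo "outside: ${FOO:-unset}"
```

This prints "inside: bar" followed by "outside: unset".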
Use ';' rather than '&&'; there's no way 'export FOO=bar' is going to fail. '&&' executes the left command, then executes the right command only if the left command succeeded; ';' executes both commands unconditionally. The Windows batch (cmd.exe) equivalent of ';' is '&'.
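The difference is easy to see with a command that always fails (false here stands in for any command whose exit status you don't want to gate on):

```shell
false ; echo "after ;"    # echo runs regardless of false's failure
false && echo "after &&"  # echo is skipped: false returned nonzero
true                      # keep the script's own exit status clean
```

Only "after ;" is printed.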
(FOO=XXX ; echo FOO=$FOO) ; echo FOO=$FOO yields FOO=XXX\nFOO=\n.
Why not use source (aka .) in that case? Also, backticks shouldn't be used anymore these days, and this is one of the reasons why: using $(command) is far safer.
This way the subshell is the current shell (usually bash, but it could be something else, e.g. dash), and I don't run into any trouble if I must use quotes within the command args (someargs).
You can also use eval
:
FOO=bar eval 'somecommand someargs | somecommand2'
Since this answer with eval doesn't seem to please everyone, let me clarify something: when used as written, with the single quotes, it is perfectly safe. It is good in that it will not launch an external process (as the accepted answer does), nor will it execute the commands in an extra subshell (as the other answer does).
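To see the single-quote point concretely (toy commands again): the quoted string reaches eval unexpanded, and eval evaluates it in the current shell, where the prefix assignment is in effect for the whole pipeline:

```shell
# $FOO survives until eval runs, at which point FOO=bar is set,
# so both halves of the pipe see it.
FOO=bar eval 'echo "$FOO" | tr a-z A-Z'
```

This prints BAR. With double quotes instead, the calling shell would expand $FOO first (to its current, probably empty, value), which is the danger the comments are arguing about.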
As this gets a few regular views, it's probably good to give an alternative to eval that will please everyone, and has all the benefits (and perhaps even more!) of this quick eval "trick". Just use a function! Define a function with all your commands:
mypipe() {
somecommand someargs | somecommand2
}
and execute it with your environment variables like this:
FOO=bar mypipe
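A runnable sketch of the function approach, with toy commands replacing somecommand and somecommand2:

```shell
mypipe() {
    # Every command in this pipeline sees FOO from the caller's
    # prefix assignment, since the function runs in the current shell.
    echo "$FOO" | tr a-z A-Z
}

FOO=bar mypipe
```

This prints BAR, and FOO is only set for the duration of the function call.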
You're repeating that eval is evil without understanding what's evil about eval. And maybe you're not really understanding this answer after all (and really, there's nothing wrong with it). On the same level: would you say that ls is bad because for file in $(ls) is bad? (And yeah, you didn't downvote the accepted answer, and you didn't leave a comment either.) SO is such a weird and absurd place sometimes.
When I say you're repeating that eval is evil without understanding what's evil about eval, I'm referring to your sentence: "This answer lacks all the warnings and explanations necessary when talking about eval." eval is not bad or dangerous; no more than bash -c is.
In the answer provided, the args have been single-quoted, protecting them from variable expansion, so I see no problem with the eval solution.
Use env.
For example, env FOO=BAR command. Note that the environment variables will be restored/unchanged again when command finishes executing.
Just be careful about shell substitution happening, i.e. if you want to reference $FOO explicitly on the same command line, you may need to escape it so that your shell interpreter doesn't perform the substitution before it runs env.
$ export FOO=BAR
$ env FOO=FUBAR bash -c 'echo $FOO'
FUBAR
$ echo $FOO
BAR
That is exactly what env is for: solving the stated question.
A simple approach is to make use of ';'. For example:
ENV=prod; ansible-playbook -i inventories/$ENV --extra-vars "env=$ENV" deauthorize_users.yml --check
command1; command2 executes command2 after executing command1, sequentially. It does not matter whether the commands were successful or not.
This sets ENV in the environment of the same shell in which the commands that follow the semicolon execute. How this differs from the other answers, though, is that this one defines ENV for all subsequent references in the shell, not just those on the same line. I believe that the original question intended to alter the environment only for the references on the same line.
Adding '; unset ENV' to the same line would keep it a one-liner, but I left that out as it doesn't make much sense.
Use a shell script:
#!/bin/bash
# myscript
FOO=bar
somecommand someargs | somecommand2
Then run it:
./myscript
You need export; otherwise $FOO will be a shell variable, not an environment variable, and therefore not visible to somecommand or somecommand2.
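The difference the export makes can be checked directly (MYVAR is an arbitrary name chosen for this demo):

```shell
unset MYVAR
MYVAR=bar                                        # shell variable only
bash -c 'echo "child sees: ${MYVAR:-nothing}"'   # not inherited
export MYVAR                                     # now an environment variable
bash -c 'echo "child sees: ${MYVAR:-nothing}"'   # inherited by the child
```

The first child prints "child sees: nothing"; after the export, the second prints "child sees: bar".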
If you want to run somecommand as sudo, you need to pass sudo the -E flag to pass through variables, because variables can introduce vulnerabilities: stackoverflow.com/a/8633575/1695680
FOO_X=foox bash -c 'echo $FOO_X' works as expected, but with specific var names it fails: DYLD_X=foox bash -c 'echo $DYLD_X' echoes blank (on macOS this is likely System Integrity Protection stripping DYLD_* variables from the environment of protected binaries such as /bin/bash). Both work using eval instead of bash -c.