
How to replace ${} placeholders in a text file?

I want to pipe the output of a "template" file into MySQL, the file having variables like ${dbName} interspersed. What is the command line utility to replace these instances and dump the output to standard output?


Community

Update

Here is a solution from yottatsa on a similar question that only does replacement for variables like $VAR or ${VAR}, and it is a brief one-liner:

i=32 word=foo envsubst < template.txt

Of course if i and word are in your environment, then it is just

envsubst < template.txt

On my Mac it looks like it was installed as part of gettext and from MacGPG2.
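
For the original question's use case, a minimal sketch might look like this (template.sql is a hypothetical file name; the quoted '${dbName}' argument is GNU envsubst's optional SHELL-FORMAT, which restricts substitution to just the listed variables):

export dbName=mydb
envsubst '${dbName}' < template.sql | mysql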

Old Answer

Here is an improvement to the solution from mogsie on a similar question. My solution does not require you to escape double quotes; mogsie's does, but his is a one-liner!

eval "cat <<EOF
$(<template.txt)
EOF
" 2> /dev/null

The power of these two solutions is that you only get a few types of shell expansion that don't normally occur: $((...)), `...`, and $(...). Backslash is an escape character here, but you don't have to worry that the parsing has a bug, and it handles multiple lines just fine.
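
Applied to the original question, a sketch might look like this (template.sql, mydb, and the bare mysql invocation are placeholders; note that a line containing only EOF inside the template would terminate the here-document early):

dbName=mydb

eval "cat <<EOF
$(<template.sql)
EOF
" 2> /dev/null | mysql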


I'm finding the bare envsubst doesn't work if your envars aren't exported.
@ToddiusZho: There is no such thing as an environment variable that isn't exported - it is precisely exporting that makes a shell variable an environment variable. envsubst, as its name suggests, only recognizes environment variables, not shell variables. It's also worth noting that envsubst is a GNU utility, and therefore not preinstalled or available on all platforms.
Maybe another way to say it is that envsubst only sees its own process's environment variables, so "normal" shell variables you might have defined earlier (on separate lines) are not inherited by child processes unless you "export" them. In my example usage of gettext's envsubst above, I'm modifying the inherited environment through a bash mechanism by prefixing the assignments to the command I'm about to run.
I have one string with $HOME in it; $HOME was expanded just as the shell would do it, to my own /home/zw963. But it does not seem to support $(cat /etc/hostname) substitution, so it doesn't completely match my needs.
Thanks for the "Old Answer", as it not only allows variables, but also shell commands like $(ls -l).
Willem Van Onsem

Sed!

Given template.txt:

The number is ${i}
The word is ${word}

we just have to say:

sed -e "s/\${i}/1/" -e "s/\${word}/dog/" template.txt

Thanks to Jonathan Leffler for the tip to pass multiple -e arguments to the same sed invocation.
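
For the original question, a sketch might look like this (template.sql is a hypothetical file name; using | as the sed delimiter avoids trouble if the value contains /, but a value containing |, &, or a backslash would still need escaping):

dbName=testdb
sed -e "s|\${dbName}|$dbName|g" template.sql | mysql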


You can combine those two sed commands into one: sed -e "s/\${i}/1/" -e "s/\${word}/dog/"; that is more efficient. You can run into problems with some versions of sed at maybe 100 such operations (problem from years ago - may not still be true, but beware HP-UX).
Small hint: if "1" or "dog" in the given example contained a dollar sign, you would have to escape it with a backslash (otherwise the replacement does not occur).
You also don't need the cat. All you need is sed -e "s/\${i}/1/" -e "s/\${word}/dog/" template.text.
What if the replacement text is a password? In this case, sed will expect an escaped text, which is a hassle.
To write the result to a textfile you can use sed -e "s/\${i}/1/" -e "s/\${word}/dog/" template.text | tee newFile
Peter Mortensen

Use /bin/sh. Create a small shell script that sets the variables, and then parse the template using the shell itself. Like so (edit to handle newlines correctly):

File template.txt:

the number is ${i}
the word is ${word}

File script.sh:

#!/bin/sh

#Set variables
i=1
word="dog"

#Read in template one line at the time, and replace variables (more
#natural (and efficient) way, thanks to Jonathan Leffler).
while read line
do
    eval echo "$line"
done < "./template.txt"

Output:

#sh script.sh
the number is 1
the word is dog

Why not just: while read line ; do eval echo "$line"; done < ./template.txt ??? There's no need to read the whole file into memory, only to spit it out one line at a time via intensive use of head and tail. But the 'eval' is OK - unless the template contains shell characters like back quotes.
This is very dangerous! All the bash commands in the input will be executed. If the template is: "the words is; rm -rf $HOME" you'll lose files.
@rzymek - remember, he wants to pipe this file directly to the database. So apparently, the input is trusted.
@gnud There is a difference between trusting a file enough to store its contents and trusting it enough to execute anything it contains.
To note the constraints: (a) double quotes in the input are quietly discarded, (b) the read command, as written, trims leading and trailing whitespace from each line and 'eats' \ chars., (c) only use this if you fully trust or control the input, because command substitutions (`…` or $(…)) embedded in the input allow execution of arbitrary commands due to use of eval. Finally, there's a small chance that echo mistakes the beginning of a line for one of its command-line options.
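
A variant that addresses point (b) and the echo caveat might look like this (a sketch; it is still unsafe with untrusted input because of eval, and double quotes in the template are still a problem):

#!/bin/sh
i=1
word="dog"

# IFS= and read -r keep leading/trailing whitespace and backslashes intact;
# printf avoids echo mistaking a line that starts with - for an option.
while IFS= read -r line || [ -n "$line" ]; do
    eval "printf '%s\n' \"$line\""
done < "./template.txt"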
Dana the Sane

I was thinking about this again, given the recent interest, and I think that the tool that I was originally thinking of was m4, the macro processor for autotools. So instead of the variable I originally specified, you'd use:

$ echo 'I am a DBNAME' | m4 -DDBNAME="database name"
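
For the original question, a file-based sketch might look like this (template.m4 and testdb are placeholders; note that plain m4 replaces the bare token DBNAME rather than ${DBNAME}, and that m4's default quote characters, backtick and single quote, can interfere with SQL string literals unless you adjust them with changequote):

# template.m4 might contain, for example:  CREATE DATABASE DBNAME;
m4 -DDBNAME=testdb template.m4 | mysql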

This solution has the fewest drawbacks of the answers here. Do you know of any way to replace ${DBNAME} instead of only DBNAME though?
@JackDavidson I would use envsubst for this simple variable replacement / templating usage, as mentioned in other answers. m4 is a great tool, but it's a full-blown preprocessor with many more features and thus more complexity, which may not be needed if you simply want to replace some variables.
neu242

Create rendertemplate.sh:

#!/usr/bin/env bash

eval "echo \"$(cat $1)\""

And template.tmpl:

Hello, ${WORLD}
Goodbye, ${CHEESE}

Render the template:

$ export WORLD=Foo
$ CHEESE=Bar ./rendertemplate.sh template.tmpl 
Hello, Foo
Goodbye, Bar

This strips off double quoted strings
Tried: eval "echo $(cat $1)" - w/out quotes, and it worked for me.
From a security perspective, this is bad news. If your template contains $(rm -rf ~), you're running that as code.
eval "echo \"$(cat $1)\"" Works great !
ChaPuZ

template.txt

Variable 1 value: ${var1}
Variable 2 value: ${var2}

data.sh

#!/usr/bin/env bash
declare var1="value 1"
declare var2="value 2"

parser.sh

#!/usr/bin/env bash

# args
declare file_data=$1
declare file_input=$2
declare file_output=$3

source $file_data
eval "echo \"$(< $file_input)\"" > $file_output

./parser.sh data.sh template.txt parsed_file.txt

parsed_file.txt

Variable 1 value: value 1
Variable 2 value: value 2

As has been noted elsewhere: Only use this if you fully trust or control the input, because command substitutions (`…` or $(…)) embedded in the input allow execution of arbitrary commands due to use of eval, and the direct execution of shell code due to use of source. Also, double quotes in the input are quietly discarded, and echo could mistake the beginning of a line for one of its command-line options.
Unfortunately, this strips all double quotes (") from the result file. Is there a way to do the same without removing the double quotes?
I found what I was looking for here: stackoverflow.com/a/11050943/795158; I used envsubst. The difference is that the vars have to be exported which was OK with me.
If the text file contains "`" or ".", the substitution will fail.
mklement0

Here's a robust Bash function that - despite using eval - should be safe to use.

All ${varName} variable references in the input text are expanded based on the calling shell's variables.

Nothing else is expanded: neither variable references whose names are not enclosed in {...} (such as $varName), nor command substitutions ($(...) and legacy syntax `...`), nor arithmetic substitutions ($((...)) and legacy syntax $[...]).

To treat a $ as a literal, \-escape it; e.g.: \${HOME}

Note that input is only accepted via stdin.

Example:

$ expandVarsStrict <<<'$HOME is "${HOME}"; `date` and \$(ls)' # only ${HOME} is expanded
$HOME is "/Users/jdoe"; `date` and $(ls)

Function source code:

expandVarsStrict(){
  local line lineEscaped
  while IFS= read -r line || [[ -n $line ]]; do  # the `||` clause ensures that the last line is read even if it doesn't end with \n
    # Escape ALL chars. that could trigger an expansion..
    IFS= read -r -d '' lineEscaped < <(printf %s "$line" | tr '`([$' '\1\2\3\4')
    # ... then selectively reenable ${ references
    lineEscaped=${lineEscaped//$'\4'{/\${}
    # Finally, escape embedded double quotes to preserve them.
    lineEscaped=${lineEscaped//\"/\\\"}
    eval "printf '%s\n' \"$lineEscaped\"" | tr '\1\2\3\4' '`([$'
  done
}

The function assumes that no 0x1, 0x2, 0x3, and 0x4 control characters are present in the input, because those chars. are used internally - since the function processes text, that should be a safe assumption.
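
Applied to the original question, usage might look like this (a sketch; template.sql is a hypothetical file name, and dbName must be set in the calling shell):

dbName=mydb
expandVarsStrict < template.sql | mysql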


This is one of the best answers here. Even though it uses eval, it is pretty safe to use.
This solution works with JSON files! (It escapes " properly.)
A nice thing about this solution is that it lets you provide defaults for missing variables (${FOO:-bar}) or only output something if a variable is set (${HOME+Home is ${HOME}}). I suspect that with a little extension it could also return exit codes for missing variables (${FOO?Foo is missing}), but it doesn't currently. tldp.org/LDP/abs/html/parameter-substitution.html has a list of these if that helps.
Best answer here. All " and ' are fully escaped. Solutions that use only eval don't work for files containing ' or " characters.
Thomas

Here's my solution with Perl, based on a former answer; it replaces environment variables:

perl -p -e 's/\$\{(\w+)\}/(exists $ENV{$1}?$ENV{$1}:"missing variable $1")/eg' < infile > outfile
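
For the original question, a sketch might look like this (template.sql is a hypothetical file name; the variable has to be exported so that Perl can see it in %ENV):

export dbName=testdb
perl -p -e 's/\$\{(\w+)\}/(exists $ENV{$1}?$ENV{$1}:"missing variable $1")/eg' < template.sql | mysql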

This is great. You don't always have Perl, but when you do, this is simple and straightforward.
spudfkc

I would suggest using something like Sigil: https://github.com/gliderlabs/sigil

It is compiled to a single binary, so it's extremely easy to install on systems.

Then you can do a simple one-liner like the following:

cat my-file.conf.template | sigil -p $(env) > my-file.conf

This is much safer than eval and easier than using regexes or sed.


Great answer! It's a proper templating system and much easier to work with than the other answers.
BTW, better to avoid cat and use <my-file.conf.template instead so you give sigil a real file handle instead of a FIFO.
Apriori

Here is a way to get the shell to do the substitution for you, as if the contents of the file were instead typed between double quotes.

Using the example of template.txt with contents:

The number is ${i}
The word is ${word}

The following line will cause the shell to interpolate the contents of template.txt and write the result to standard out.

i='1' word='dog' sh -c 'echo "'"$(cat template.txt)"'"'

Explanation:

i and word are passed as environment variables scoped to the execution of sh.

sh executes the contents of the string it is passed.

Strings written next to one another become one string. That string is:

'echo "' + "$(cat template.txt)" + '"'

Since the substitution is between ", "$(cat template.txt)" becomes the output of cat template.txt.

So the command executed by sh -c becomes:

echo "The number is ${i}\nThe word is ${word}"

where i and word are the specified environment variables.


From a security perspective, this is bad news. If your template contains, say, '$(rm -rf ~)'$(rm -rf ~), the literal quotes in the template file will match the ones you added before its expansion.
I don't think the in-template quotes are matching the out-template quotes. I believe the shell is resolving the template and the in-terminal string independently (effectively removing the quotes) and then concatenating them. A version of the test that doesn't delete your home directory is '$(echo a)'$(echo a). It produces 'a'a. The main thing that's happening is that the first echo a inside the ' is getting evaluated, which may not be what you expect since it's in ', but it is the same behavior as including ' in a " quoted string.
So, this is not secure in the sense that it allows the template author to have their code executed. However, how the quotes are evaluated doesn't really affect security. Expanding anything in a "-quoted string (including $(...)) is the point.
Is that the point? I only see them asking for ${varname}, not other, higher-security-risk expansions.
...that said, I must differ (re: in-template and out-template quotes being able to match). When you put a single quote in your string, you're splitting it into a single-quoted string echo ", followed by a double-quoted string with the literal contents of template.txt, followed by another literal string ", all concatenated into a single argument passed to sh -c. You're right that the ' can't be matched (since it was consumed by the outer shell rather than passed to the inner one), but the " certainly can, so a template containing Gotcha"; rm -rf ~; echo " could be executed.
Peter Mortensen

If you are open to using Perl, that would be my suggestion, although there are probably some sed and/or AWK experts who know how to do this much more easily. If you have a more complex mapping with more than just dbName for your replacements, you could extend this pretty easily, but you might just as well put it into a standard Perl script at that point.

perl -p -e 's/\$\{dbName\}/testdb/s' yourfile | mysql

A short Perl script to do something slightly more complicated (handle multiple keys):

#!/usr/bin/env perl
my %replace = ( 'dbName' => 'testdb', 'somethingElse' => 'fooBar' );
undef $/;
my $buf = <STDIN>;
$buf =~ s/\$\{$_\}/$replace{$_}/g for keys %replace;
print $buf;

If you name the above script as replace-script, it could then be used as follows:

replace-script < yourfile | mysql

Works for single variables, but how do I include 'or' for the others?
There are many ways you can do this with perl, all depending on how complicated and/or safe you wanted to do this. More complicated examples can be found here: perlmonks.org/?node_id=718936
Using perl is so much cleaner than trying to use the shell. Spend the time to make this work rather than trying some of the other mentioned shell-based solutions.
Recently had to tackle a similar issue. In the end I went with perl (envsubst looked promising for a bit, but it was too hard to control).
user976433

file.tpl:

The following bash function should only replace ${var1} syntax and ignore 
other shell special chars such as `backticks` or $var2 or "double quotes". 
If I have missed anything - let me know.

script.sh:

template(){
    # usage: template file.tpl
    while read -r line ; do
            line=${line//\"/\\\"}
            line=${line//\`/\\\`}
            line=${line//\$/\\\$}
            line=${line//\\\${/\${}
            eval "echo \"$line\""; 
    done < ${1}
}

var1="*replaced*"
var2="*not replaced*"

template file.tpl > result.txt

This is not safe, since it will execute command substitutions in the template if they have a leading backslash, e.g. \$(date).
Aside from Peter's valid point: I suggest you use while IFS= read -r line; do as the read command, otherwise you'll strip leading and trailing whitespace from each input line. Also, echo could mistake the beginning of a line for one of its command-line options, so it's better to use printf '%s\n'. Finally, it's safer to double-quote ${1}.
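
Incorporating those suggestions, a revised sketch might look like this (it remains unsafe for untrusted input, per the \$(date) point above):

template(){
    # usage: template file.tpl
    local line
    # IFS= and read -r keep whitespace and backslashes intact;
    # the || clause handles a last line without a trailing newline.
    while IFS= read -r line || [ -n "$line" ]; do
            line=${line//\"/\\\"}
            line=${line//\`/\\\`}
            line=${line//\$/\\\$}
            line=${line//\\\${/\${}
            eval "printf '%s\n' \"$line\""
    done < "${1}"
}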
glenn jackman

I found this thread while wondering the same thing. It inspired me to do this (be careful with the backticks):

$ echo $MYTEST
pass!
$ cat FILE
hello $MYTEST world
$ eval echo `cat FILE`
hello pass! world

A bash shorthand for $(cat file) is $(< file)
Apparently this method messes up the line breaks, i.e. my file got echoed all on one line.
@ArthurCorenzan: Indeed, line breaks are replaced with spaces. To fix that, you'd have to use eval echo "\"$(cat FILE)\"" but that may still fall short in that double quotes in the input are discarded.
As has been noted elsewhere: Only use this if you fully trust or control the input, because command substitutions (`…` or $(…)) embedded in the input allow execution of arbitrary commands due to use of eval.
to preserve line breaks and quotes: stackoverflow.com/a/17030906/10390714
sfitts

Lots of choices here, but figured I'd toss mine on the heap. It is Perl based, only targets variables of the form ${...}, takes the file to process as an argument, and outputs the converted file on stdout:

use Env;
Env::import();

while(<>) { $_ =~ s/(\${\w+})/$1/eeg; $text .= $_; }

print "$text";

Of course I'm not really a perl person, so there could easily be a fatal flaw (works for me though).


Works fine. You could drop the Env::import(); line - importing is implied by use. Also, I suggest not building up the entire output in memory first: simply use print; instead of $text .= $_; inside the loop, and drop the post-loop print command.
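
The same idea as a streaming one-liner, following the comment above (a sketch; it only substitutes exported environment variables, and template.txt is a placeholder):

perl -MEnv -pe 's/(\$\{\w+\})/$1/eeg' template.txt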
paxdiablo

It can be done in bash itself if you have control of the configuration file format. You just need to source (".") the configuration file rather than subshell it. That ensures the variables are created in the context of the current shell (and continue to exist) rather than in a subshell (where the variables disappear when the subshell exits).

$ cat config.data
    export parm_jdbc=jdbc:db2://box7.co.uk:5000/INSTA
    export parm_user=pax
    export parm_pwd=never_you_mind

$ cat go.bash
    . config.data
    echo "JDBC string is " $parm_jdbc
    echo "Username is    " $parm_user
    echo "Password is    " $parm_pwd

$ bash go.bash
    JDBC string is  jdbc:db2://box7.co.uk:5000/INSTA
    Username is     pax
    Password is     never_you_mind

If your config file cannot be a shell script, you can just 'compile' it before executing thus (the compilation depends on your input format).

$ cat config.data
    parm_jdbc=jdbc:db2://box7.co.uk:5000/INSTA # JDBC URL
    parm_user=pax                              # user name
    parm_pwd=never_you_mind                    # password

$ cat go.bash
    cat config.data \
        | sed 's/#.*$//' \
        | sed 's/[ \t]*$//' \
        | sed 's/^[ \t]*//' \
        | grep -v '^$' \
        | sed 's/^/export /' \
        >config.data-compiled
    . config.data-compiled
    echo "JDBC string is " $parm_jdbc
    echo "Username is    " $parm_user
    echo "Password is    " $parm_pwd

$ bash go.bash
    JDBC string is  jdbc:db2://box7.co.uk:5000/INSTA
    Username is     pax
    Password is     never_you_mind

In your specific case, you could use something like:

$ cat config.data
    export p_p1=val1
    export p_p2=val2
$ cat go.bash
    . ./config.data
    echo "select * from dbtable where p1 = '$p_p1' and p2 like '$p_p2%' order by p1"
$ bash go.bash
    select * from dbtable where p1 = 'val1' and p2 like 'val2%' order by p1

Then pipe the output of go.bash into MySQL and voila, hopefully you won't destroy your database :-).


You don't have to export the variables from the config.data file; it is sufficient just to set them. You also don't seem to be reading the template file at any point. Or, perhaps, the template file is modified and contains the 'echo' operations...or am I missing something?
Good point on the exports; I do that by default so that they're available to subshells, and it causes no harm since they die when go exits. The 'template' file is the script itself with its echo statements. There's no need to introduce a third file - it's basically a mailmerge-type operation.
The "script itself with its echo statements" is not a template: it is a script. Think of the readability (and maintainability) difference between a template file and a script full of echo statements.
@Pierre, there are no echo statements in my config script, they're merely exports, and I've shown how you can avoid even that with a minimal amount of pre-processing. If you're talking about the echo statement in my other scripts (like go.bash), you've got the wrong end of the stick - they're not part of the solution, they're just a way of showing that the variables are being set correctly.
@paxdiablo : It seems you just forgot the question : << I want to pipe the output of a "template" file into MySQL >>. So use of a template IS the question, it is not "the wrong end of the stick". Exporting variables and echoing them in another script just doesn't answer the question at all
joehep

In-place Perl editing of potentially multiple files, with backups:

  perl -e 's/\$\{([^}]+)\}/defined $ENV{$1} ? $ENV{$1} : ""/eg' \
    -i.orig \
    -p config/test/*

olopopo

I created a shell templating script named shtpl. My shtpl uses a jinja-like syntax which, now that I use ansible a lot, I'm pretty familiar with:

$ cat /tmp/test
{{ aux=4 }}
{{ myarray=( a b c d ) }}
{{ A_RANDOM=$RANDOM }}
$A_RANDOM
{% if $(( $A_RANDOM%2 )) == 0 %}
$A_RANDOM is even
{% else %}
$A_RANDOM is odd
{% endif %}
{% if $(( $A_RANDOM%2 )) == 0 %}
{% for n in 1 2 3 $aux %}
\$myarray[$((n-1))]: ${myarray[$((n-1))]}
/etc/passwd field #$n: $(grep $USER /etc/passwd | cut -d: -f$n)
{% endfor %}
{% else %}
{% for n in {1..4} %}
\$myarray[$((n-1))]: ${myarray[$((n-1))]}
/etc/group field #$n: $(grep ^$USER /etc/group | cut -d: -f$n)
{% endfor %}
{% endif %}


$ ./shtpl < /tmp/test
6535
6535 is odd
$myarray[0]: a
/etc/group field #1: myusername
$myarray[1]: b
/etc/group field #2: x
$myarray[2]: c
/etc/group field #3: 1001
$myarray[3]: d
/etc/group field #4: 

More info on my github


roy man

To me this is the easiest and most powerful solution; you can even include other templates using the same command eval echo "$(<template.txt)":

Example with nested template

Create the template files; the variables are in regular bash syntax: ${VARIABLE_NAME} or $VARIABLE_NAME.

You have to escape special characters with \ in your templates, otherwise they will be interpreted by eval.

template.txt

Hello ${name}!
eval echo $(<nested-template.txt)

nested-template.txt

Nice to have you here ${name} :\)

create source file

template.source

declare name=royman 

parse the template

source template.source && eval echo "$(<template.txt)"

the output

Hello royman!
Nice to have you here royman :)