Tags: bash, command-line, shell

How do I set a variable to the output of a command in Bash?

2203

I have a pretty simple script that is something like the following:

#!/bin/bash

VAR1="$1"
MOREF='sudo run command against $VAR1 | grep name | cut -c7-'

echo $MOREF

When I run this script from the command line and pass it the arguments, I am not getting any output. However, when I run the commands contained within the $MOREF variable, I am able to get output.

How can one take the results of a command that needs to be run within a script, save it to a variable, and then output that variable on the screen?

7

  • 1

    A related question stackoverflow.com/questions/25116521/…

    Aug 25, 2016 at 7:09

  • 73

    As an aside, POSIX reserves all-caps names for variables with meaning to the operating system or the shell itself, whereas names with at least one lowercase character are reserved for application use. Thus, consider using lowercase names for your own shell variables to avoid unintended conflicts (keeping in mind that setting a shell variable will overwrite any like-named environment variable). See the short sketch after these comments.

    Mar 27, 2017 at 15:56

  • 2

    As an aside, capturing output into a variable just so you can then echo the variable is a useless use of echo, and a useless use of variables.

    – tripleee

    Jul 21, 2018 at 6:58


  • 3

    As a further aside, storing output in variables is often unnecessary. For small, short strings you will need to reference multiple times in your program, this is completely fine, and exactly the way to go; but for processing any nontrivial amounts of data, you want to reshape your process into a pipeline, or use a temporary file.

    – tripleee

    Jan 18, 2019 at 7:56


  • 1

    Variation: “I know how to use variable=$(command) but I think "$string" is a valid command”; stackoverflow.com/questions/37194795/…

    – tripleee

    Oct 8, 2020 at 6:42
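
A short sketch of the naming advice from the comments above (the values are purely illustrative):

# All-caps names risk clobbering variables meaningful to the shell or the OS:
PATH="some value"    # breaks subsequent command lookups!
path="some value"    # safe: lowercase names are left to applications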


2960

In addition to backticks `command`, command substitution can be done with $(command) or "$(command)", which I find easier to read and which allows nesting.

OUTPUT=$(ls -1)
echo "${OUTPUT}"

MULTILINE=$(ls \
   -1)
echo "${MULTILINE}"

Quoting (") does matter to preserve multi-line variable values; it is optional on the right-hand side of an assignment, as word splitting is not performed, so OUTPUT=$(ls -1) would work fine.

19

  • 66

    Can we provide some separator for multi-line output?

    – Aryan

    Feb 21, 2013 at 12:26

  • 25

    Whitespace (or the lack of it) matters

    – Ali

    Apr 24, 2014 at 10:40

  • 10

    @timhc22, the curly braces are irrelevant; it’s only the quotes that are important re: whether expansion results are string-split and glob-expanded before being passed to the echo command.

    Apr 21, 2015 at 15:37


  • 5

    Ah thanks! So is there any benefit to the curly braces?

    – timhc22

    Apr 21, 2015 at 16:01

  • 22

    Curly braces can be used when the variable is immediately followed by more characters which could be interpreted as part of the variable name, e.g. ${OUTPUT}foo. They are also required when performing inline string operations on the variable, such as ${OUTPUT/foo/bar}.

    Jun 1, 2016 at 23:16
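
A small sketch of both cases from the comment above (variable names are illustrative):

name="file"
echo "$names"            # empty: expands the unset variable 'names'
echo "${name}s"          # prints 'files': braces delimit the variable name
echo "${name/file/dir}"  # prints 'dir': inline replacement requires braces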

354

$(sudo run command)

If you’re going to use an apostrophe, you need `, not '. This character is called a “backtick” (or “grave accent”):

#!/bin/bash

VAR1="$1"
VAR2="$2"

MOREF=`sudo run command against "$VAR1" | grep name | cut -c7-`

echo "$MOREF"

5

  • 39

    The backtick syntax is obsolescent, and you really need to put double quotes around the variable interpolation in the echo.

    – tripleee

    Dec 28, 2015 at 12:28

  • 15

    I would add that you have to be careful with the spaces around ‘=’ in the assignment above. You shouldn’t have any spaces there; otherwise you’ll get an incorrect assignment.

    – zbstof

    Jan 5, 2016 at 11:07

  • 5

    tripleee’s comment is correct. In Cygwin (May 2016), ` doesn’t work while $() works. I couldn’t fix it until I saw this page.

    – toddwz

    May 13, 2016 at 12:42


  • 2

    Elaboration, such as an example, on the Update (2018) would be appreciated.

    – Eduard

    Jul 13, 2018 at 13:31


  • The original Bourne shell supported backticks, but not $(…) notation. So you need to use backticks if you require compatibility with older Unix systems.

    – AndyB

    Feb 17 at 23:24
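
For example, nesting command substitutions needs escaped backticks with the old syntax, while $( ) nests cleanly; a minimal illustration:

inner=`basename \`pwd\``      # backticks: the inner pair must be escaped
outer=$(basename "$(pwd)")    # $( ): nests naturally and can be quoted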


172

Some Bash tricks I use to set variables from commands

Sorry, this is a long answer, but Bash is a shell whose main goal is to run other commands and react to their result codes and/or output (commands are often piped as filters, etc.).

Storing command output in a variable is something basic and fundamental.

Therefore, it depends on:

  • compatibility (POSIX vs. Bashisms)
  • kind of output (filter(s))
  • number of variables to set (split or interpret)
  • execution time (monitoring)
  • error trapping
  • repeatability of the request (see long-running background processes, further on)
  • interactivity (considering user input while reading from another input file descriptor)
  • am I missing something?

First: the simple, old (obsolete), and compatible way

myPi=`echo '4*a(1)' | bc -l`
echo $myPi 
3.14159265358979323844

Compatible, second way

As nesting could become heavy, parentheses were implemented for this:

myPi=$(bc -l <<<'4*a(1)')

Using backticks in scripts is to be avoided today.

Nested sample:

SysStarted=$(date -d "$(ps ho lstart 1)" +%s)
echo $SysStarted 
1480656334

Bash features

Reading more than one variable (with Bashisms)

df -k /
Filesystem     1K-blocks   Used Available Use% Mounted on
/dev/dm-0         999320 529020    401488  57% /

If I just want the Used value:

array=($(df -k /))

you get an array variable that you can inspect:

declare -p array
declare -a array='([0]="Filesystem" [1]="1K-blocks" [2]="Used" [3]="Available" [4]="Use%" [5]="Mounted" [6]="on" [7]="/dev/dm-0" [8]="999320" [9]="529020" [10]="401488" [11]="57%" [12]="/")'

Then:

echo ${array[9]}
529020

But I often use this:

{ read -r _;read -r filesystem size using avail prct mountpoint ; } < <(df -k /)
echo $using
529020

(The first read -r _ just drops the header line.) Here, with only one command, you populate six different variables (shown in alphabetical order):

declare -p avail filesystem mountpoint prct size using
declare -- avail="401488"
declare -- filesystem="/dev/dm-0"
declare -- mountpoint="/"
declare -- prct="57%"
declare -- size="999320"
declare -- using="529020"

Or

{ read -a head;varnames=(${head[@]//[K1% -]});
  read ${varnames[@],,} ; } < <(LANG=C df -k /)

Then:

declare -p varnames ${varnames[@],,} 
declare -a varnames=([0]="Filesystem" [1]="blocks" [2]="Used" [3]="Available" [4]="Use" [5]="Mounted" [6]="on")
declare -- filesystem="/dev/dm-0"
declare -- blocks="999320"
declare -- used="529020"
declare -- available="401488"
declare -- use="57%"
declare -- mounted="/"
declare -- on=""

Or even:

{ read _ ; read filesystem dsk[{6,2,9}] prct mountpoint ; } < <(df -k /)
declare -p mountpoint dsk
declare -- mountpoint="/"
declare -a dsk=([2]="529020" [6]="999320" [9]="401488")

(Note that Used and Blocks are switched there: read ... dsk[6] dsk[2] dsk[9] ...)

… This will work with associative arrays too: read _ disk[total] disk[used] …, as sketched below.
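
A sketch of that associative variant (it needs Bash 4+ and a prior declare -A; the key names are illustrative):

declare -A disk
{ read -r _ ; read -r _ disk[total] disk[used] disk[free] _ disk[mount] ; } < <(df -k /)
declare -p disk
declare -A disk=([mount]="/" [free]="401488" [used]="529020" [total]="999320" )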

Dedicated fd using unnamed fifo:

There is an elegant way! In this sample, I will read the /etc/passwd file:

users=()
while IFS=: read -u $list user pass uid gid name home bin ;do
    ((uid>=500)) &&
        printf -v users[uid] "%11d %7d %-20s %s\n" $uid $gid $user $home
done {list}</etc/passwd

Using this method (... read -u $list; ... {list}<inputfile) leaves standard input free for other purposes, like user interaction (sketched further below).

Then

echo -n "${users[@]}"
       1000    1000 user         /home/user
...
      65534   65534 nobody       /nonexistent

and

echo ${!users[@]}
1000 ... 65534

echo -n "${users[1000]}"
      1000    1000 user       /home/user
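
Because standard input stays free, you could even prompt the user inside the loop; a hypothetical sketch:

while IFS=: read -u $list user pass uid gid name home bin ;do
    ((uid>=1000)) || continue
    read -rp "Keep user '$user'? (y/n) " answer   # reads the terminal, not $list
    [ "$answer" = "y" ] && echo "$user -> $home"
done {list}</etc/passwd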

This can be used with static files, with /dev/tcp/xx.xx.xx.xx/yyy (where x stands for an IP address or hostname and y for a port number), or with the output of a command:

{
    read -u $list -a head          # read header in array `head`
    varnames=(${head[@]//[K1% -]}) # drop illegal chars for variable names
    while read -u $list ${varnames[@],,} ;do
        ((pct=available*100/(available+used),pct<10)) &&
            printf "WARN: FS: %-20s on %-14s %3d <10 (Total: %11u, Use: %7s)\n" \
                "${filesystem#*/mapper/}" "$mounted" $pct $blocks "$use"
    done
} {list}< <(LANG=C df -k)
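
For the /dev/tcp case, a minimal sketch (host and port are placeholders, and this needs a Bash built with network redirection, as most Linux builds are):

exec {http}<>/dev/tcp/example.com/80
printf 'GET / HTTP/1.0\r\nHost: example.com\r\n\r\n' >&$http
while read -r -u $http line ;do echo "$line" ;done
exec {http}>&-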

And of course with inline documents (here documents):

while IFS=\; read -u $list -a myvar ;do
    echo ${myvar[2]}
done {list}<<"eof"
foo;bar;baz
alice;bob;charlie
$cherry;$strawberry;$memberberries
eof

Practical sample: parsing CSV files

In this answer to How to parse a CSV file in Bash?, I read a file by using an unnamed fifo, with the exec {FD}<"$file" syntax.
And here is the same script, but using the CSV as an inline document.

Sample function for populating some variables:

#!/bin/bash

declare free=0 total=0 used=0 mpnt="??"

getDiskStat() {
    {
        read _
        read _ total used free _ mpnt
    } < <(
        df -k ${1:-/}
    )
}

getDiskStat $1
echo "$mpnt: Tot:$total, used: $used, free: $free."

Note: the declare line is not required; it is just there for readability.

About sudo cmd | grep ... | cut ...

shell=$(cat /etc/passwd | grep $USER | cut -d : -f 7)
echo $shell
/bin/bash

Please avoid the useless use of cat! This is just one fork less:

shell=$(grep $USER </etc/passwd | cut -d : -f 7)

Every pipe (|) implies a fork, where another process has to be run, with disk access, library calls, and so on.

So using sed, for example, will limit the subprocess to only one fork:

shell=$(sed </etc/passwd "s/^$USER:.*://p;d")
echo $shell

And with Bashisms:

But for many actions, mostly on small files, Bash can do the job itself:

while IFS=: read -a line ; do
    [ "$line" = "$USER" ] && shell=${line[6]}
  done </etc/passwd
echo $shell
/bin/bash

or

while IFS=: read loginname encpass uid gid fullname home shell;do
    [ "$loginname" = "$USER" ] && break
  done </etc/passwd
echo $shell $loginname ...

Going further about variable splitting

Have a look at my answer to How do I split a string on a delimiter in Bash?

Alternative: reducing forks by using backgrounded long-running tasks

In order to prevent multiple forks like

myPi=$(bc -l <<<'4*a(1)')
myRay=12
myCirc=$(bc -l <<<" 2 * $myPi * $myRay ")

or

myStarted=$(date -d "$(ps ho lstart 1)" +%s)
mySessStart=$(date -d "$(ps ho lstart $$)" +%s)

This works fine, but running many forks is heavy and slow.

And commands like date and bc can perform many operations, line by line!

See:

bc -l <<<$'3*4\n5*6'
12
30

date -f - +%s < <(ps ho lstart 1 $$)
1516030449
1517853288

So we can use a long-running background process to serve many requests, without having to initiate a new fork for each one.

You can have a look at how reducing forks improves a Mandelbrot set renderer in Bash, from more than eight hours down to less than five seconds.

Under Bash, there is a built-in feature for this: coproc:

coproc bc -l
echo 4*3 >&${COPROC[1]}
read -u $COPROC answer
echo $answer
12

echo >&${COPROC[1]} 'pi=4*a(1)'
ray=42.0
printf >&${COPROC[1]} '2*pi*%s\n' $ray
read -u $COPROC answer
echo $answer
263.89378290154263202896

printf >&${COPROC[1]} 'pi*%s^2\n' $ray
read -u $COPROC answer
echo $answer
5541.76944093239527260816

As bc stays ready and running in the background with its I/O already set up, there is no delay and nothing to load, open, or close before or after an operation: only the operation itself! This is a lot quicker than forking to bc for each operation!

Side effect: while bc stays running, it holds all its registers, so some variables or functions can be defined in an initialisation step, as the first writes to ${COPROC[1]}, just after starting the task (via coproc).
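
For example, you could define a reusable bc function once, just after starting the coprocess (the function name is illustrative; this continues the session above, where pi is already set):

echo >&${COPROC[1]} 'define deg2rad(d) { return(d*pi/180) }'
echo >&${COPROC[1]} 'deg2rad(90)'
read -u $COPROC answer
echo $answer
1.57079632679489661922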

Wrapping this into a function: newConnector

You may find my newConnector function on GitHub.com or on my own site. (Note: on GitHub, the function and the demo are in two files; on my site they are bundled into one file, which can be sourced for use or just run as a demo.)

Sample:

source shell_connector.sh

tty
/dev/pts/20

ps --tty pts/20 fw
    PID TTY      STAT   TIME COMMAND
  29019 pts/20   Ss     0:00 bash
  30745 pts/20   R+     0:00  \_ ps --tty pts/20 fw

newConnector /usr/bin/bc "-l" '3*4' 12

ps --tty pts/20 fw
    PID TTY      STAT   TIME COMMAND
  29019 pts/20   Ss     0:00 bash
  30944 pts/20   S      0:00  \_ /usr/bin/bc -l
  30952 pts/20   R+     0:00  \_ ps --tty pts/20 fw

declare -p PI
bash: declare: PI: not found

myBc '4*a(1)' PI
declare -p PI
declare -- PI="3.14159265358979323844"

The function myBc lets you use the background task with a simple syntax.

Then for date:

newConnector /bin/date '-f - +%s' @0 0
myDate '2000-01-01'
  946681200
myDate "$(ps ho lstart 1)" boottime
myDate now now
read utm idl </proc/uptime
myBc "$now-$boottime" uptime
printf "%s\n" ${utm%%.*} $uptime
  42134906
  42134906

ps --tty pts/20 fw
    PID TTY      STAT   TIME COMMAND
  29019 pts/20   Ss     0:00 bash
  30944 pts/20   S      0:00  \_ /usr/bin/bc -l
  32615 pts/20   S      0:00  \_ /bin/date -f - +%s
   3162 pts/20   R+     0:00  \_ ps --tty pts/20 fw

From there, if you want to end one of the background processes, you just have to close its fds:

eval "exec $DATEOUT>&-"
eval "exec $DATEIN>&-"
ps --tty pts/20 fw
    PID TTY      STAT   TIME COMMAND
   4936 pts/20   Ss     0:00 bash
   5256 pts/20   S      0:00  \_ /usr/bin/bc -l
   6358 pts/20   R+     0:00  \_ ps --tty pts/20 fw

This is not strictly needed, because all file descriptors are closed when the main process finishes.

10

  • The nested sample above is what I was looking for. There may be a simpler way, but what I was looking for was the way to find out if a docker container already exists given its name in an environment variable. So for me: EXISTING_CONTAINER=$(docker ps -a | grep "$(echo $CONTAINER_NAME)") was the statement I was looking for.

    Aug 2, 2017 at 18:02


  • 3

    @capricorn1 That’s a useless use of echo; you want simply grep "$CONTAINER_NAME"

    – tripleee

    Nov 15, 2017 at 4:20

  • 2

    Instead of all the “Edits” notes and strikeovers (that is what the revision history is for), it would be better to have it as if this answer was written today. If there are some lessons to be learned it could be documented in a section, e.g. “Things not to do”.

    Nov 10, 2020 at 0:26


  • 2

    @Cadoiz Yes, there were some typos… read _ drops, not just skips… and so on. Answer edited; added a link to the CSV parser sample. Thanks!

    Oct 28, 2021 at 13:26