6 Useful Linux One Liners

Individual Linux commands can be combined on the command line to accomplish tasks that would otherwise require writing a shell script.

This article provides 6 Linux one-liners that will help you accomplish a few useful tasks.

1. Display Username and UID sorted by UID Using cut, sort and tr

The cut command extracts specific fields from a file. The following example cuts the username and UID fields from the /etc/passwd file and sorts the output numerically on the UID field, telling sort to use ':' as the field delimiter.

To format the output, the tr command then converts the ':' separator to any character you like; here it becomes a tab. (A comma variant is shown after the output below.)

$ cut -d ':' -f 1,3 /etc/passwd | sort -t ':' -k2n - | tr ':' '\t'
root    0
daemon    1
bin    2
sys    3
sync    4
games    5
man    6
lp    7
mail    8
news    9
uucp    10
proxy    13
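
As a small variation (not from the original example), the output separator can be any character that tr accepts; for instance, a comma instead of a tab:

$ cut -d ':' -f 1,3 /etc/passwd | sort -t ':' -k2n | tr ':' ','

This prints the same list as above, with each username and UID separated by a comma (root,0 and so on).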

2. Find the List of Unique Words in a File Using tr, sed and uniq

The following example lists the words that consist only of alphabetic characters. The tr command converts every character other than an alphabet to a newline, so each word ends up on its own line, along with a number of empty lines. The sed command removes those empty lines, and finally the output is sorted and passed through uniq to count each distinct word.

For this example, let us use the following Readme1.txt file.

$ cat Readme1.txt
Installtion Steps:

Run the below steps as root user:

1.Copy the output file at any temporary location.
2.Unzip the file using unzip command.
3.Copy the tar to data directory

Now, execute the following command to find the list of unique words in the above Readme1.txt file.

$ tr -c a-zA-Z '\n' < Readme1.txt  | sed '/^$/d' | sort | uniq -i -c
1 any
1 as
1 at
1 below
1 command
2 Copy
1 data
1 directory
2 file
1 Installtion
1 location
1 output
1 root
1 Run
2 steps
1 tar
1 temporary
4 the
1 to
2 unzip
1 user
1 using
$

Note: uniq with -i ignores case, so the count of the word 'unzip' shows as two (unzip and Unzip).
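
To see the effect of -i, a quick check (not part of the original example) is to drop it and filter for that one word; each capitalization is then counted separately:

$ tr -c a-zA-Z '\n' < Readme1.txt | sed '/^$/d' | sort | uniq -c | grep -i unzip

Without -i, 'Unzip' and 'unzip' each show a count of 1 (the order of the two lines depends on your locale's collation).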

The Linux sed command plays a vital role in text manipulation operations. Please refer to our Sed Tutorial for a detailed explanation of the sed command.

3. Join Two Files (Where one file is not sorted) Using sort and join

The join command joins two files based on a common field. For join to work properly, both files should be sorted on that field. If one file is sorted and the other is not, the following example shows how to join them anyway.

In the example below, the file m1.txt has the employee name and employee ID, and it is not sorted. The second file, m2.txt, has the employee name and department name, and it is already sorted. To join these two files, sort the first file and feed the sorted output to join as one of its input streams (the '-' argument stands for stdin). Without any options, join uses the first field of each file as the common field.

$ cat m1.txt
Jincy 500
Amit 300
Saurab 100
Jobi 400
Kumar 200

$ cat m2.txt
Amit Monitoring
Jincy Marketing
Jobi Accounts
Kumar Sales
Saurab Maintenence

$ sort m1.txt | join - m2.txt
Amit 300 Monitoring
Jincy 500 Marketing
Jobi 400 Accounts
Kumar 200 Sales
Saurab 100 Maintenence
$
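
If neither file happens to be sorted, both can be sorted on the fly with process substitution; a sketch of the same join, assuming a shell that supports the <( ) syntax, such as bash:

$ join <(sort m1.txt) <(sort m2.txt)

This produces the same joined output as above.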

4. Find Out Which Process is Using Up Your Memory Using ps, awk and sort

The following command lists all processes, sorted by the amount of memory they use.

$ ps aux | awk '{if ($5 != 0 ) print $2,$5,$6,$11}' | sort -k2n
PID VSZ RSS COMMAND
3823 3788 484 /sbin/mingetty
3827 3788 484 /sbin/mingetty
3830 3788 484 /sbin/mingetty
3833 3788 488 /sbin/mingetty
3834 3788 484 /sbin/mingetty
3873 3788 484 /sbin/mingetty
2173 3796 568 /usr/sbin/acpid
1835 3800 428 klogd
1832 5904 596 syslogd
2054 5932 540 /usr/sbin/sdpd
2281 6448 360 gpm

The above command lists the PID, the virtual memory size (VSZ), the resident set size (RSS) and the process command. The output is sorted by VSZ.

While debugging performance issues, use this command to find out which processes are using up memory.
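
To rank processes by resident memory (RSS) instead of virtual size, a small variation on the same pipeline (not from the original article) changes the sort key to the third column of the trimmed output and keeps only the biggest consumers:

$ ps aux | awk '{if ($5 != 0 ) print $2,$5,$6,$11}' | sort -k3n | tail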

Awk is an extremely useful language for manipulating structured data quickly. We have written tons of articles on awk earlier. Start by reading the Awk Introduction article, and then dive in deeper with the whole awk tutorial series.

5. Find Out the Top 10 Largest Files or Directories Using du, sort and head

The du command shows the summarized disk usage, in KB, of each file and directory under a given location (/var/log/* here). The output is then sorted in reverse numerical order by size, and head keeps the 10 largest entries.

# du -sk /var/log/* | sort -r -n | head -10
1796    /var/log/audit
1200    /var/log/sa
612     /var/log/anaconda.log
512     /var/log/wtmp
456     /var/log/messages.4
92      /var/log/messages.2
76      /var/log/scrollkeeper.log
72      /var/log/secure
56      /var/log/cups
48      /var/log/messages.1
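
A variant for systems with GNU coreutils (an assumption; the -h flags below are GNU extensions): du -sh prints human readable sizes and sort -h knows how to order them, so the same report can be written as:

# du -sh /var/log/* | sort -rh | head -10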

6. Find Out the Top 10 Most Used Commands

Ever wondered which commands you type most often at your command line? If you are like most people, it might be pwd or ls. Check it out yourself using this one-liner.

Bash maintains all the commands you execute in a hidden file called .bash_history under your home directory, as we’ve explained earlier in our 15 Examples to Master Linux Command History article.

Use the following one-liner to identify which commands you execute most often from your command line.

$ cat ~/.bash_history | tr "\|\;" "\n" | sed -e "s/^ //g" | cut -d " " -f 1 | sort | uniq -c | sort -n | tail -n 15
11 ssh
12 shutdown
15 cp
15 vncserver
22 cat
23 find
23 pwd
24 mv
25 ovc
47 grep
58 ps
67 vi
74 ll
117 ls
118 cd

The above example performs the following operations to get the list of most used commands (a commented breakdown of the pipeline follows the list):

  • Multiple commands can appear on a single command line, separated by '|' or ';'. To count each command separately, the tr command converts the pipe characters and semicolons to newlines.
  • The sed command removes any leading space.
  • The cut command keeps only the command name (the first field).
  • sort groups the commands together and uniq -c counts the number of occurrences of each one.
  • The final sort orders them by the number of occurrences, and tail prints only the last 15 lines (the most used commands).
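
The same pipeline, spread over multiple lines with a comment per stage (purely for readability; it is the identical command, using the trailing-pipe continuation that bash allows):

$ cat ~/.bash_history |    # read the saved command history
    tr "\|\;" "\n" |       # put commands separated by '|' or ';' on their own lines
    sed -e "s/^ //g" |     # strip a leading space, if any
    cut -d " " -f 1 |      # keep only the command name (the first word)
    sort |                 # group identical command names together
    uniq -c |              # count each command
    sort -n |              # order by the count, smallest first
    tail -n 15             # show the 15 most frequently used commands

To get a strict top 10, change tail -n 15 to tail -n 10.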

What is your favorite Linux one liner? Leave a comment and let us know.

Comments on this entry are closed.

  • Madharasan September 28, 2010, 2:42 am

    # head -6 /thegeekstuff/RameshNatarajan/BestOneLinerOfLinux.txt
    $ cut -d ':' -f 1,3 /etc/passwd | sort -t ':' -k2n - | tr ':' '\t'
    $ tr -c a-zA-Z '\n' < Readme1.txt | sed '/^$/d' | sort | uniq -i -c
    $ sort m1.txt | join - m2.txt
    $ ps aux | awk '{if ($5 != 0 ) print $2,$5,$6,$11}' | sort -k2n
    # du -sk /var/log/* | sort -r -n | head -10
    $ cat ~/.bash_history | tr "\|\;" "\n" | sed -e "s/^ //g" | cut -d " " -f 1 | sort | uniq -c | sort -n | tail -n 15

  • Chris F.A. Johnson September 28, 2010, 12:19 pm

    #2 does not give the output shown; sort requires the -f option, and the sed command is unnecessary — use the -s option with tr:

    tr -sc a-zA-Z '\n' < Readme1.txt | sort -f | uniq -ic

  • dj September 28, 2010, 4:49 pm

    Regarding #6, redirect ~/.bash_history to tr and the cat can be eliminated (like #2)

    # What other 3 character extension files exist. Find all files with 3 character extensions
    # and eliminate the known to find the unknown.

    find . -regextype posix-extended \
    -iregex ".+\.[[:alpha:]]{3}$" \
    -a ! \
    -iregex ".+\.(pdf|css|png|jpg|txt|gif|htm|exe|bin|flv|php|xml)$"

  • Arun Maurya September 29, 2010, 5:03 am

    thanks a lot for very nice and useful one liners

  • mguy September 29, 2010, 4:14 pm

    1) what network connections are in use?
    $ netstat -ant |cut -c69- |sort |uniq -c
    1
    1 ESTABLISHED
    7 LISTEN
    1 State
    2) 10 newest files in a directory
    $ ls -ltr /var/log/ |tail
    -rw-r----- 1 syslog adm 2127 2010-09-29 17:36 lpr.log
    -rw-r----- 1 syslog adm 66659 2010-09-29 17:36 kern.log
    -rw-r--r-- 1 root root 3168 2010-09-29 17:36 pm-powersave.log
    -rw-r----- 1 syslog adm 14075 2010-09-29 17:36 debug
    -rw-r----- 1 syslog adm 22050 2010-09-29 17:36 daemon.log
    -rw-r--r-- 1 root root 35302 2010-09-29 18:06 Xorg.0.log
    -rw-r----- 1 syslog adm 85133 2010-09-29 18:09 syslog
    -rw-r----- 1 syslog adm 54831 2010-09-29 18:10 auth.log
    -rw-rw-r-- 1 root utmp 28416 2010-09-29 18:10 wtmp
    -rw-rw-r-- 1 root utmp 293168 2010-09-29 18:10 lastlog

  • Mikko October 2, 2010, 2:26 pm

    I’ve never had enough time to really learn to use effectively a reasonable set of the commands for processing text files. That means I just try to get things done with grep or awk — and I’m not even good at writing awk scripts 🙁

  • Anonymous January 21, 2016, 1:18 am

    sort -r -n --> sort -rn