The sponge command is part of the moreutils package. It is a utility that provides a function so simple it’s genius. Its basic use is to soak up (get it? sponge..) standard input and write it to a file. The terminology “soak up” is more than just a fun play on words. In this short tutorial we show you the sponge command’s basic usage and why the term “soak up” is important.
To use the sponge command you must install the moreutils package.
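It lives in the standard repositories of most distributions; the exact package manager command depends on your distro, for example:

```shell
# Debian / Ubuntu
sudo apt install moreutils

# Fedora (on RHEL you may need the EPEL repository enabled)
sudo dnf install moreutils

# Arch Linux
sudo pacman -S moreutils
```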
How To Use Sponge Command
The most basic way to use sponge is to call it at the command line and give it some input. You can use Ctrl+d (EOF) to signal the end of your input.
The sponge command takes a single filename as an argument and has just one option, -a, which appends to the file instead of overwriting it.
$ sponge newfile.txt
This is all going to be written to newfile.txt
I can keep typing, hitting enter for a newline
When done typing, hit Ctrl+d to signal end of file
$ cat newfile.txt
This is all going to be written to newfile.txt
I can keep typing, hitting enter for a newline
When done typing, hit Ctrl+d to signal end of file
My first instinct was to question the usefulness of the command. I asked, why can’t I just create a new file with vim or some other editor? Well, you can; it just takes a few more steps. The beauty of this utility comes in how we can use it in pipelines (read below).
Sponge Command vs Shell Redirection
The real power of the sponge command becomes apparent when you start using it in pipes. At first I questioned whether it was necessary since you can just use redirection. For example:
$ echo "Put this text in a file" | sponge file1.txt
$ echo "Put this text in a file" > file2.txt
Both of the above commands produce the desired result. So why bother with sponge? That is where the “soak up” terminology becomes important. Sponge will “soak up” (read in) ALL of its input before it opens the output file. A redirect, on the other hand, opens (and truncates) the file for writing before the command even starts, then streams output into it. If you try to read and write the same file through a redirect, you run a real risk of destroying its contents.
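You can watch the truncation happen with a command that has no same-file safety check of its own. Here is a small sketch (the filename demo.txt is just an example; run it in a scratch directory):

```shell
cd "$(mktemp -d)"             # work in a throwaway directory
printf 'one\ntwo\n' > demo.txt

# The shell truncates demo.txt when it sets up the > redirection,
# before tr reads a single byte, so there is nothing left to read:
tr 'a-z' 'A-Z' < demo.txt > demo.txt

wc -c < demo.txt              # 0 -- the contents are gone
```

Routing the same transformation through sponge (tr 'a-z' 'A-Z' < demo.txt | sponge demo.txt) avoids the loss, because sponge only opens demo.txt for writing after it has read everything.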
An example of this would be reading and writing changes to the same file inside of a pipeline. Let’s use the following file.
$ cat cities.txt
# Cities in Pennsylvania
Philadelphia
Pittsburgh
Harrisburg
# Cities in New Jersey
Camden
Trenton
Newark
Now let’s assume you want to remove all of the comments, i.e. the lines that start with #. You can use grep like so:
$ grep -v '^#' cities.txt
Philadelphia
Pittsburgh
Harrisburg
Camden
Trenton
Newark
However, that doesn’t change the file; it just prints the contents to standard output with the comment lines removed. How about redirecting the output back into the same file?
$ grep -v '^#' cities.txt > cities.txt
grep: input file ‘cities.txt’ is also the output
$ cat cities.txt
$
Doh! Grep doesn’t like that, and all you accomplished was emptying your file: the shell truncated cities.txt before grep ever got to read it (had you used the append operator, >>, you would at least not have lost the existing contents).
Most people would resort to using a temporary file to hold the updated contents. Then write that temporary file back to the original.
$ grep -v '^#' cities.txt > /tmp/cities.tmp
$ mv /tmp/cities.tmp ./cities.txt
$ cat cities.txt
Philadelphia
Pittsburgh
Harrisburg
Camden
Trenton
Newark
Temporary files come with their own risks and complications. Read “Working with Temporary Files in Shell Scripts” for more information.
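If you do go the temporary-file route, a slightly safer sketch uses mktemp and a cleanup trap (the filenames here are just examples):

```shell
# mktemp picks an unpredictable, collision-free name, and the trap
# removes the temp file if anything fails before the mv.
tmp=$(mktemp) || exit 1
trap 'rm -f "$tmp"' EXIT

grep -v '^#' cities.txt > "$tmp" && mv "$tmp" cities.txt
```

The && matters: it ensures the original file is only replaced if grep succeeded.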
The sponge command fixes all of these problems. Since it reads ALL of its input before opening the output file for writing, you can safely read from and write to the same file in a pipeline.
$ grep -v '^#' cities.txt | sponge cities.txt
$ cat cities.txt
Philadelphia
Pittsburgh
Harrisburg
Camden
Trenton
Newark
NOTE: I know there are other options, like sed; this is just an example. The sponge syntax is also much easier to remember than sed’s.
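For the record, here is what the sed version would look like. GNU sed’s -i flag edits a file in place by writing to its own temporary file behind the scenes:

```shell
# Delete comment lines from cities.txt in place (GNU sed)
sed -i '/^#/d' cities.txt
```

So sed solves the same read-and-write-the-same-file problem, just with a less memorable incantation (and -i behaves differently on BSD/macOS sed).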
Resources and Further Reading
- Sponge Man Page (Sorry it’s not much to read)