Is there a simple command to display the total aggregate size (disk usage) of all files in a directory (folder)?

I have tried these, and they don't do what I want:

  • ls -l, which only displays the sizes of the individual files in a directory, and
  • df -h, which only displays the free and used space on my disks.
David Barry

13 Answers


The command du "summarizes disk usage of each FILE, recursively for directories," e.g.,

du -hs /path/to/directory
  • -h is to get the numbers "human readable", e.g. get 140M instead of 143260 (size in KBytes)
  • -s is for summary (otherwise you'll get not only the size of the folder but also for everything in the folder separately)

As you're using -h you can sort the human readable values using

du -h | sort -h

The -h flag on sort will consider "Human Readable" size values.

If you want to avoid recursively listing every file and directory, you can supply the --max-depth parameter to limit how deep the listing goes. Most commonly, --max-depth=1:

du -h --max-depth=1 /path/to/directory
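The depth limit combines naturally with the sort from above. A quick sketch (the demo directory and file sizes below are fabricated just to have something to measure):

```shell
# Build a small throwaway tree to measure
dir=$(mktemp -d)
mkdir -p "$dir/big" "$dir/small"
head -c 200000 /dev/zero > "$dir/big/data"    # ~200 kB file
head -c 1000   /dev/zero > "$dir/small/data"  # ~1 kB file

# One line per immediate subdirectory plus the directory itself, smallest first
du -h --max-depth=1 "$dir" | sort -h
```

The last line of the output is the directory itself, since its total is at least as large as any child's.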
Marcel Stimberg
  • I use `du -sh` or DOOSH as a way to remember it (NOTE: the command is the same, just the organization of command-line flags for memory purposes) – Marco Ceppi Aug 05 '10 at 18:56
  • There is a useful option to du called --apparent-size. It can be used to find the actual size of a file or directory (as opposed to its footprint on the disk), e.g., a text file with just 4 characters will occupy about 6 bytes, but will still show up as taking ~4K in regular du -sh output. However, if you pass the --apparent-size option, the output will be 6. man du says: --apparent-size print apparent sizes, rather than disk usage; although the apparent size is usually smaller, it may be larger due to holes in ('sparse') files, internal fragmentation, indirect blocks – Hopping Bunny Jun 03 '16 at 04:45
  • This works for OS X too! Thanks, I was really looking for a way to clean up files, both on my local machine and my server, but automated methods seemed not to work. So I ran `du -hs *`, went into the largest directory, and found out which files were so large... This is such a good method, and the best part is you don't have to install anything! Definitely deserved my upvote – Dev Sep 14 '16 at 17:57
  • @BandaMuhammadAlHelal I think there are two reasons: rounding (`du` has somewhat peculiar rounding, showing no decimals if the value has more than one digit in the chosen unit), and the classical [1024 vs. 1000 prefix](https://en.wikipedia.org/wiki/Binary_prefix#Deviation_between_powers_of_1024_and_powers_of_1000) issue. `du` has an option `-B` (or `--block-size`) to change the units in which it displays values, or you could use `-b` instead of `-h` to get the "raw" value in bytes. – Marcel Stimberg May 23 '17 at 09:12
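The --apparent-size behavior described in the comments is easy to see on a throwaway file (GNU du; the file below is a demo created just for the comparison):

```shell
f=$(mktemp)
printf 'abcd' > "$f"        # exactly 4 bytes of content

du -h "$f"                  # allocated size: typically 4.0K on ext4
du -h --apparent-size "$f"  # apparent size, in -h units
du -b "$f"                  # shorthand for --apparent-size --block-size=1; prints 4
```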

Recently I found ncdu, a great ncurses-based interactive tool that quickly gives you an overview of directory sizes. I had searched for that kind of tool for years.

  • quickly drilldown through file hierarchy
  • you can delete e.g. huge temporary files from inside the tool
  • extremely fast

Think of it as baobab for the command line:

apt-get install ncdu
  • `ncdu` is awesome! After installing it, just do this: `ncdu /`. You will very quickly find the biggest files on the system. Also press `h` while inside ncdu's console interface; it has very useful shortcuts – vlad-ardelean Dec 12 '17 at 16:34
  • Cool tool. Use the man page to find all the cool shortcuts, like `a`, which shows the apparent size of all the files. That's the biggest problem I have with `du`: it takes too much typing to see the apparent size, which is usually what I really want to know. – gMale Jan 05 '21 at 21:15

This finds the size recursively and puts it next to each folder name, along with the total size at the bottom, all in human-readable format:

du -hsc *


du foldername

More information on that command is available in its man page (man du).


Below is what I am using to print total, folder, and file size:

$ du -sch /home/vivek/* | sort -rh


From man du:

   -c, --total
          produce a grand total
   -h, --human-readable
          print sizes in human readable format (e.g., 1K 234M 2G)
   -s, --summarize
          display only a total for each argument

From man sort:

   -h, --human-numeric-sort
          compare human readable numbers (e.g., 2K 1G)
   -r, --reverse
          reverse the result of comparisons


 70M    total
 69M    /home/vivek/Downloads/gatling-charts-highcharts-bundle-2.2.2/lib
992K    /home/vivek/Downloads/gatling-charts-highcharts-bundle-2.2.2/results
292K    /home/vivek/Downloads/gatling-charts-highcharts-bundle-2.2.2/target
 52K    /home/vivek/Downloads/gatling-charts-highcharts-bundle-2.2.2/user-files
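When the target directory has many entries, the sorted list gets long; tacking head onto this answer's pipeline shows only the biggest consumers (the demo tree below is fabricated for illustration):

```shell
dir=$(mktemp -d)
mkdir -p "$dir/a" "$dir/b" "$dir/c"
head -c 300000 /dev/zero > "$dir/a/f"
head -c 30000  /dev/zero > "$dir/b/f"
head -c 3000   /dev/zero > "$dir/c/f"

# Grand total first (it is the largest value), then the top two entries
du -sch "$dir"/* | sort -rh | head -n 3
```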
Peter Mortensen

tree is another useful command for this job:

Just install it via sudo apt-get install tree and type the following:

tree --du -h /path/to/directory

33.7M used in 0 directories, 25 files

From man tree:

-h    Print  the size of each file but in a more human readable way, e.g. appending a size letter for kilo‐
      bytes (K), megabytes (M), gigabytes (G), terabytes (T), petabytes (P) and exabytes (E).

--du  For each directory report its size as the accumulation of sizes of all its files and  sub-directories
      (and their files, and so on). The total amount of used space is also given in the final report (like
      the 'du -c' command.)

The answers have made it obvious that du is the tool to find the total size of a directory. However, there are a couple of factors to consider:

  • Occasionally, du output can be misleading because it reports the space allocated by the filesystem, which may be different from the sum of the sizes of the individual files. Typically the filesystem will allocate 4096 bytes for a file even if you stored just one character in it!

  • Output can differ due to power-of-2 versus power-of-10 units. The -h switch to du divides the number of bytes by 2^10 (1024), 2^20 (1048576), etc. to give a human-readable output. Many people might be more habituated to seeing powers of 10 (e.g. 1K = 1000, 1M = 1000000) and be surprised by the result.
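The second point can be demonstrated with GNU du's --si switch, which uses powers of 1000 instead of 1024 (the 2 MiB demo file below is created just for the comparison):

```shell
f=$(mktemp)
head -c 2097152 /dev/zero > "$f"  # write exactly 2 MiB

du -h   "$f"  # powers of 1024: typically 2.0M
du --si "$f"  # powers of 1000: typically 2.1M for the same file
```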

To find the total sum of sizes of all files in a directory, in bytes, do:

find <dir> -ls | awk '{sum += $7} END {print sum}'


Compare the output of these two commands:

$ du -s -B 1

$ find . -ls | awk '{sum += $7} END {print sum}'
  • The find-ls-awk will return a wrong value for large folders [#1](http://stackoverflow.com/questions/8857866/printing-long-integers-in-awk). For newer awk you can add `--bignum` or `-M` ; if that is not an option use `find . -ls | tr -s ' '|cut -d' ' -f 7| paste -sd+ |bc` [#2](http://stackoverflow.com/questions/21277631/awk-sum-of-large-integers). – goozez Jun 29 '16 at 20:25
  • If powers of 2 being used is a problem, there's the `--si` option: "like -h, but use powers of 1000 not 1024" – muru Dec 15 '17 at 03:25
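As the first comment notes, parsing column 7 of find -ls is fragile. With GNU find, a sketch that prints each file's size directly and avoids the column parse entirely (the temp files are created only for the demo):

```shell
dir=$(mktemp -d)
head -c 1234 /dev/zero > "$dir/a"
head -c 4321 /dev/zero > "$dir/b"

# -printf '%s\n' emits each regular file's apparent size in bytes
find "$dir" -type f -printf '%s\n' | awk '{sum += $1} END {print sum}'
# prints 5555 (1234 + 4321)
```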

To see the sizes of all files and directories, use

du -had1 dir/

(maybe like "do you had 1")

  • -h: human readable sizes
  • -a: show files, not just directories
  • -d1: show totals only at depth 1, i.e. the current directory's contents

For the current directory, the directory argument can be left off.

du -sh dir/* has the same effect but doesn't show hidden files and directories due to shell globbing.
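In bash, the dotglob option makes * match hidden names too, which works around the globbing caveat just mentioned (the demo directory below is fabricated):

```shell
dir=$(mktemp -d)
head -c 100 /dev/zero > "$dir/visible"
head -c 100 /dev/zero > "$dir/.hidden"

du -sh "$dir"/*   # one line: .hidden is skipped by the glob

shopt -s dotglob  # bash-only: let * match dot files as well
du -sh "$dir"/*   # two lines: both entries now counted
shopt -u dotglob
```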


You can use the tool Dust:

PS C:\git> dust
   0B       ┌── templates           │                                      █ │   0%
   0B     ┌─┴ git-core              │                                      █ │   0%
   0B   ┌─┴ share                   │                                      █ │   0%
  76B   ├── readme.md               │                                      █ │   0%
 156K   │   ┌── less.exe            │▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒█ │   2%
 2.7M   │   ├── git-remote-https.exe│▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒█████████████████ │  42%
 3.6M   │   ├── git.exe             │▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒██████████████████████ │  56%
 6.5M   │ ┌─┴ git-core              │███████████████████████████████████████ │ 100%
 6.5M   ├─┴ libexec                 │███████████████████████████████████████ │ 100%
 6.5M ┌─┴ .                         │███████████████████████████████████████ │ 100%

My example is from Windows, but Linux and macOS are also supported.

I'm conditioned to the ll command which is aliased to ls -alF. It is just missing a file count and size of files at the bottom. I played with du and tree but could not get the totals I needed. So I created lll to do that for me.

In your ~/.bashrc place the following:

lll () {
    ls -alF "$@"
    arr=($(ls -alF "$@" | awk '{TOTAL+=$5} END {print NR, TOTAL}'))
    printf " \33[1;31m ${arr[0]}\33[m line(s).  "
    printf "Total size: \33[1;31m ${arr[1]}\33[m\n"
#    printf "Total size: \33[1;31m $(BytesToHuman <<< ${arr[1]})\33[m\n"
}

Save the file and resource it using . ~/.bashrc (or you can restart your terminal).

Sample output

The nice thing about ll output is its colors. This is maintained with lll but lost when using find or du:

(screenshot: lll sample output)


A bonus function you can add to ~/.bashrc is called BytesToHuman(). This does what most console users would expect converting large numbers to MiB, GiB, etc:

function BytesToHuman() {

    # https://unix.stackexchange.com/questions/44040/a-standard-tool-to-convert-a-byte-count-into-human-kib-mib-etc-like-du-ls1/259254#259254

    read StdIn

    b=${StdIn:-0}; d=''; s=0; S=(Bytes {K,M,G,T,P,E,Z,Y}iB)
    while ((b > 1024)); do
        d="$(printf ".%02d" $((b % 1024 * 100 / 1024)))"
        b=$((b / 1024))
        let s++
    done
    echo "$b$d ${S[$s]}"

} # BytesToHuman ()

Next flip the comment between two lines in lll () function to look like this:

#    printf "Total size: \33[1;31m ${arr[1]}\33[m\n"
    printf "Total size: \33[1;31m $(BytesToHuman <<< ${arr[1]})\33[m\n"

Now your output looks like this:

(screenshot: lll sample output with human-readable total)

As always don't forget to re-source with . ~/.bashrc whenever making changes. (Or restart the terminal of course)

PS - Two weeks in self-quarantine finally gave me time to work on this five year old goal.


If your desired directory has many sub-directories then, use the following:

$ cd ~/your/target/directory
$ du -csh 

-c, --total produce a grand total
-s, --summarize display only a total for each argument
-h, --human-readable print sizes in human readable format (e.g., 1K 234M 2G)

which then produces an overall total of the disk usage of all files and folders in the current directory.


For only the directory size in a readable format, use the below:

du -hs directoryname

This probably isn't in the correct section, but from the command line, you could try:

ls -sh filename

The -s is size, and the -h is human readable.

Use -l to show it in a long ls listing, like below:

ls -shl
Peter Mortensen
Shiv Singh

du /foldername is the standard command to find the size of a folder. It is best practice to learn its options by reading the man page:

man du

You should read the man page (available online) before you use the command.
