How to fix the running out of disk space error in Docker?


Asked 6 years, 2 months ago

Modified 1 month ago

Viewed 135k times

71

When I try to build a Docker image I get an out-of-disk-space error, and after investigating I find the following:

df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/vda1        4G   3.8G     0 100% /

How do I fix this out of space error?

asked Jul 9, 2018 by Pritam Banerjee (edited Jul 10, 2018)

12 Answers


130

docker system prune [-a]

https://docs.docker.com/engine/reference/commandline/system_prune/

This cleans up all unused images, containers, networks, and volumes. We generally clean up old images when creating a new one, but you could also run this as a scheduled task on your Docker server every day.

Other answers address listing Docker's disk usage and increasing the amount of disk space available in Docker Desktop:

The docker system df command can be used to view reclaimable space. – Abishek_Jain

Open up the Docker settings -> Resources -> Advanced and increase the amount of hard drive space it can use under disk image size. – Nico
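For unattended cleanup, the prune can be scripted; a minimal sketch (the `-a` flag and the 03:00 schedule are arbitrary choices, and note that adding `--volumes` would also delete unused volumes, i.e. data):

```shell
# Non-interactive prune: -f skips the confirmation prompt,
# -a also removes unreferenced images (not just dangling ones).
docker system prune -a -f

# Example crontab entry to run the same cleanup nightly at 03:00
# (install with `crontab -e`; add --volumes only if losing unused
# volume data is acceptable):
# 0 3 * * * /usr/bin/docker system prune -a -f
```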

answered Jul 9, 2018 by Nick Spicer (edited Dec 13, 2023 by rjurney)

  • You need to pass the --volumes flag to prune the volumes as well. Without it, only "unused containers, networks, images (both dangling and unreferenced)" are pruned. – jannis, Jul 27, 2018

  • I feel this is only a temporary solution. We have over 14 GB of free disk space and Docker still says no space is left. We ran docker system prune and it worked, but only for a while. Note that the database was deleted as well, but that's our fault (always make regular backups, people!). – Ziga Petek, Oct 4, 2019


30

Use the command docker system prune -a. This cleans up the total reclaimable size for images, networks, and volumes, removing the reclaimable space of all images not associated with a running container.

Run docker system df to view the reclaimable space.

If there is reclaimable space and the command does not work on the first try, run it a second time and it should clean up. I have been seeing this behaviour almost daily; I plan to report it to the Docker community, but first I want to reproduce it with the latest release to see whether it has already been fixed.

answered Nov 4, 2018 by Abhishek Jain (edited Nov 2, 2022)

20

Open up the Docker settings -> Resources -> Advanced and increase the amount of hard drive space it can use under disk image size.

answered Aug 4, 2020 by Nico

13

If you are using Linux, then most probably Docker is filling up the directory /var/lib/docker/containers, because it writes container logs to the <CONTAINER_ID>-json.log file under this directory. You can clear this file with cat /dev/null > <CONTAINER_ID>-json.log, or set a maximum log file size by editing /etc/sysconfig/docker. More information can be found in this RedHat documentation. In my case, I created a crontab to clear the contents of the file every day at midnight. Hope this helps!

NB:

  1. You can find the container IDs using the command sudo docker ps --no-trunc
  2. You can check the size of the log file using du -sh $(docker inspect --format='{{.LogPath}}' CONTAINER_ID_FOUND_IN_LAST_STEP)
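Instead of truncating by hand or via cron, the default json-file logging driver can cap log size itself through the max-size and max-file options in /etc/docker/daemon.json. A minimal config sketch — the 10m/3 values are arbitrary choices, and the daemon restart stops running containers and only affects newly created ones:

```shell
# Daemon-wide log cap: keep at most 3 rotated files of 10 MB per container
sudo tee /etc/docker/daemon.json <<'EOF'
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "10m",
    "max-file": "3"
  }
}
EOF
sudo systemctl restart docker   # applies only to containers created after this
```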

answered Jun 24, 2020 by kaushik (edited Jun 24, 2020)

4

Nothing else worked for me. I changed the disk image max size in the Docker settings, and immediately afterwards it freed a huge amount of space.

answered Oct 15, 2018 by troger19

  • Wow, I didn't see this option! I was wondering why Windows showed 60 GB of free disk space while Docker containers said "Not enough disk space left". My limit was at 50 GB (which was all used up); I set it to 200 and it worked! – Alex, Jul 13, 2019

  • Well, the command should work as mentioned above; the thing is it doesn't work for me, only manually clicking the button does. – troger19, Apr 9, 2021


4

Not sure if this is still relevant, but if docker system prune is not working and you don't want to go as far as docker system prune -a, you can pick and delete images using either:

  • docker image prune --filter

or

  • picking and deleting them from Docker Desktop
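The prune filters can target images by age, for example; a sketch using the documented until filter (24h is an arbitrary cutoff):

```shell
# Remove dangling images created more than 24 hours ago;
# add -a to also remove unreferenced tagged images.
docker image prune --filter "until=24h"
```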

answered Apr 27, 2023 by L0t (edited May 1, 2023 by cconsta1)

3

Going to leave this here since I couldn't find the answer.

Go to the Docker GUI -> Preferences -> Reset -> Uninstall

Completely uninstall Docker, then install it fresh using this link.

My Docker was using 20 GB of space when building an image; after a fresh install, it uses 3-4 GB max. Definitely helps!

Also, if you are using a MacBook, have a look at ~/Library/Containers/docker*

This folder for me was 60 GB and was eating up all the space on my Mac! Even though this may not be directly relevant to the question, I believe it is worth leaving here.

answered Oct 11, 2019 by DUDANF

  • This actually helped me a lot, thanks. I didn't have to reinstall, but finding where all the junk data that wasn't getting pruned lived did the trick. – Jim Crozier, Feb 16


1

There are at least two directories where Docker can fill up space. Assuming you have a bigger disk partition on your computer, say /bigdisk, here are the possible solutions:

  1. The temp dir in /var/tmp: docker will honour the TMPDIR setting:

mkdir -p /bigdisk/bigtmp
chmod 1777 /bigdisk/bigtmp
export TMPDIR=/bigdisk/bigtmp

  2. /var/lib/containers (older versions of docker might use /var/lib/docker). You can solve this by making /var/lib/containers a symlink to the bigger disk:

mkdir -p /bigdisk/var-lib/containers
ln -s /bigdisk/var-lib/containers /var/lib/containers

If you already have some docker images in /var/lib/containers, you need to move those first. Stop any running docker daemon, then:

mkdir -p /bigdisk/var-lib/containers
# copy existing container content to the big disk
(cd /var/lib/containers; tar cvf - . ) | (cd /bigdisk/var-lib/containers; tar xvf - )
# ensure the copy is good
du -sh /var/lib/containers
du -sh /bigdisk/var-lib/containers
rm -rf /var/lib/containers   # rmdir would fail here: the directory is not empty
ln -s /bigdisk/var-lib/containers /var/lib/containers

Then run docker to verify things work as before.

answered May 1 by MonaPy

0

I wanted to add this to my CI/CD pipeline so I won't have to do it manually on my AWS EC2 instance.

The -f option ensures there is no prompt for confirmation. Without it, I kept getting "Are you sure you want to continue? [y/N]" and my updates failed to deploy.

Here's what worked for me in my .yml file:

deploy:
  steps:
    - name: Prune and free some space in images and containers
      run: docker system prune -f

answered Feb 18 by ChelaTheGreat

0

Unless you are using a logging driver other than the default json_file logging driver with its default unlimited max-size, your container logs may grow large and fill up either your host machine’s disk (Linux host machine) or your docker machine VM’s disk (macOS host machine and probably also Windows). This can be hard to inspect (ref: my question).

kaushik’s answer explains how to truncate the log files on Linux. On macOS though it’s more complicated because the log files are stored in the Docker Machine VM. See my answer here for full details.
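The same size cap can also be set per container at run time through the json-file driver's documented log options; a minimal sketch (the 20m/5 values are arbitrary, and nginx is just a placeholder image):

```shell
# Keep at most 5 rotated log files of 20 MB each for this container only
docker run -d \
  --log-driver json-file \
  --log-opt max-size=20m \
  --log-opt max-file=5 \
  nginx
```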

answered Feb 23 by jbyler

0

I had the same problem on my Mac. I didn't want to remove Docker and found another way:

Go to the Docker GUI -> Troubleshooting -> Reset to factory defaults

This freed up about 50 GB.

answered Apr 17 by Петр Друбов

0

I ran the following command:

docker system prune -a

[screenshots showing disk usage before, during, and after the prune]

answered Jul 18 by M.Namjo
