Openvidu storage space

Hi @cruizba @micael.gallego, what is the solution to free up disk space when storage usage is at 90%?

What OpenVidu edition do you have deployed: CE, PRO or ENTERPRISE?
What version are you using?

Which directory is occupying the most space? You can take a look with this command:

du -hs /opt/openvidu/* | sort -h -r | head -n 10

Can you also check the size of your Docker logs? (In versions <= 2.16.0, Docker logs were not limited, so coturn generated too many logs and sometimes filled the disk.)

du -shc /var/lib/docker/containers/*/*-json.log | tail -n1

With those commands you can tell me which directory is filling your disk.


I am using 2.15.1 PRO (OpenVidu Docs)
26G /opt/openvidu/elasticsearch
du: cannot access ‘/var/lib/docker/containers/*/*-json.log’: No such file or directory (the above command could not find it; I ran it with sudo)
I found below:

27G containers/138ab8116a97edd19ffa7f8146840b4a33755120c3fa8c17c4cd4e6d0de9d300/138ab8116a97edd19ffa7f8146840b4a33755120c3fa8c17c4cd4e6d0de9d300-json.log

11G containers/1cd6b434a5be9e9efadc6c08bececd64665851a5338ac0b86e96907d2f22e285/1cd6b434a5be9e9efadc6c08bececd64665851a5338ac0b86e96907d2f22e285-json.log

7.0G containers/bf207d36aaa2fe03e436a511c3c00c9ffffeb0cfc9f75e84dc95b5f4d0dd5e9b/bf207d36aaa2fe03e436a511c3c00c9ffffeb0cfc9f75e84dc95b5f4d0dd5e9b-json.log

267M containers/2379fc3ea97fc695540360d479b5c8f001bad7b35fb83990ce97b96526303b4d/2379fc3ea97fc695540360d479b5c8f001bad7b35fb83990ce97b96526303b4d-json.log

11M containers/4337284194f958b338d25b8f92f31a2c667d2fa1a9b6425f605a7c7fcabe5679/4337284194f958b338d25b8f92f31a2c667d2fa1a9b6425f605a7c7fcabe5679-json.log

266M containers/b26ab802c75e558f054a5742f28bb59eca71a1c68b970b629894a239bdb85b32/b26ab802c75e558f054a5742f28bb59eca71a1c68b970b629894a239bdb85b32-json.log

Is it ok if I delete those logs?

Yes, you can remove those logs with no problem.
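As a side note, an alternative to deleting the files is truncating them in place, which empties each log while keeping the open file handle the Docker daemon holds on it valid. A minimal sketch, demonstrated on a throwaway file rather than the real path (the real command, run as root, would target /var/lib/docker/containers/*/*-json.log):

```shell
# On the real host this would be (run as root):
#   truncate -s 0 /var/lib/docker/containers/*/*-json.log
# Demonstrated here on a temporary stand-in file:
tmp=$(mktemp -d)
head -c 1048576 /dev/zero > "$tmp/abc123-json.log"   # 1 MiB dummy container log
truncate -s 0 "$tmp/abc123-json.log"                 # empty it in place
wc -c < "$tmp/abc123-json.log"                       # prints 0
rm -rf "$tmp"
```

Truncating avoids the situation where a running container still holds a deleted log file open and the space is not returned to the filesystem.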

I recommend updating to a newer version (latest is 2.20.0) as soon as possible.


Is 2.15 going to be deprecated?

Not deprecated as such, but we can offer better support for newer versions.


@cruizba I have deleted all the log files above, but disk usage still shows the same. Should I reboot, or what should I do?

It should be updated eventually. Did you check again?

What free space does this command show?:

df -h

/dev/xvda1 97G 83G 15G 85% / (I have deleted more than 45 GB of logs)
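A likely explanation for df not changing: if the containers that wrote those *-json.log files are still running, the kernel does not free a deleted file's space until the last process holding it open closes it, so restarting the affected containers (e.g. `docker restart <container-id>`) is what actually releases it. A small self-contained sketch of the effect on Linux, using a throwaway file as a stand-in for a container log:

```shell
# Space from a deleted file is not freed while a process keeps it open --
# this is why df can stay the same after removing *-json.log files that
# running containers are still writing to.
tmp=$(mktemp -d)
exec 3> "$tmp/container-json.log"     # stand-in for the daemon's open log fd
head -c 1048576 /dev/zero >&3         # write 1 MiB through the open descriptor
rm "$tmp/container-json.log"          # "delete" the log while it is still open
# The data still occupies disk, reachable via the open descriptor (Linux):
wc -c < /proc/$$/fd/3                 # prints 1048576
exec 3>&-                             # closing the fd finally frees the space
rm -rf "$tmp"
```

Closing the descriptor here plays the role of restarting the container.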

We recommend updating to newer versions on a regular basis.

The new versions are mostly backward compatible and you do not have to change anything in your application (only minor things that may affect you). Please take a look at the breaking changes section on the release pages:


Have you encountered this kind of disk space issue before?

Yes, mostly in versions <= 2.15.1, where no log rotation policy was in place.

After that, we added this policy for Docker logging in all containers:
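Presumably something along these lines in the deployment's docker-compose files; the values below are illustrative, not necessarily the exact ones OpenVidu ships:

```yaml
# Hypothetical fragment: per-service log rotation in docker-compose.yml
services:
  coturn:
    # ...
    logging:
      driver: json-file
      options:
        max-size: "100m"   # rotate each container log at roughly 100 MB
        max-file: "5"      # keep at most 5 rotated files per container
```

With this in place, Docker's json-file driver rotates and caps the logs itself, so no container can fill the disk on its own.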

Did you try restarting the infrastructure? Are you still having problems with space? Maybe the logs were not deleted correctly.