Jupyter Notebook Won't Open: Errno 122

I'm trying to open Jupyter Notebook for some fairly heavy data analysis (pulling in datasets from hdf5 files and looking at certain aspects of them). I've done this in the past, but recently had to log out and am now getting

IOError: [Errno 122] Disk quota exceeded: u'/nethome/myname/.local/share/jupyter/runtime-nbserver-14392.json'

I've already seen that if I try to look at too much from any dataset at a time or too many datasets, it kills the kernel, but now I can't even open it.

I've looked at the info at https://superuser.com/questions/1427724/jupyter-oserror-errno-122-disk-quota-exceeded, but using the solution there, du --si -s $HOME shows that I'm only using 1.1G, well below my allowable quota.
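Since df reports filesystem free space rather than per-user limits, it can look fine while a quota is still exceeded. A hedged sketch of further checks, assuming a standard Linux quota setup (the quota command may not be installed on every cluster, and the inode angle is an assumption worth ruling out, since quotas can cap file count as well as bytes):

```shell
# Ask the quota system directly for this user's limits, if available.
# (df cannot see per-user quotas on a network home directory.)
quota -s 2>/dev/null || true

# du -s only measures bytes; a quota can also be exhausted by the
# *number* of files (inodes). Count everything under $HOME:
find "$HOME" -xdev 2>/dev/null | wc -l

# The Jupyter runtime directory is tiny in bytes but gains one JSON
# file per server start, which adds up against an inode limit:
ls "$HOME/.local/share/jupyter/runtime" 2>/dev/null | wc -l
```

If the file count is near a limit reported by quota, the fix is deleting files, not freeing bytes.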

I also looked at "Jupyter Notebook says that my disk quota is being exceeded, but I have plenty of space in my home directory", but frankly it didn't seem as helpful.

I've checked my disk usage and it seems I have a fair amount of space left.

(base) [myname@mycomputer ~]$ df -h
Filesystem                      Size  Used Avail Use% Mounted on
devtmpfs                         16G     0   16G   0% /dev
tmpfs                            16G   61M   16G   1% /dev/shm
tmpfs                            16G  1.6G   15G  10% /run
tmpfs                            16G     0   16G   0% /sys/fs/cgroup
/dev/mapper/physvg01-slash       79G   21G   54G  28% /
/dev/sda2                       976M  344M  565M  38% /boot
/dev/sda1                       300M  9.9M  290M   4% /boot/efi
/dev/mapper/physvg01-var         20G   14G  4.9G  74% /var
/dev/mapper/physvg01-localdata  1.7T  1.4T  279G  84% /localdata
phys-file:/data/home/myname      12T  5.7T  5.6T  51% /nethome/myname
tmpfs                           3.2G   68K  3.2G   1% /run/user/1088055

Any help is appreciated as I need to do some analysis in there as soon as possible. Thanks.



Solution 1:[1]

Well, it seems your local machine no longer has sufficient storage for this kernel. So my suggestion is to try some alternatives such as Google Colab, Kaggle Notebooks, or GCP AI Platform notebooks (not free once you have signed up and used up the free quota). I hope it works!

Solution 2:[2]

Faced the same issue.
I noticed that on every attempt, a new runtime-nbserver-****.json file is created.

I solved it by removing the recent files with rm /path/to/runtime-nbserver-files.
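The cleanup above can be sketched as follows, assuming the runtime directory location from the error message (Jupyter also honors the JUPYTER_RUNTIME_DIR environment variable, and jupyter --runtime-dir prints the actual path if yours differs). The one-day age cutoff is an assumption: a live server keeps its runtime file current, so old files are usually stale.

```shell
# Assumed location, taken from the path in the Errno 122 message;
# override via JUPYTER_RUNTIME_DIR if your setup differs.
RUNTIME_DIR="${JUPYTER_RUNTIME_DIR:-$HOME/.local/share/jupyter/runtime}"

if [ -d "$RUNTIME_DIR" ]; then
    # Inspect what has piled up before deleting anything.
    ls -l "$RUNTIME_DIR"

    # Remove runtime JSON files older than one day; running servers
    # refresh their files, so stale ones are safe to delete (assumption).
    find "$RUNTIME_DIR" -name '*.json' -mtime +1 -delete
fi
```

After the cleanup, starting jupyter notebook again should be able to write its new runtime-nbserver-*.json file.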

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Memphis Meng
Solution 2: ofir1080