IOPub data rate exceeded in Jupyter notebook (when viewing image)

I want to view an image in Jupyter notebook. It's a 9.9MB .png file.

from IPython.display import Image
Image(filename='path_to_image/image.png')

I get the below error:

IOPub data rate exceeded.
The notebook server will temporarily stop sending output
to the client in order to avoid crashing it.

This is a bit surprising, and it has been reported elsewhere.

Is this expected and is there a simple solution?

(The error message suggests raising the limit via --NotebookApp.iopub_data_rate_limit.)



Solution 1:[1]

Try this from your terminal prompt:

jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10

Solution 2:[2]

I ran into this using networkx and bokeh.

This works for me on Windows 7:

  1. To create a jupyter_notebook_config.py file, with all the defaults commented out, you can use the following command line:

    $ jupyter notebook --generate-config

  2. Open the file and search for c.NotebookApp.iopub_data_rate_limit.

  3. Uncomment the line c.NotebookApp.iopub_data_rate_limit = 1000000 and change it to a higher rate. I used c.NotebookApp.iopub_data_rate_limit = 10000000 (see the snippet below).
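
After the edit, the relevant line in ~/.jupyter/jupyter_notebook_config.py should look roughly like this (the exact default in the generated file may vary by version):

    c.NotebookApp.iopub_data_rate_limit = 10000000  # uncommented; raised from the 1000000 default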

This unforgiving default config has popped up in a lot of places and is tracked in several GitHub issues.

It looks like it might get resolved with the 5.1 release.

Update:

Jupyter Notebook is now on release 5.2.2, so this problem should have been resolved. Upgrade using conda or pip.
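
For example, a typical upgrade command looks like this (use whichever matches how you installed Jupyter):

pip install --upgrade notebook
conda update notebook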

Solution 3:[3]

Removing print statements can also fix the problem.

Apart from loading images, "IOPub data rate exceeded" also happens when your code prints continuously at a high rate, e.g. a print statement inside a for loop that runs over 1000 times (sketched below).
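
A minimal sketch of the kind of loop that can trip the limit (the loop size here is arbitrary, chosen only for illustration):

for i in range(100000):
    print(i)  # thousands of rapid prints like this can exceed the IOPub rate

Removing the print, or throttling it as in Solution 9 below, avoids the error.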

Solution 4:[4]

Typing jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10 in Anaconda PowerShell or the Anaconda Prompt opens Jupyter Notebook with the new configuration. Now try running your query again.

Solution 5:[5]

Some additional advice for Windows 10 users:

  1. If you are using Anaconda Prompt/PowerShell for the first time, type "Anaconda" in the search field of your Windows task bar and you will see the suggested software.
  2. Make sure to open the Anaconda prompt as administrator.
  3. Always navigate to your user directory or the directory with your Jupyter Notebook files first before running the command. Otherwise you might end up somewhere in your system files and be confused by an unfamiliar file tree.

The correct way to open Jupyter Notebook with the new data limit from the Anaconda Prompt on my own Windows 10 PC is:

(base) C:\Users\mobarget\Google Drive\Jupyter Notebook>jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10

Solution 6:[6]

For already running Docker containers, try editing the file ~/.jupyter/jupyter_notebook_config.py inside the container: uncomment the line c.NotebookApp.iopub_data_rate_limit = ... and set it to a high number like 1e10. Then restart the container; that should fix the problem.
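
A rough sketch of that workflow, assuming a running container named jupyter (a hypothetical name; substitute your own, and use whatever editor the image provides):

docker exec -it jupyter bash
# inside the container, edit ~/.jupyter/jupyter_notebook_config.py so that
# the line reads: c.NotebookApp.iopub_data_rate_limit = 1e10
exit
docker restart jupyter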

Solution 7:[7]

I ran into this problem running version 6.3.0. When I tried the top-rated solution by Merlin, the PowerShell prompt notified me that iopub_data_rate_limit has moved from NotebookApp to ServerApp. The solution still worked, but I wanted to mention the variation, especially as the old config handling may eventually be deprecated.
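
Based on that warning, the equivalent command on newer versions would presumably be:

jupyter notebook --ServerApp.iopub_data_rate_limit=1.0e10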

Solution 8:[8]

An easy workaround is to print inside a for loop instead of printing the whole object at once. Printing wcc directly will trigger the error if the graph is huge, so either of the snippets below works as a workaround (assuming import networkx as nx and a graph train_graph from the surrounding code):

wcc = list(nx.weakly_connected_components(train_graph))
for i in range(1, 10):
    print(wcc[i])

for component in wcc:
    print(component)

Solution 9:[9]

Like others pointed out, printing at a high rate can cause this. Resolve it by printing only every k-th iteration, using a modulo check. Example in Python:

k = 10
for i in range(1000):
    if i % k == 0:
        print("Something")

Increase k if the warning persists.

Solution 10:[10]

Using Visual Studio Code, the Jupyter extension is able to handle big outputs. You can launch VS Code from Anaconda Navigator.

Solution 11:[11]

In general, trying to print something that is too long will trigger this error. In my case, printing a string that was 9221593 characters long triggered it.
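
One way to avoid this is to truncate very long strings before printing. A minimal sketch (the 1000-character cutoff is an arbitrary illustrative choice):

s = "x" * 9221593  # roughly the length that triggered the error above
max_chars = 1000
print(s[:max_chars] + f" ... [{len(s) - max_chars} more characters truncated]")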