Increasing Spark application timeout in Jupyter/Livy

I'm using a shared EMR cluster with JupyterHub installed. When the cluster is under heavy load, starting a Spark session fails with a timeout error. How do I increase the timeout for a Spark application from 60 seconds to something greater, like 900 seconds (15 minutes)?



Solution 1:[1]

I've found the correct file to adjust the timeout.

/etc/jupyter/conf/config.json

"livy_session_startup_timeout_seconds": 900

Now the timeout is 900 seconds instead of the previous 60.
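For context, a minimal sketch of what the relevant part of /etc/jupyter/conf/config.json might look like after the change (on a stock EMR JupyterHub setup this is sparkmagic's configuration file; leave any other keys already in your file as they are):

{
  "livy_session_startup_timeout_seconds": 900
}

Depending on the setup, you may need to restart the notebook kernel, or the JupyterHub container itself (on EMR, something like sudo docker restart jupyterhub), before the new value takes effect.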

Solution 2:[2]

Set the following property to a higher value:

# How long the rsc client will wait when attempting to connect to the Livy server
# livy.rsc.server.connect.timeout = 60s

When YARN is the resource manager, the application sits in the ACCEPTED state while resources are unavailable (the application has not started yet).
The value above controls how long the Livy server will wait for resources to become available.
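As a sketch, assuming the stock Livy install keeps its configuration at /etc/livy/conf/livy.conf (adjust the path for your distribution), you would uncomment the property and raise it, e.g. to match the 900 seconds from the question:

livy.rsc.server.connect.timeout = 900s

Livy reads this file at startup, so restart the Livy server afterwards (on EMR, something like sudo systemctl restart livy-server). To check whether a session is still waiting on YARN rather than dead, you can poll Livy's REST API, which listens on port 8998 by default:

curl http://localhost:8998/sessions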

Solution 3:[3]

This was for an Ambari cluster, where we had to tweak:

livy.server.yarn.app-lookup-timeout = 600s

We found it in Livy's config template (livy.conf.template).
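For completeness, a sketch of the corresponding livy.conf entry; in Ambari this would typically go under the custom livy-conf section of the service configuration, followed by a restart of Livy. The 600s value is the one we used above, not a recommendation:

# How long Livy waits to find the application in YARN before giving up on the session
livy.server.yarn.app-lookup-timeout = 600s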

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: sho
Solution 2: DaRkMaN
Solution 3: jon