How to close the Spark instance

I want to stop my Spark instance once my job running in a Jupyter notebook completes. I execute spark.stop() at the end, but when I open my terminal I still see the Spark process listed by ps -ef | grep spark, so every time I have to kill the Spark process ID manually. Does anyone know how to solve this problem? Thanks!

from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .master("local") \
    .appName("Test") \
    .config("spark.executorEnv.PYTHONPATH", "pyspark.zip:py4j-0.10.7-src.zip") \
    .config("spark.jars", "/Users/xxx/Documents/snowflake-jdbc-3.12.8.jar,/Users/xxx/Documents/spark-snowflake_2.11-2.7.2-spark_2.4.jar") \
    .config("spark.jars.packages", "org.apache.hadoop:hadoop-aws:2.7.3") \
    .getOrCreate()


Solution 1:[1]

Try shutting down the SparkContext instead of only the Spark session. You can try the following:

sc.stop()

or

spark.sparkContext.stop()

and then you can do

spark.stop()
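
Putting these calls together, here is a minimal sketch of the shutdown sequence at the end of a notebook run; the sample job (spark.range(...).count()) is a hypothetical placeholder, and the extra .config(...) lines from the question are omitted for brevity:

from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .master("local") \
    .appName("Test") \
    .getOrCreate()

try:
    # hypothetical job standing in for the real notebook workload
    spark.range(10).count()
finally:
    # stop the underlying SparkContext first, then the session itself;
    # both calls are safe to repeat, so combining them does no harm
    spark.sparkContext.stop()
    spark.stop()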

Solution 2:[2]

I've had luck closing the session by running:

exit()
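
This helps because, in local mode, pyspark typically launches the Spark JVM as a child of the Python driver (here, the Jupyter kernel), and that JVM generally stays visible in ps -ef until the driver process itself ends; exit(), or restarting the notebook kernel, ends the driver and takes the JVM with it. A minimal sketch, assuming a plain Python script rather than a notebook, where ending the driver process is explicit:

import sys
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local").appName("Test").getOrCreate()
spark.range(10).count()   # hypothetical job
spark.stop()              # release Spark resources
sys.exit(0)               # ending the driver process also ends the launched JVM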

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution    Source
Solution 1  code.gsoni
Solution 2  willtwait