Multiple SparkContexts error in tutorial

Asked by Aashishchaursiya on Jul 12, 2021

I am new to Spark.

When I attempt to initialize a new SparkContext in pyspark,

from pyspark import SparkContext
sc = SparkContext("local[4]", "test")

I get the following error:

ValueError: Cannot run multiple SparkContexts at once

I'm wondering if my previous attempts at running example code loaded something into memory that didn't clear out.

Answered by Ankit Chauhan

To resolve the "Cannot run multiple SparkContexts at once" error:

Running ./bin/pyspark interactively automatically creates a SparkContext and binds it to the name "sc".

You will see this mentioned on screen when the shell starts up.
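For example, recent PySpark versions print a banner line similar to the following (the exact wording varies by version):

SparkContext available as 'sc' (master = local[*], app id = ...)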

So, you can either stop the existing context at the beginning and create a new one. Note that "del sc" alone only removes the Python name; it is sc.stop() that actually shuts the context down:

sc.stop()
sc = SparkContext.getOrCreate()

or just carry on and use the "sc" that is automatically defined.
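If the error comes up in a standalone script or notebook rather than the shell, a minimal sketch of a safe pattern looks like this (the master "local[4]" and app name "test" are taken from the question above):

from pyspark import SparkConf, SparkContext

# Desired configuration; it is only applied if no context exists yet.
conf = SparkConf().setMaster("local[4]").setAppName("test")

# Reuse the active SparkContext if there is one, otherwise create a new one.
# getOrCreate() never raises "Cannot run multiple SparkContexts at once".
sc = SparkContext.getOrCreate(conf)

print(sc.master, sc.appName)

# Stop the context when finished so a later run can create a fresh one.
sc.stop()

One caveat: if a context is already active, getOrCreate() returns it unchanged and ignores the conf, so the settings above only take effect on a fresh start.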


