Customize SparkContext using SparkConf.set(...) when using spark-shell

Spark 2.0+

You should be able to use the SparkSession.conf.set method to set some configuration options at runtime, but it is mostly limited to SQL configuration (see the sketch after the Spark < 2.0 example below).

Spark < 2.0

You can simply stop the existing context and create a new one:

import org.apache.spark.{SparkContext, SparkConf}

sc.stop()
val conf = new SparkConf().set("spark.executor.memory", "4g")
val sc = new SparkContext(conf)
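For Spark 2.0+, here is a minimal sketch of runtime configuration through SparkSession. In spark-shell a session named spark already exists; the builder call, appName, and master values below are illustrative assumptions for running outside the shell:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("runtime-config-demo")  // illustrative name
  .master("local[*]")              // illustrative master
  .getOrCreate()

// SQL options can be changed on a live session:
spark.conf.set("spark.sql.shuffle.partitions", "50")
println(spark.conf.get("spark.sql.shuffle.partitions"))  // prints 50

// Core settings such as spark.executor.memory cannot be changed this
// way: they are read when the context starts, and depending on the
// Spark version the call is either ignored or rejected with an error.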