LocalSparkCluster is a single-JVM Spark Standalone cluster that is available through the local-cluster master URL.

NOTE: A local-cluster master URL matches the local-cluster[numWorkers,coresPerWorker,memoryPerWorker] pattern, where numWorkers, coresPerWorker and memoryPerWorker are all numbers separated by commas.
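The pattern above can be sketched with a plain regular expression (a simplified sketch, not Spark's own code; Spark's matching in SparkContext may differ in details):

```scala
// Simplified sketch (not Spark code): parsing a local-cluster master URL
// into its numWorkers, coresPerWorker and memoryPerWorker components.
object LocalClusterUrl {
  private val Pattern =
    """local-cluster\[\s*([0-9]+)\s*,\s*([0-9]+)\s*,\s*([0-9]+)\s*\]""".r

  def parse(masterUrl: String): Option[(Int, Int, Int)] = masterUrl match {
    case Pattern(workers, cores, memory) =>
      Some((workers.toInt, cores.toInt, memory.toInt))
    case _ => None
  }
}

// local-cluster[2,1,1024] => 2 workers, 1 core and 1024 MB each
println(LocalClusterUrl.parse("local-cluster[2,1,1024]"))
```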

LocalSparkCluster can be particularly useful to test distributed operation and fault recovery without spinning up a lot of processes.

LocalSparkCluster is started when SparkContext is created for a local-cluster master URL (and so requested to create the SchedulerBackend and the TaskScheduler).
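For example, a Spark application can be pointed at such an in-JVM cluster simply by using a local-cluster master URL (a sketch; it assumes Spark is on the classpath and the JVM has enough memory for the in-process workers):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Start a single-JVM standalone cluster: 2 workers, 1 core and 1024 MB each.
val conf = new SparkConf()
  .setMaster("local-cluster[2,1,1024]")
  .setAppName("local-cluster-demo")
val sc = new SparkContext(conf)

// The job below runs on the in-process workers, not on external processes.
println(sc.parallelize(1 to 100).sum())

sc.stop() // also stops the underlying LocalSparkCluster
```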

[[logging]]
[TIP]
====
Enable INFO logging level for org.apache.spark.deploy.LocalSparkCluster logger to see what happens inside.

Add the following line to conf/log4j.properties:

log4j.logger.org.apache.spark.deploy.LocalSparkCluster=INFO

Refer to Logging.
====

=== [[creating-instance]] Creating LocalSparkCluster Instance

LocalSparkCluster takes the following when created:

  • [[numWorkers]] Number of workers
  • [[coresPerWorker]] CPU cores per worker
  • [[memoryPerWorker]] Memory per worker
  • [[conf]] SparkConf

LocalSparkCluster initializes the internal registries and counters.


=== [[start]] Starting LocalSparkCluster

[source, scala]
----
start(): Array[String]
----

NOTE: start is used when...FIXME
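To illustrate the lifecycle, here is a toy, in-memory model (not Spark's actual implementation) of a single-JVM "cluster" whose start hands back the master URL(s), mirroring the start(): Array[String] signature, and whose stop tears the workers down; the master URL and worker bookkeeping are purely hypothetical:

```scala
// Toy model (not Spark code): tracks N in-process workers and returns the
// master URL(s) from start, mirroring start(): Array[String].
final case class ToyWorker(id: Int, cores: Int, memoryMb: Int)

final class ToyLocalCluster(numWorkers: Int, coresPerWorker: Int, memoryPerWorker: Int) {
  private var workers: Seq[ToyWorker] = Seq.empty

  def start(): Array[String] = {
    workers = (1 to numWorkers).map(ToyWorker(_, coresPerWorker, memoryPerWorker))
    // A real cluster would return the URL(s) of the started master(s);
    // this address is a placeholder.
    Array("spark://localhost:7077")
  }

  def stop(): Unit = workers = Seq.empty

  def workerCount: Int = workers.size
}

val cluster = new ToyLocalCluster(2, 1, 1024)
println(cluster.start().mkString(",")) // the master URL(s)
println(cluster.workerCount)           // 2
cluster.stop()
println(cluster.workerCount)           // 0
```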

=== [[stop]] Stopping LocalSparkCluster

[source, scala]
----
stop(): Unit
----

NOTE: stop is used when...FIXME
