Deployment Environments — Run Modes


A Spark application is composed of a driver and executors. They can run locally (in a single JVM) or on a cluster, using resources (CPU, RAM, disk) that are managed by a cluster manager.

You specify where the driver runs with the deploy mode, set either with the --deploy-mode option of spark-submit or the spark.submit.deployMode Spark property. The deploy mode is client (the default, where the driver runs on the machine that submits the application) or cluster (where the driver runs on one of the cluster nodes).
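For example, a spark-submit invocation might look like the following sketch (the application JAR, main class, and master URL are placeholders, not part of the original text):

```shell
# Run the driver on the submitting machine (client is the default deploy mode).
./bin/spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode client \
  --class com.example.MyApp \
  my-app.jar

# Equivalent, setting the deploy mode through the Spark property instead:
./bin/spark-submit \
  --master spark://master-host:7077 \
  --conf spark.submit.deployMode=cluster \
  --class com.example.MyApp \
  my-app.jar
```

In cluster mode the submitting machine can disconnect after submission, since the driver lives inside the cluster; in client mode the submitting process hosts the driver for the application's entire lifetime.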

Master URLs

Spark supports the following master URLs (see private object SparkMasterRegex for the exact patterns):

- local — run Spark locally with one worker thread, i.e. no parallelism
- local[n] — run Spark locally with n worker threads
- local[*] — run Spark locally with one worker thread per logical core on the machine
- local[n, m] — run Spark locally with n worker threads and m as the maximum number of task failures (maxFailures)
- local-cluster[n, cores, memory] — a pseudo-cluster for testing, with n workers, cores cores per worker, and memory MB of memory per worker
- spark://host:port — connect to a Spark Standalone cluster master
- mesos://host:port — connect to an Apache Mesos cluster
- yarn — connect to a Hadoop YARN cluster

You can specify the master URL of a Spark application as follows:

- the --master command-line option of spark-submit or spark-shell
- the spark.master Spark property (e.g. in conf/spark-defaults.conf or via --conf)
- programmatically, with SparkConf.setMaster or SparkSession.builder.master
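As an illustration, the same application could be pointed at different run modes purely through the master URL (application JAR, class name, and host are placeholders):

```shell
# Local mode, using all available cores; quoting keeps the shell
# from treating the brackets as a glob pattern.
./bin/spark-submit --master 'local[*]' --class com.example.MyApp my-app.jar

# Local mode with 4 worker threads and up to 2 task failures allowed.
./bin/spark-submit --master 'local[4,2]' --class com.example.MyApp my-app.jar

# Spark Standalone cluster (7077 is the default master port).
./bin/spark-submit --master spark://master-host:7077 --class com.example.MyApp my-app.jar
```

Command-line options take precedence over values from spark-defaults.conf, so the master set here overrides any spark.master configured there.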