Management Scripts for Standalone Master

You can start a Spark Standalone master (aka standalone Master) using a sbin/ script and stop it using a sbin/ script.


The sbin/ script starts a Spark master on the machine the script is executed on.


The script prepares the command line to start the org.apache.spark.deploy.master.Master class, which by default runs as follows:

org.apache.spark.deploy.master.Master \
  --ip japila.local --port 7077 --webui-port 8080
Set the SPARK_PRINT_LAUNCH_COMMAND environment variable to have the launch command printed out to the standard error output. Refer to Print Launch Command of Spark Scripts.
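The mechanism can be sketched as follows. This is a minimal illustration, not the real launcher: the `launch` function and the `<classpath>` placeholder are hypothetical, and a real script would `exec` the command instead of merely printing it.

```shell
# Hypothetical sketch of the SPARK_PRINT_LAUNCH_COMMAND mechanism: when the
# variable is non-empty, the assembled command line is echoed to standard
# error before it would be executed.
launch() {
  cmd="java -cp <classpath> org.apache.spark.deploy.master.Master $*"
  if [ -n "$SPARK_PRINT_LAUNCH_COMMAND" ]; then
    printf 'Spark Command: %s\n' "$cmd" >&2
  fi
  # A real launcher would now exec the command; here we just print it.
  printf '%s\n' "$cmd"
}

SPARK_PRINT_LAUNCH_COMMAND=1 launch --ip japila.local --port 7077
```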

It also supports starting Tachyon using the --with-tachyon command-line option. It assumes that the tachyon/bin/tachyon command is available in Spark’s home directory.

The script uses the following helper scripts:

  • sbin/

  • bin/

  • conf/ that contains environment variables for Spark executables.

Ultimately, the script calls sbin/ start to kick off org.apache.spark.deploy.master.Master with 1 as the instance number and the --ip, --port, and --webui-port command-line options.
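The daemon-style dispatch can be sketched as follows. The `daemon` function below is illustrative, not the real helper script: it takes an action (start or stop), a class name, an instance number, and forwards any remaining options to the class.

```shell
# Hypothetical sketch of the daemon helper's dispatch (not the real script).
daemon() {
  action=$1; class=$2; instance=$3; shift 3
  case "$action" in
    start) echo "starting $class (instance $instance) with: $*" ;;
    stop)  echo "stopping $class (instance $instance)" ;;
    *)     echo "usage: daemon (start|stop) <class> <instance> [options]" >&2
           return 1 ;;
  esac
}

daemon start org.apache.spark.deploy.master.Master 1 \
  --ip japila.local --port 7077 --webui-port 8080
```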

Command-line Options

You can use the following command-line options:

  • --host or -h - the hostname to listen on; overrides SPARK_MASTER_HOST.

  • --ip or -i - (deprecated) the IP address to listen on; use --host instead.

  • --port or -p - the command-line equivalent of SPARK_MASTER_PORT that overrides it.

  • --webui-port - the command-line equivalent of SPARK_MASTER_WEBUI_PORT that overrides it.

  • --properties-file (default: $SPARK_HOME/conf/spark-defaults.conf) - the path to a custom Spark properties file. Refer to spark-defaults.conf.

  • --help - prints out help
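The precedence the options above imply can be sketched as a tiny shell function. This is an assumption-laden illustration, not the Master's real option parser: a --port option beats the SPARK_MASTER_PORT environment variable, which in turn beats the built-in default of 7077.

```shell
# Hypothetical sketch of option/environment precedence for the master port.
effective_port() {
  port=${SPARK_MASTER_PORT:-7077}     # built-in default, then environment
  while [ $# -gt 0 ]; do
    case "$1" in
      --port|-p) port=$2; shift 2 ;;  # command-line option wins
      *) shift ;;
    esac
  done
  echo "$port"
}

effective_port                                  # built-in default
SPARK_MASTER_PORT=7177 effective_port           # environment variable
SPARK_MASTER_PORT=7177 effective_port -p 7277   # command-line option
```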


You can stop a Spark Standalone master using the sbin/ script.

FIXME Review the script

It effectively sends SIGTERM to the master’s process.
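The stop mechanism can be demonstrated in miniature. This is a generic sketch, not the real script (which locates the master's PID itself); here a background `sleep` stands in for the master's JVM process.

```shell
# Minimal demonstration of stopping a daemon by PID with SIGTERM.
sleep 60 &               # stand-in for the master's JVM process
pid=$!
kill -TERM "$pid"        # ask the process to terminate
wait "$pid" 2>/dev/null  # reap it; wait reports 128 + 15 for SIGTERM
echo "exit status: $?"
```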

You should see the following ERROR message in the master’s logs: