AbstractCommandBuilder is the base command builder for the specialized command builders SparkSubmitCommandBuilder and SparkClassCommandBuilder.
AbstractCommandBuilder expects that command builders define buildCommand, the only abstract method that subclasses have to define.

List<String> buildCommand(Map<String, String> env)
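The contract can be sketched as follows. This is a simplified model, not the actual org.apache.spark.launcher sources; SparkClassLikeBuilder and its mainClass field are made-up names for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Simplified model of the builder hierarchy described above; everything
// except buildCommand and buildJavaCommand is made up for illustration.
abstract class CommandBuilderSketch {
    // The only abstract method that subclasses have to define.
    abstract List<String> buildCommand(Map<String, String> env);

    // Shared helper that subclasses reuse: the path to the java executable
    // plus an optional class path.
    List<String> buildJavaCommand(String extraClassPath) {
        List<String> cmd = new ArrayList<>();
        cmd.add(System.getProperty("java.home") + "/bin/java");
        if (extraClassPath != null && !extraClassPath.isEmpty()) {
            cmd.add("-cp");
            cmd.add(extraClassPath);
        }
        return cmd;
    }
}

// A specialized builder only provides buildCommand.
class SparkClassLikeBuilder extends CommandBuilderSketch {
    private final String mainClass;

    SparkClassLikeBuilder(String mainClass) {
        this.mainClass = mainClass;
    }

    @Override
    List<String> buildCommand(Map<String, String> env) {
        List<String> cmd = buildJavaCommand(null);
        cmd.add(mainClass);
        return cmd;
    }
}
```

The specialized builders differ only in how they assemble the final command; the Java command scaffolding stays in the base class.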
loadPropertiesFile loads the configuration file for a Spark application, be it the user-specified properties file or spark-defaults.conf in the configuration directory.
List<String> buildJavaCommand(String extraClassPath)
buildJavaCommand builds the Java command for a Spark application, i.e. a collection of elements with the path to the java executable, JVM options from the java-opts file, and a class path.
If javaHome is set, buildJavaCommand adds [javaHome]/bin/java to the result Java command. Otherwise, it uses the JAVA_HOME environment variable or, when no earlier checks succeeded, falls through to Java's java.home system property.
FIXME Who sets javaHome and when?
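The resolution order above can be sketched as follows; the javaHome and javaHomeEnv parameters stand in for the internal property and the JAVA_HOME environment variable, respectively:

```java
import java.io.File;

class JavaExecutableSketch {
    // Picks the Java home in the order described above: the explicit javaHome
    // property, then the JAVA_HOME environment variable, then the java.home
    // system property (always set by the running JVM), and appends bin/java.
    static String javaExecutable(String javaHome, String javaHomeEnv) {
        String home;
        if (javaHome != null) {
            home = javaHome;
        } else if (javaHomeEnv != null) {
            home = javaHomeEnv;
        } else {
            home = System.getProperty("java.home");
        }
        return home + File.separator + "bin" + File.separator + "java";
    }
}
```

Because java.home is always set by the running JVM, the fall-through always produces a usable executable path.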
buildJavaCommand loads extra Java options from the java-opts file in the configuration directory, if the file exists, and adds them to the result Java command.
buildJavaCommand builds the class path (with the extra class path if non-empty) and adds it as
-cp to the result Java command.
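Reading the java-opts file can be sketched as follows. Treating each non-blank line as whitespace-separated JVM options is a simplification of the real option parsing (which also handles quoting):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.util.ArrayList;
import java.util.List;

class JavaOptsSketch {
    // Reads JVM options from a java-opts-style source: one or more
    // whitespace-separated options per line, blank lines skipped.
    static List<String> loadJavaOpts(Reader source) throws IOException {
        List<String> opts = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(source)) {
            String line;
            while ((line = reader.readLine()) != null) {
                line = line.trim();
                if (!line.isEmpty()) {
                    for (String opt : line.split("\\s+")) {
                        opts.add(opt);
                    }
                }
            }
        }
        return opts;
    }
}
```

Each resulting option is appended to the Java command before the -cp argument.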
List<String> buildClassPath(String appClassPath)
buildClassPath builds the classpath for a Spark application.
NOTE: Directories always end up with the OS-specific file separator at the end of their paths.
buildClassPath adds the following in that order:
1) Locally compiled Spark classes in test-classes and Core's jars.
FIXME Elaborate on "locally compiled Spark classes".
FIXME Elaborate on the SQL testing case
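The separator handling from the note above can be sketched as follows; joinClassPath is an illustrative helper, not part of the actual builder:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

class ClassPathSketch {
    // Appends the OS-specific file separator to directory entries and joins
    // all entries with the platform path separator (':' on Unix, ';' on
    // Windows), ready to be passed as the -cp value.
    static String joinClassPath(List<String> entries) {
        List<String> normalized = new ArrayList<>();
        for (String entry : entries) {
            if (new File(entry).isDirectory() && !entry.endsWith(File.separator)) {
                entry = entry + File.separator;
            }
            normalized.add(entry);
        }
        return String.join(File.pathSeparator, normalized);
    }
}
```

Note the distinction between File.separator (inside a path) and File.pathSeparator (between class path entries).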
loadPropertiesFile is part of the AbstractCommandBuilder private API that loads Spark settings from a properties file (when specified on the command line) or spark-defaults.conf in the configuration directory.
It loads the settings from the following files (starting from the first and checking every location until the first properties file is found):

1) propertiesFile (if specified using the --properties-file command-line option or set using setPropertiesFile)

2) spark-defaults.conf in the configuration directory
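The lookup and loading can be sketched as follows; findPropertiesFile and loadProps are illustrative names, and reading the file as UTF-8 matches how Spark loads properties files:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

class PropertiesFileSketch {
    // Returns the explicitly requested properties file when given (as with
    // --properties-file), otherwise spark-defaults.conf in the configuration
    // directory when it exists, otherwise null.
    static File findPropertiesFile(String propertiesFile, String confDir) {
        if (propertiesFile != null) {
            return new File(propertiesFile);
        }
        File defaults = new File(confDir, "spark-defaults.conf");
        return defaults.isFile() ? defaults : null;
    }

    // Loads the settings from the chosen file as UTF-8.
    static Properties loadProps(File file) throws IOException {
        Properties props = new Properties();
        try (InputStreamReader in =
                 new InputStreamReader(new FileInputStream(file), StandardCharsets.UTF_8)) {
            props.load(in);
        }
        return props;
    }
}
```

When neither file exists, no settings are loaded and the builder proceeds with defaults.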
loadPropertiesFile uses getConfDir to compute the current configuration directory of a Spark application.

getConfDir uses SPARK_CONF_DIR (from childEnv, which is always empty anyway, or as an environment variable) and falls through to [SPARK_HOME]/conf (with SPARK_HOME from the getSparkHome internal method).
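A sketch of that fallback, where the env parameter stands in for childEnv merged with the process environment:

```java
import java.io.File;
import java.util.Map;

class ConfDirSketch {
    // SPARK_CONF_DIR wins when set; otherwise fall through to
    // [SPARK_HOME]/conf.
    static String getConfDir(Map<String, String> env, String sparkHome) {
        String confDir = env.get("SPARK_CONF_DIR");
        return (confDir != null) ? confDir : sparkHome + File.separator + "conf";
    }
}
```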
loadPropertiesFile uses getSparkHome to compute Spark's home directory for a Spark application.

getSparkHome uses SPARK_HOME (from childEnv, which is always empty anyway, or as an environment variable).

If SPARK_HOME is not set, Spark throws an IllegalStateException:

Spark home not found; set it explicitly or use the SPARK_HOME environment variable.
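A sketch of that check, with env again standing in for childEnv merged with the process environment:

```java
import java.util.Map;

class SparkHomeSketch {
    // Returns SPARK_HOME or fails fast with the error message quoted above.
    static String getSparkHome(Map<String, String> env) {
        String sparkHome = env.get("SPARK_HOME");
        if (sparkHome == null) {
            throw new IllegalStateException(
                "Spark home not found; set it explicitly or use the "
                + "SPARK_HOME environment variable.");
        }
        return sparkHome;
    }
}
```

Failing fast here is deliberate: every later step (the configuration directory, spark-defaults.conf, jar locations) is resolved relative to the Spark home.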