== [[AbstractCommandBuilder]] AbstractCommandBuilder

AbstractCommandBuilder is the base command builder for spark-submit-SparkSubmitCommandBuilder.md[SparkSubmitCommandBuilder] and SparkClassCommandBuilder specialized command builders.

AbstractCommandBuilder expects that command builders define buildCommand.

.AbstractCommandBuilder Methods
[cols="1,2",options="header",width="100%"]
|===
| Method
| Description

| buildCommand
| The only abstract method that subclasses have to define.

| <<loadPropertiesFile, loadPropertiesFile>>
| Loads Spark settings from a properties file or the spark-defaults.conf file under the Spark configuration directory.
|===
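
For illustration, the following is a minimal sketch of how a specialized builder can satisfy the buildCommand contract. The class names and the simplified signature are hypothetical; this is not Spark's actual code.

[source, java]
----
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Simplified stand-in for the real AbstractCommandBuilder (illustration only)
abstract class CommandBuilderSketch {
  // The only abstract method that subclasses have to define
  abstract List<String> buildCommand(Map<String, String> env);
}

// A specialized command builder in the spirit of SparkClassCommandBuilder
class ClassCommandBuilderSketch extends CommandBuilderSketch {
  private final String className;

  ClassCommandBuilderSketch(String className) {
    this.className = className;
  }

  @Override
  List<String> buildCommand(Map<String, String> env) {
    List<String> cmd = new ArrayList<>();
    cmd.add("java");      // path to the java executable
    cmd.add("-cp");
    cmd.add(".");         // classpath (built by buildClassPath in the real builder)
    cmd.add(className);   // the main class to run
    return cmd;
  }

  public static void main(String[] args) {
    CommandBuilderSketch builder =
        new ClassCommandBuilderSketch("org.apache.spark.deploy.SparkSubmit");
    System.out.println(builder.buildCommand(new HashMap<>()));
  }
}
----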

=== [[buildJavaCommand]] buildJavaCommand Internal Method

[source, java]
----
List<String> buildJavaCommand(String extraClassPath)
----

buildJavaCommand builds the Java command for a Spark application (which is a collection of elements with the path to the java executable, JVM options from the java-opts file, and a classpath).

If javaHome is set, buildJavaCommand adds [javaHome]/bin/java to the result Java command. Otherwise, it uses the JAVA_HOME environment variable or, when no earlier checks succeeded, falls through to the java.home Java system property.

CAUTION: FIXME Who sets the javaHome internal property and when?

buildJavaCommand loads extra Java options from the java-opts file in the <<configuration-directory, configuration directory>>.

Eventually, buildJavaCommand <<buildClassPath, builds the classpath>> and adds it as -cp to the result Java command.
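
A rough sketch of the resolution order described above, with a hypothetical helper and simplified parameters (this is not Spark's actual implementation):

[source, java]
----
import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

class JavaCommandSketch {

  // Hypothetical equivalent of buildJavaCommand (illustration only)
  static List<String> buildJavaCommand(String javaHome, String confDir, String classPath)
      throws IOException {
    List<String> cmd = new ArrayList<>();

    // 1. javaHome, then the JAVA_HOME env var, then the java.home system property
    String home = javaHome != null ? javaHome
        : System.getenv("JAVA_HOME") != null ? System.getenv("JAVA_HOME")
        : System.getProperty("java.home");
    cmd.add(home + File.separator + "bin" + File.separator + "java");

    // 2. Extra JVM options from the java-opts file in the configuration directory (if readable)
    Path javaOpts = Paths.get(confDir, "java-opts");
    if (Files.isReadable(javaOpts)) {
      for (String line : Files.readAllLines(javaOpts, StandardCharsets.UTF_8)) {
        if (!line.trim().isEmpty()) {
          cmd.add(line.trim());
        }
      }
    }

    // 3. The classpath as -cp
    cmd.add("-cp");
    cmd.add(classPath);
    return cmd;
  }

  public static void main(String[] args) throws IOException {
    System.out.println(buildJavaCommand(null, "/tmp/conf", "/tmp/app.jar"));
  }
}
----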

=== [[buildClassPath]] buildClassPath Method

[source, java]
----
List<String> buildClassPath(String appClassPath)
----

buildClassPath builds the classpath for a Spark application.

NOTE: Directories always end up with the OS-specific file separator at the end of their paths.

buildClassPath adds the following in that order (see the sketch after this list):

. SPARK_CLASSPATH environment variable
. The input appClassPath
. The spark-AbstractCommandBuilder.md#getConfDir[configuration directory]
. (only with SPARK_PREPEND_CLASSES set or SPARK_TESTING being 1) Locally compiled Spark classes in classes, test-classes and Core's jars.
+
CAUTION: FIXME Elaborate on "locally compiled Spark classes".
. (only with SPARK_SQL_TESTING being 1) ...
+
CAUTION: FIXME Elaborate on the SQL testing case.
. HADOOP_CONF_DIR environment variable
. YARN_CONF_DIR environment variable
. SPARK_DIST_CLASSPATH environment variable

NOTE: childEnv is queried first before System properties. It is always empty for AbstractCommandBuilder (and SparkSubmitCommandBuilder, too).
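
The ordering above can be illustrated with the following hypothetical sketch (names and environment handling are simplified to System.getenv, and the testing-only entries are omitted; this is not Spark's actual code):

[source, java]
----
import java.io.File;
import java.util.ArrayList;
import java.util.List;

class ClassPathSketch {

  // Hypothetical equivalent of buildClassPath that shows only the ordering (illustration)
  static List<String> buildClassPath(String appClassPath, String confDir) {
    List<String> cp = new ArrayList<>();
    addIfSet(cp, System.getenv("SPARK_CLASSPATH"));       // 1. SPARK_CLASSPATH
    addIfSet(cp, appClassPath);                           // 2. the input appClassPath
    addIfSet(cp, confDir);                                // 3. the configuration directory
    // 4./5. testing-only entries (SPARK_PREPEND_CLASSES, SPARK_TESTING, SPARK_SQL_TESTING) omitted
    addIfSet(cp, System.getenv("HADOOP_CONF_DIR"));       // 6. HADOOP_CONF_DIR
    addIfSet(cp, System.getenv("YARN_CONF_DIR"));         // 7. YARN_CONF_DIR
    addIfSet(cp, System.getenv("SPARK_DIST_CLASSPATH"));  // 8. SPARK_DIST_CLASSPATH
    return cp;
  }

  private static void addIfSet(List<String> cp, String entry) {
    if (entry != null && !entry.isEmpty()) {
      // directories end up with the OS-specific file separator at the end of their paths
      boolean isDir = new File(entry).isDirectory();
      cp.add(isDir && !entry.endsWith(File.separator) ? entry + File.separator : entry);
    }
  }

  public static void main(String[] args) {
    System.out.println(String.join(File.pathSeparator,
        buildClassPath("/tmp/app/classes", "/opt/spark/conf")));
  }
}
----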

=== [[loadPropertiesFile]] Loading Properties File -- loadPropertiesFile Internal Method

[source, java]
----
Properties loadPropertiesFile()
----

loadPropertiesFile is part of AbstractCommandBuilder's private API that loads Spark settings from a properties file (when specified on the command line) or spark-properties.md#spark-defaults-conf[spark-defaults.conf] in the <<configuration-directory, configuration directory>>.

It loads the settings from the following files, starting from the first and checking every location until the first properties file is found:

. propertiesFile (if specified using the --properties-file command-line option or set by AbstractCommandBuilder.setPropertiesFile)
. [SPARK_CONF_DIR]/spark-defaults.conf
. [SPARK_HOME]/conf/spark-defaults.conf

NOTE: loadPropertiesFile reads a properties file using UTF-8.
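
A hedged sketch of this lookup order, assuming hypothetical parameters for the candidate locations (not Spark's actual code):

[source, java]
----
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

class PropertiesSketch {

  // Hypothetical equivalent of loadPropertiesFile (illustration only)
  static Properties loadPropertiesFile(String propertiesFile, String confDir, String sparkHome)
      throws IOException {
    Properties props = new Properties();
    // Check every location until the first properties file is found
    String[] candidates = {
        propertiesFile,  // --properties-file / setPropertiesFile
        confDir == null ? null : confDir + File.separator + "spark-defaults.conf",
        sparkHome + File.separator + "conf" + File.separator + "spark-defaults.conf"
    };
    for (String candidate : candidates) {
      if (candidate != null && new File(candidate).isFile()) {
        // The properties file is read using UTF-8
        try (InputStreamReader reader =
                 new InputStreamReader(new FileInputStream(candidate), StandardCharsets.UTF_8)) {
          props.load(reader);
        }
        break;
      }
    }
    return props;
  }

  public static void main(String[] args) throws IOException {
    System.out.println(loadPropertiesFile(null, null, "/opt/spark"));
  }
}
----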

=== [[getConfDir]][[configuration-directory]] Spark's Configuration Directory -- getConfDir Internal Method

AbstractCommandBuilder uses getConfDir to compute the current configuration directory of a Spark application.

It uses SPARK_CONF_DIR (from childEnv, which is always empty anyway, or as an environment variable) and falls through to [SPARK_HOME]/conf (with SPARK_HOME from the <<home-directory, Spark home directory>>).
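
A minimal sketch of this fallback, assuming a hypothetical helper (not Spark's actual code):

[source, java]
----
import java.io.File;

class ConfDirSketch {

  // Hypothetical equivalent of getConfDir (illustration only):
  // SPARK_CONF_DIR wins, otherwise fall through to [SPARK_HOME]/conf
  static String getConfDir(String sparkHome) {
    String confDir = System.getenv("SPARK_CONF_DIR");
    return (confDir != null && !confDir.isEmpty())
        ? confDir
        : sparkHome + File.separator + "conf";
  }

  public static void main(String[] args) {
    System.out.println(getConfDir("/opt/spark"));
  }
}
----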

=== [[getSparkHome]][[home-directory]] Spark's Home Directory -- getSparkHome Internal Method

AbstractCommandBuilder uses getSparkHome to compute Spark's home directory for a Spark application.

It uses SPARK_HOME (from childEnv, which is always empty anyway, or as an environment variable).

If SPARK_HOME is not set, Spark throws an IllegalStateException:

----
Spark home not found; set it explicitly or use the SPARK_HOME environment variable.
----
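
A minimal sketch of this behaviour with a hypothetical helper (not Spark's actual code):

[source, java]
----
class SparkHomeSketch {

  // Hypothetical equivalent of getSparkHome (illustration only)
  static String getSparkHome() {
    String sparkHome = System.getenv("SPARK_HOME");
    if (sparkHome == null || sparkHome.isEmpty()) {
      throw new IllegalStateException(
          "Spark home not found; set it explicitly or use the SPARK_HOME environment variable.");
    }
    return sparkHome;
  }

  public static void main(String[] args) {
    System.out.println(getSparkHome());
  }
}
----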