List<String> buildCommand(Map<String, String> env)

buildCommand is used when:

- Main is requested to buildCommand
- WorkerCommandBuilder is requested to buildCommand
List<String> buildJavaCommand(String extraClassPath)

buildJavaCommand builds the Java command for a Spark application: a collection of elements with the path to the java executable, JVM options from the java-opts file, and a class path.
If javaHome is set, buildJavaCommand adds [javaHome]/bin/java to the result Java command. Otherwise, it uses the JAVA_HOME environment variable or, when no earlier checks succeeded, falls back to the java.home Java system property.
CAUTION: FIXME Who sets the javaHome internal property and when?
buildJavaCommand loads extra Java options from the java-opts file in the configuration directory, if the file exists, and adds them to the result Java command.
Finally, buildJavaCommand builds the class path (including the extra class path, if non-empty) and adds it as -cp to the result Java command.
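The resolution order above can be sketched as follows. This is a minimal illustration, not Spark's actual code; the method parameters (javaHome, env) stand in for the builder's internal state, and loading java-opts is only hinted at in a comment.

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Sketch of buildJavaCommand's java-executable resolution:
// explicit javaHome first, then JAVA_HOME, then the java.home system property.
public class JavaCommandSketch {

  static List<String> buildJavaCommand(String javaHome, Map<String, String> env,
                                       String classPath) {
    List<String> cmd = new ArrayList<>();
    String home = (javaHome != null) ? javaHome
        : env.getOrDefault("JAVA_HOME", System.getProperty("java.home"));
    cmd.add(home + File.separator + "bin" + File.separator + "java");
    // (the real builder would also append options read from a java-opts file here)
    cmd.add("-cp");
    cmd.add(classPath);
    return cmd;
  }
}
```

Note how the class path always ends up as the argument right after -cp, matching the description above.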
List<String> buildClassPath(String appClassPath)

buildClassPath builds the classpath for a Spark application.
Directories always end up with the OS-specific file separator at the end of their paths.
buildClassPath adds the following, in that order:

1. The input appClassPath
2. The configuration directory
3. Locally compiled Spark classes in test-classes and Core's jars
   CAUTION: FIXME Elaborate on "locally compiled Spark classes".
4. ...
   CAUTION: FIXME Elaborate on the SQL testing case.

childEnv is queried first, before System properties, and is always empty for this command builder.
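A sketch of the ordering and the directory-separator rule described above (assumed names, not Spark's implementation; extras stands in for the optional development-time entries):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Sketch of buildClassPath: application classpath first, then the
// configuration directory, then any extra entries, joined with the
// platform path separator.
public class ClassPathSketch {

  // Directories get the OS-specific file separator appended,
  // as the note above says.
  static String asDir(String path) {
    return path.endsWith(File.separator) ? path : path + File.separator;
  }

  static String buildClassPath(String appClassPath, String confDir,
                               List<String> extras) {
    List<String> entries = new ArrayList<>();
    if (appClassPath != null && !appClassPath.isEmpty()) {
      entries.add(appClassPath);
    }
    entries.add(asDir(confDir));
    entries.addAll(extras);
    return String.join(File.pathSeparator, entries);
  }
}
```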
Loading Properties File
loadPropertiesFile loads the settings from the following files, starting from the first and checking every location until the first properties file is found:

1. propertiesFile (if specified using the --properties-file command-line option or set by the launcher)
2. spark-defaults.conf in the configuration directory
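The "first file found wins" lookup can be sketched like this (hypothetical helper, not Spark's code; it only resolves which file would be loaded):

```java
import java.io.File;

// Sketch of the properties-file lookup: an explicit file (e.g. from
// --properties-file) takes precedence; otherwise fall back to
// spark-defaults.conf in the configuration directory.
public class PropertiesFileSketch {

  static File firstPropertiesFile(File explicit, File confDir) {
    if (explicit != null) {
      return explicit;                             // --properties-file wins
    }
    File defaults = new File(confDir, "spark-defaults.conf");
    return defaults.isFile() ? defaults : null;    // nothing to load
  }
}
```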
Spark's Configuration Directory
getConfDir computes the current configuration directory of a Spark application.

It uses SPARK_CONF_DIR (from childEnv, which is always empty anyway, or as an environment variable) and falls back to [SPARK_HOME]/conf (with SPARK_HOME from getSparkHome).
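The fallback chain reads naturally as code. A minimal sketch, assuming childEnv and env are passed in explicitly (in the real builder they are internal state):

```java
import java.io.File;
import java.util.Map;

// Sketch of getConfDir: SPARK_CONF_DIR from childEnv or the environment
// first, then [SPARK_HOME]/conf.
public class ConfDirSketch {

  static String getConfDir(Map<String, String> childEnv,
                           Map<String, String> env, String sparkHome) {
    String confDir = childEnv.getOrDefault("SPARK_CONF_DIR",
        env.get("SPARK_CONF_DIR"));
    return (confDir != null) ? confDir : sparkHome + File.separator + "conf";
  }
}
```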
Spark's Home Directory
getSparkHome computes Spark's home directory for a Spark application.

It uses SPARK_HOME (from childEnv, which is always empty anyway, or as an environment variable).

If SPARK_HOME is not set, Spark throws an IllegalStateException:

Spark home not found; set it explicitly or use the SPARK_HOME environment variable.
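A sketch of that behavior, with the same explicit-parameters assumption as above (not Spark's actual implementation):

```java
import java.util.Map;

// Sketch of getSparkHome: resolve SPARK_HOME from childEnv or the
// environment, and fail loudly when it is missing.
public class SparkHomeSketch {

  static String getSparkHome(Map<String, String> childEnv,
                             Map<String, String> env) {
    String home = childEnv.getOrDefault("SPARK_HOME", env.get("SPARK_HOME"));
    if (home == null) {
      throw new IllegalStateException("Spark home not found; set it explicitly "
          + "or use the SPARK_HOME environment variable.");
    }
    return home;
  }
}
```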