# SparkConf

`SparkConf` is `Serializable` (Java).
## Creating Instance

`SparkConf` takes the following to be created:

* `loadDefaults` flag
### loadDefaults Flag

`SparkConf` can be given a `loadDefaults` flag when created.

Default: `true`

When `true`, `SparkConf` loads spark properties (with the `silent` flag disabled) when created.
## getAllWithPrefix

```scala
getAllWithPrefix(
  prefix: String): Array[(String, String)]
```

`getAllWithPrefix` collects the keys with the given `prefix` in getAll.

In the end, `getAllWithPrefix` removes the given `prefix` from the keys.
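The two steps above (filter by prefix, then strip it) can be sketched as a standalone function over key-value pairs. This mirrors the described behavior only; it is not Spark's implementation.

```scala
// Sketch of getAllWithPrefix's key handling: keep entries whose key starts
// with the prefix, then strip the prefix from the key. Not Spark's actual code.
def getAllWithPrefix(
    all: Array[(String, String)],
    prefix: String): Array[(String, String)] =
  all
    .filter { case (k, _) => k.startsWith(prefix) }
    .map { case (k, v) => (k.stripPrefix(prefix), v) }

val all = Array(
  "spark.executorEnv.JAVA_HOME" -> "/usr/lib/jvm",
  "spark.app.name" -> "demo")
val env = getAllWithPrefix(all, "spark.executorEnv.")
```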
`getAllWithPrefix` is used when:

* `SparkConf` is requested to getExecutorEnv (`spark.executorEnv.` prefix), fillMissingMagicCommitterConfsIfNeeded (`spark.hadoop.fs.s3a.bucket.` prefix)
* `ExecutorPluginContainer` is requested for the executorPlugins (`spark.plugins.internal.conf.` prefix)
* `ResourceUtils` is requested to parseResourceRequest, listResourceIds, addTaskResourceRequests, parseResourceRequirements
* `SortShuffleManager` is requested to loadShuffleExecutorComponents (`spark.shuffle.plugin.__config__.` prefix)
* `ServerInfo` is requested to addFilters
## Loading Spark Properties

```scala
loadFromSystemProperties(
  silent: Boolean): SparkConf
```

`loadFromSystemProperties` records all the `spark.`-prefixed system properties in this `SparkConf`.

**Silently loading system properties**

Loading system properties silently is possible using the following:

```scala
new SparkConf(loadDefaults = false).loadFromSystemProperties(silent = true)
```
`loadFromSystemProperties` is used when:

* `SparkConf` is created (with `loadDefaults` enabled)
* `SparkHadoopUtil` is created
## Executor Settings

`SparkConf` uses the `spark.executorEnv.` prefix for executor settings.
### getExecutorEnv

```scala
getExecutorEnv: Seq[(String, String)]
```

`getExecutorEnv` gets all the settings with the `spark.executorEnv.` prefix.

`getExecutorEnv` is used when:

* `SparkContext` is created (and requested for executorEnvs)
### setExecutorEnv

```scala
setExecutorEnv(
  variables: Array[(String, String)]): SparkConf
setExecutorEnv(
  variables: Seq[(String, String)]): SparkConf
setExecutorEnv(
  variable: String, value: String): SparkConf
```

`setExecutorEnv` sets the given (key-value) variables with the `spark.executorEnv.` prefix added to the keys.

`setExecutorEnv` is used when:

* `SparkContext` is requested for updatedConf
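The key handling of `setExecutorEnv` is the inverse of `getExecutorEnv`: the `spark.executorEnv.` prefix is prepended to every variable name. A standalone sketch of that step (not Spark's actual code):

```scala
// Sketch of setExecutorEnv's key handling: prepend the spark.executorEnv.
// prefix to every variable name. Not Spark's actual implementation.
def prefixExecutorEnv(
    variables: Seq[(String, String)]): Seq[(String, String)] =
  variables.map { case (name, value) => (s"spark.executorEnv.$name", value) }

val keys = prefixExecutorEnv(Seq("JAVA_HOME" -> "/usr/lib/jvm"))
```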
## Logging

Enable `ALL` logging level for the `org.apache.spark.SparkConf` logger to see what happens inside.

Add the following line to `conf/log4j.properties`:

```text
log4j.logger.org.apache.spark.SparkConf=ALL
```

Refer to Logging.