SparkConf¶
SparkConf is Serializable (Java).
Creating Instance¶
SparkConf takes the following to be created:
- loadDefaults flag
loadDefaults Flag¶
SparkConf can be given loadDefaults flag when created.
Default: true
When true, SparkConf loads Spark properties from system properties (with the silent flag disabled) when created.
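As a sketch of the loadDefaults flag in action (assumes spark-core on the classpath):

```scala
import org.apache.spark.SparkConf

// A spark.-prefixed system property acts as a default setting
System.setProperty("spark.app.name", "demo")

// loadDefaults = true (the default): picks up spark.* system properties
val withDefaults = new SparkConf(true)
assert(withDefaults.get("spark.app.name") == "demo")

// loadDefaults = false: starts out empty
val empty = new SparkConf(false)
assert(!empty.contains("spark.app.name"))
```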
getAllWithPrefix¶
getAllWithPrefix(
prefix: String): Array[(String, String)]
getAllWithPrefix collects the key-value pairs (from getAll) whose keys start with the given prefix.
In the end, getAllWithPrefix removes the given prefix from the keys.
getAllWithPrefix is used when:
- SparkConf is requested to getExecutorEnv (spark.executorEnv. prefix) and fillMissingMagicCommitterConfsIfNeeded (spark.hadoop.fs.s3a.bucket. prefix)
- ExecutorPluginContainer is requested for the executorPlugins (spark.plugins.internal.conf. prefix)
- ResourceUtils is requested to parseResourceRequest, listResourceIds, addTaskResourceRequests, parseResourceRequirements
- SortShuffleManager is requested to loadShuffleExecutorComponents (spark.shuffle.plugin.__config__. prefix)
- ServerInfo is requested to addFilters
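The two steps above (collect by prefix, then strip the prefix) can be sketched in plain Scala; this is a simplified stand-in, not the actual implementation:

```scala
// Simplified model of getAllWithPrefix: filter pairs by key prefix, then strip it
def getAllWithPrefix(
    all: Array[(String, String)],
    prefix: String): Array[(String, String)] =
  all
    .filter { case (k, _) => k.startsWith(prefix) }
    .map { case (k, v) => (k.stripPrefix(prefix), v) }

val settings = Array(
  "spark.executorEnv.PATH" -> "/usr/bin",
  "spark.app.name" -> "demo")

// Only the spark.executorEnv.-prefixed entry survives, with the prefix removed
val envs = getAllWithPrefix(settings, "spark.executorEnv.")
assert(envs.sameElements(Array("PATH" -> "/usr/bin")))
```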
Loading Spark Properties¶
loadFromSystemProperties(
silent: Boolean): SparkConf
loadFromSystemProperties records all the spark.-prefixed system properties in this SparkConf.
Silently loading system properties
Loading system properties silently is possible using the following:
new SparkConf(loadDefaults = false).loadFromSystemProperties(silent = true)
loadFromSystemProperties is used when:
- SparkConf is created (with the loadDefaults flag enabled)
- SparkHadoopUtil is created
Executor Settings¶
SparkConf uses the spark.executorEnv. prefix for executor settings.
getExecutorEnv¶
getExecutorEnv: Seq[(String, String)]
getExecutorEnv gets all the settings with the spark.executorEnv. prefix (with the prefix removed from the keys).
getExecutorEnv is used when:
- SparkContext is created (and requested for executorEnvs)
setExecutorEnv¶
setExecutorEnv(
variables: Array[(String, String)]): SparkConf
setExecutorEnv(
variables: Seq[(String, String)]): SparkConf
setExecutorEnv(
variable: String, value: String): SparkConf
setExecutorEnv sets the given key-value pairs with the spark.executorEnv. prefix added to the keys.
setExecutorEnv is used when:
- SparkContext is requested to updatedConf
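A sketch of the round trip between setExecutorEnv and getExecutorEnv (assumes spark-core on the classpath; the environment variable names are illustrative only):

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf(loadDefaults = false)
  .setExecutorEnv("JAVA_HOME", "/opt/java")
  .setExecutorEnv(Seq("PATH" -> "/usr/bin", "LANG" -> "C"))

// Stored with the spark.executorEnv. prefix added to the keys...
assert(conf.get("spark.executorEnv.JAVA_HOME") == "/opt/java")

// ...and returned by getExecutorEnv with the prefix stripped
assert(conf.getExecutorEnv.toMap == Map(
  "JAVA_HOME" -> "/opt/java", "PATH" -> "/usr/bin", "LANG" -> "C"))
```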
Logging¶
Enable ALL logging level for org.apache.spark.SparkConf logger to see what happens inside.
Add the following line to conf/log4j.properties:
log4j.logger.org.apache.spark.SparkConf=ALL
Refer to Logging.