# SparkTransportConf Utility
## fromSparkConf
```scala
fromSparkConf(
  _conf: SparkConf,
  module: String, // (1)
  numUsableCores: Int = 0,
  role: Option[String] = None): TransportConf // (2)
```
1. The given `module` is `shuffle` most of the time, except `rpc` and `files` for `NettyRpcEnv`
2. Only defined in `NettyRpcEnv` to be either `driver` or `executor`
`fromSparkConf` makes a copy of (clones) the given `SparkConf`.
`fromSparkConf` sets the following configuration properties (for the given `module`):

* `spark.[module].io.serverThreads`
* `spark.[module].io.clientThreads`
The values are looked up using the following properties, in order, until one is found (with `[suffix]` being `serverThreads` or `clientThreads`, respectively):

1. `spark.[role].[module].io.[suffix]`
2. `spark.[module].io.[suffix]`
If neither property is found, `fromSparkConf` falls back to a default number of threads (based on the given `numUsableCores`, capped at `8`).
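The lookup order and the default described above can be sketched as follows. This is an illustrative approximation, not Spark's actual code; the names `resolveThreads` and `defaultNumThreads` are hypothetical, and the configuration is modeled as a plain `Map` instead of a `SparkConf`.

```scala
// Hypothetical sketch of fromSparkConf's thread-count resolution.
object TransportThreadsSketch {

  // Default number of threads: the usable cores (or all available
  // processors when numUsableCores is 0), capped at 8.
  def defaultNumThreads(numUsableCores: Int): Int = {
    val cores =
      if (numUsableCores > 0) numUsableCores
      else Runtime.getRuntime.availableProcessors()
    math.min(cores, 8)
  }

  // Try spark.[role].[module].io.[suffix] first (when a role is given),
  // then spark.[module].io.[suffix]; otherwise fall back to the default.
  def resolveThreads(
      conf: Map[String, String],
      module: String,
      suffix: String,
      numUsableCores: Int,
      role: Option[String]): Int = {
    val candidates =
      role.map(r => s"spark.$r.$module.io.$suffix").toSeq :+
        s"spark.$module.io.$suffix"
    candidates
      .flatMap(conf.get)     // keep only the properties that are set
      .headOption            // first match wins
      .map(_.toInt)
      .getOrElse(defaultNumThreads(numUsableCores))
  }
}
```

For example, with both `spark.driver.shuffle.io.serverThreads` and `spark.shuffle.io.serverThreads` set, a `role` of `Some("driver")` makes the role-qualified property win.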
In the end, `fromSparkConf` creates a `TransportConf` (for the given `module` and the updated `SparkConf`).
`fromSparkConf` is used when:

* `SparkEnv` utility is used to create a SparkEnv (with the `spark.shuffle.service.enabled` configuration property enabled)
* `ExternalShuffleService` is created
* `NettyBlockTransferService` is requested to `init`
* `NettyRpcEnv` is created and requested for a `downloadClient`
* `IndexShuffleBlockResolver` is created
* `ShuffleBlockPusher` is requested to `initiateBlockPush`
* `BlockManager` is requested to `readDiskBlockFromSameHostExecutor`