# SparkTransportConf Utility

## fromSparkConf
```scala
fromSparkConf(
  _conf: SparkConf,
  module: String, // (1)
  numUsableCores: Int = 0,
  role: Option[String] = None): TransportConf // (2)
```
1. The given `module` is `shuffle` most of the time, except:
    - `rpc` for NettyRpcEnv
    - `files` for NettyRpcEnv
2. Only defined in NettyRpcEnv to be either `driver` or `executor`
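For illustration, a minimal sketch of a call (hedged: SparkTransportConf is a `private[spark]` utility in `org.apache.spark.network.netty`, so this only compiles from Spark-internal code, and the module name, core count and role are example values):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.network.netty.SparkTransportConf
import org.apache.spark.network.util.TransportConf

val conf = new SparkConf()
// Example: the shuffle module on an executor (roughly what NettyBlockTransferService does)
val transportConf: TransportConf = SparkTransportConf.fromSparkConf(
  conf,
  module = "shuffle",
  numUsableCores = 8,
  role = Some("executor"))
```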
fromSparkConf makes a copy (clones) of the given SparkConf.
fromSparkConf sets the following configuration properties (for the given module):
- spark.[module].io.serverThreads
- spark.[module].io.clientThreads
The values are looked up using the following properties, in order, until one is found (with suffix being serverThreads or clientThreads, respectively):
- spark.[role].[module].io.[suffix]
- spark.[module].io.[suffix]
If neither is found, fromSparkConf falls back to the default number of threads (based on the given numUsableCores, capped at 8).
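The lookup order can be sketched as follows (illustrative only; `defaultNumThreads` and `resolveThreads` are hypothetical helper names modeled on the behavior described above, not the actual Spark source):

```scala
import org.apache.spark.SparkConf

// Default thread count: the given numUsableCores (or all available cores if 0), capped at 8.
def defaultNumThreads(numUsableCores: Int): Int = {
  val available =
    if (numUsableCores > 0) numUsableCores
    else Runtime.getRuntime.availableProcessors()
  math.min(available, 8)
}

// Resolves spark.[role].[module].io.[suffix] first, then spark.[module].io.[suffix],
// and finally falls back to the default number of threads.
def resolveThreads(
    conf: SparkConf,
    role: Option[String],
    module: String,
    suffix: String,
    numUsableCores: Int): String = {
  role.flatMap(r => conf.getOption(s"spark.$r.$module.io.$suffix"))
    .orElse(conf.getOption(s"spark.$module.io.$suffix"))
    .getOrElse(defaultNumThreads(numUsableCores).toString)
}
```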
In the end, fromSparkConf creates a TransportConf (for the given module and the updated SparkConf).
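A minimal sketch of that last step, assuming the `TransportConf(String, ConfigProvider)` constructor and the ConfigProvider contract from `org.apache.spark.network.util` (the anonymous provider simply delegates lookups to the cloned SparkConf; `toTransportConf` is a hypothetical helper name):

```scala
import scala.collection.JavaConverters._

import org.apache.spark.SparkConf
import org.apache.spark.network.util.{ConfigProvider, TransportConf}

def toTransportConf(module: String, conf: SparkConf): TransportConf = {
  // The ConfigProvider delegates every lookup to the (already updated) SparkConf.
  new TransportConf(module, new ConfigProvider {
    override def get(name: String): String = conf.get(name)
    override def get(name: String, defaultValue: String): String = conf.get(name, defaultValue)
    override def getAll(): java.lang.Iterable[java.util.Map.Entry[String, String]] =
      conf.getAll.toMap.asJava.entrySet()
  })
}
```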
fromSparkConf is used when:
- SparkEnv utility is used to create a SparkEnv (with the spark.shuffle.service.enabled configuration property enabled)
- ExternalShuffleService is created
- NettyBlockTransferService is requested to init
- NettyRpcEnv is created and requested for a downloadClient
- IndexShuffleBlockResolver is created
- ShuffleBlockPusher is requested to initiateBlockPush
- BlockManager is requested to readDiskBlockFromSameHostExecutor