
Configuration Properties

spark.pyspark.driver.python

Python binary executable to use for PySpark in the driver only (falls back to spark.pyspark.python when undefined)

Default: (undefined)

spark.pyspark.python

Python binary executable to use for PySpark in both the driver and the executors

Default: (undefined)
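
A minimal sketch of how these two executables are usually chosen. They have to be known before the driver and worker processes start, so they are typically passed to spark-submit rather than set programmatically; the interpreter paths and the application name below are placeholders, not values from this page.

```python
# Typically supplied on the command line before the driver starts, e.g.:
#
#   spark-submit \
#     --conf spark.pyspark.python=/opt/python3.9/bin/python3 \
#     --conf spark.pyspark.driver.python=/usr/local/bin/python3 \
#     app.py
#
# Inside the running application you can then confirm what was configured:
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
print(spark.conf.get("spark.pyspark.python", "not set"))
print(spark.conf.get("spark.pyspark.driver.python", "not set"))
spark.stop()
```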

spark.python.use.daemon

Because forking processes from Java is expensive, Spark prefers to launch a single Python daemon, pyspark/daemon.py (by default), and tell it to fork new workers for its tasks. The daemon currently only works on UNIX-based systems because it uses signals for child management, so on other systems Spark falls back to launching the workers, pyspark/worker.py (by default), directly.

Default: true (forced to false on Windows, where the daemon is not supported)

Used when PythonWorkerFactory is created
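
A hedged sketch of turning the daemon off, e.g. to reproduce the direct-launch code path described above on a UNIX system; the property name comes from this page, while the application name and the sample job are illustrative.

```python
from pyspark import SparkConf, SparkContext

# Ask PythonWorkerFactory to launch pyspark/worker.py processes directly
# instead of forking them from the pyspark/daemon.py process.
conf = (
    SparkConf()
    .setAppName("no-python-daemon-demo")
    .set("spark.python.use.daemon", "false")
)

sc = SparkContext(conf=conf)
# Each Python task now gets its own worker process rather than a fork of the daemon.
print(sc.parallelize(range(4), 2).map(lambda x: x * x).collect())
sc.stop()
```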

spark.python.daemon.module

The Python module to run as the daemon process (only used when spark.python.use.daemon is enabled)

Default: pyspark.daemon

Used when PythonWorkerFactory is created

spark.python.worker.module

The Python module to run as a worker process (only used when the daemon is disabled)

Default: (undefined) (PythonWorkerFactory falls back to pyspark.worker)

Used when PythonWorkerFactory is created
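
A sketch of overriding both modules, assuming custom wrappers exist; myproject.daemon and myproject.worker are hypothetical modules (not from this page) that would typically delegate to pyspark.daemon and pyspark.worker, and they must be importable on every executor.

```python
from pyspark import SparkConf, SparkContext

# Point PythonWorkerFactory at custom daemon/worker modules.
# "myproject.daemon" / "myproject.worker" are hypothetical wrappers around
# pyspark.daemon / pyspark.worker (e.g. adding logging around worker startup).
conf = (
    SparkConf()
    .setAppName("custom-python-modules")
    .set("spark.python.daemon.module", "myproject.daemon")   # used while the daemon is enabled
    .set("spark.python.worker.module", "myproject.worker")   # used when the daemon is disabled
)

sc = SparkContext(conf=conf)
```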

