= Spark Architecture

Spark uses a master/worker architecture. There is a driver that talks to a single coordinator called the master, which manages workers on which executors run.

.Spark architecture
image::driver-sparkcontext-clustermanager-workers-executors.png[align="center"]

The driver and the executors run in their own Java processes. You can run them all on the same machine (horizontal cluster), on separate machines (vertical cluster), or in a mixed machine configuration.
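As a minimal sketch of how these pieces connect: the driver process starts by building a `SparkSession` (and, through it, a `SparkContext`), and the master URL it passes decides where the executors run. The application name below is illustrative, and the snippet assumes Spark is on the classpath.

```scala
import org.apache.spark.sql.SparkSession

// The driver process creates the SparkSession; the master URL tells Spark
// where the executors should run.
val spark = SparkSession.builder()
  .appName("ArchitectureDemo") // illustrative name, not from the text
  // "local[*]" runs the driver and executors inside a single JVM on this
  // machine; a URL such as "spark://host:7077" would instead ask a
  // standalone master to launch executors on its workers.
  .master("local[*]")
  .getOrCreate()

// Work submitted through the SparkContext is split into tasks that the
// executors execute.
val total = spark.sparkContext.parallelize(1 to 100).sum()
println(total) // prints 5050.0

spark.stop()
```

Switching the master URL (for example to a standalone, YARN, or Kubernetes cluster) changes where executors are placed without changing the application code.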

.Spark architecture in detail
image::sparkapp-sparkcontext-master-slaves.png[align="center"]

Physical machines are called hosts or nodes.

Last update: 2020-10-06