== [[PushDownOperatorsToDataSource]] PushDownOperatorsToDataSource Logical Optimization

`PushDownOperatorsToDataSource` is a *logical optimization* that pushes down operators (e.g. filters and required columns) to the underlying data sources.

Technically, `PushDownOperatorsToDataSource` is a Catalyst rule for transforming logical plans, i.e. a `Rule[LogicalPlan]`.

`PushDownOperatorsToDataSource` is part of the *Push down operators to data source scan* once-executed rule batch of the `SparkOptimizer`.
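A `Rule[LogicalPlan]` is essentially a plan-to-plan function. The following is a minimal sketch of that shape using toy plan nodes (`Relation`, `Project`, `PrunedRelation` and the `PushDownProject` rule are stand-ins invented for this example, not Spark's Catalyst classes):

[source, scala]
----
// Toy plan nodes -- NOT Spark's Catalyst classes
sealed trait Plan
case class Relation(columns: Seq[String]) extends Plan
case class Project(columns: Seq[String], child: Plan) extends Plan
case class PrunedRelation(columns: Seq[String]) extends Plan

// A Catalyst-style rule is essentially a function Plan => Plan
trait Rule[T] { def apply(plan: T): T }

// A toy push-down rule: collapse Project-over-Relation into a scan
// that reads only the projected columns
object PushDownProject extends Rule[Plan] {
  def apply(plan: Plan): Plan = plan match {
    case Project(cols, Relation(_)) => PrunedRelation(cols)
    case other                      => other
  }
}

val optimized = PushDownProject(Project(Seq("id"), Relation(Seq("id", "name", "age"))))
// optimized == PrunedRelation(Seq("id"))
----

The real rule operates on `DataSourceV2Relation` scans, but the overall contract is the same: match a plan shape, return a rewritten plan, and leave everything else untouched.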
=== [[apply]] Executing Rule -- `apply` Method

[source, scala]
----
apply(plan: LogicalPlan): LogicalPlan
----

`apply`...FIXME

`apply` is part of the `Rule` abstraction.
=== [[pushDownRequiredColumns]] `pushDownRequiredColumns` Internal Method

[source, scala]
----
pushDownRequiredColumns(plan: LogicalPlan, requiredByParent: AttributeSet): LogicalPlan
----

`pushDownRequiredColumns` branches off per the type of the input logical operator:
. For a `Project` unary logical operator, `pushDownRequiredColumns` takes the references of the project expressions as the required columns and executes itself recursively on the child logical operator. Note that the input `requiredByParent` attributes are not considered in the required columns.
. For a `Filter` unary logical operator, `pushDownRequiredColumns` adds the references of the filter condition to the input `requiredByParent` attributes and executes itself recursively on the child logical operator
. For a `DataSourceV2Relation` logical operator, `pushDownRequiredColumns`...FIXME
. For other logical operators, `pushDownRequiredColumns` simply executes itself recursively (using `TreeNode.mapChildren`) on the child logical operators
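The branching above can be sketched as a top-down recursion that accumulates required columns on the way to the leaves. All types here are simplified stand-ins (plain `Set[String]` instead of `AttributeSet`, toy `Project`/`Filter`/`Relation` case classes instead of Spark's operators):

[source, scala]
----
// Toy plan nodes -- NOT Spark's Catalyst classes
sealed trait Plan
case class Relation(output: Set[String], required: Set[String]) extends Plan
case class Filter(references: Set[String], child: Plan) extends Plan
case class Project(projectList: Set[String], child: Plan) extends Plan

def pushDownRequiredColumns(plan: Plan, requiredByParent: Set[String]): Plan =
  plan match {
    // Project: only the project expressions' references matter;
    // requiredByParent is deliberately ignored
    case Project(list, child) =>
      Project(list, pushDownRequiredColumns(child, list))
    // Filter: add the condition's references to what the parent requires
    case Filter(refs, child) =>
      Filter(refs, pushDownRequiredColumns(child, requiredByParent ++ refs))
    // Leaf relation: record the columns the scan has to produce
    case Relation(output, _) =>
      Relation(output, requiredByParent)
  }

val plan = Project(Set("id"), Filter(Set("age"), Relation(Set("id", "name", "age"), Set.empty)))
val pruned = pushDownRequiredColumns(plan, Set.empty)
// the leaf Relation now requires only {id, age}; "name" is pruned away
----

Note how the `Project` case restarts the required set from its own expressions while the `Filter` case only grows it -- that is why a projection acts as a column-pruning boundary.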
`pushDownRequiredColumns` is used when `PushDownOperatorsToDataSource` logical optimization is requested to <<apply, execute>>.
=== [[FilterAndProject]][[unapply]] Destructuring Logical Operator -- `FilterAndProject.unapply` Method

[source, scala]
----
unapply(plan: LogicalPlan): Option[(Seq[NamedExpression], Expression, DataSourceV2Relation)]
----

`unapply` is part of the `FilterAndProject` extractor object to destructure the input logical plan.

`unapply` works with (i.e. matches) the following logical operators:
. For a `Filter` with a `DataSourceV2Relation` leaf logical operator, `unapply`...FIXME
. For a `Filter` with a `Project` over a `DataSourceV2Relation`, `unapply`...FIXME
. For others, `unapply` returns `None` (i.e. does nothing / does not match)
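`FilterAndProject` follows Scala's standard extractor-object pattern: an `unapply` lets a pattern match destructure only the plan shapes it recognizes and fall through on everything else. The sketch below uses toy types (the classes and the choice to return all of the relation's columns in the no-`Project` case are assumptions of this example, not Spark's behavior):

[source, scala]
----
// Toy plan nodes -- NOT Spark's Catalyst classes
sealed trait Plan
case class Relation(name: String, columns: Seq[String]) extends Plan
case class Filter(condition: String, child: Plan) extends Plan
case class Project(columns: Seq[String], child: Plan) extends Plan

object FilterAndProject {
  def unapply(plan: Plan): Option[(Seq[String], String, Relation)] = plan match {
    // a Filter directly over a relation: assume all columns are projected
    case Filter(cond, r: Relation)                => Some((r.columns, cond, r))
    // a Filter over a Project over a relation
    case Filter(cond, Project(cols, r: Relation)) => Some((cols, cond, r))
    // anything else does not match
    case _                                        => None
  }
}

val matched = Filter("age > 21", Project(Seq("id"), Relation("people", Seq("id", "age")))) match {
  case FilterAndProject(cols, cond, rel) => s"$cond over ${rel.name}(${cols.mkString(",")})"
  case _                                 => "no match"
}
// matched == "age > 21 over people(id)"
----

Returning `Option` is what makes the last bullet work: a `None` from `unapply` simply means the `case FilterAndProject(...)` pattern does not match and the match moves on.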
NOTE: `unapply` is used exclusively when `PushDownOperatorsToDataSource` logical optimization is requested to <<apply, execute>>.