Standard Functions
org.apache.spark.sql.functions object defines built-in standard functions to work with (values produced by) columns.
You can access the standard functions using the following import statement in your Scala application:
import org.apache.spark.sql.functions._
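Once imported, the standard functions can be used directly in Column expressions. A minimal sketch (assuming an active Spark installation; the `local[*]` master and the literal value are made up for illustration):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// upper and lit are standard functions from the functions object
spark.range(1)
  .select(upper(lit("spark")) as "value")
  .show() // shows a one-row column with SPARK
```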
udaf

udaf[IN: TypeTag, BUF, OUT](
  agg: Aggregator[IN, BUF, OUT]): UserDefinedFunction // (1)!
udaf[IN, BUF, OUT](
  agg: Aggregator[IN, BUF, OUT],
  inputEncoder: Encoder[IN]): UserDefinedFunction

- Uses an ExpressionEncoder of the IN type
udaf creates a UserDefinedAggregator with the given Aggregator and Encoder.
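A minimal sketch of wrapping an Aggregator with udaf (assuming Spark 3.x; MyAverage, df and the value column are made-up names for illustration):

```scala
import org.apache.spark.sql.{Encoder, Encoders}
import org.apache.spark.sql.expressions.Aggregator
import org.apache.spark.sql.functions.{col, udaf}

// Hypothetical Aggregator: running average of Double values
// with a (sum, count) pair as the aggregation buffer
object MyAverage extends Aggregator[Double, (Double, Long), Double] {
  def zero: (Double, Long) = (0.0, 0L)
  def reduce(b: (Double, Long), a: Double): (Double, Long) =
    (b._1 + a, b._2 + 1)
  def merge(b1: (Double, Long), b2: (Double, Long)): (Double, Long) =
    (b1._1 + b2._1, b1._2 + b2._2)
  def finish(r: (Double, Long)): Double = r._1 / r._2
  def bufferEncoder: Encoder[(Double, Long)] =
    Encoders.tuple(Encoders.scalaDouble, Encoders.scalaLong)
  def outputEncoder: Encoder[Double] = Encoders.scalaDouble
}

// udaf turns the typed Aggregator into an untyped
// UserDefinedFunction usable with Columns
val myAvg = udaf(MyAverage)
df.agg(myAvg(col("value")))
```

Since the first udaf variant requires a TypeTag for the IN type, the second variant (with an explicit input Encoder) is for input types with no TypeTag available (e.g. Row).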
Creating AggregateExpression for AggregateFunction

withAggregateFunction(
  func: AggregateFunction,
  isDistinct: Boolean = false): Column
withAggregateFunction requests the given AggregateFunction to convert itself to an AggregateExpression (toAggregateExpression) with the given isDistinct flag.
In the end, withAggregateFunction creates a Column for the AggregateExpression.
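The two steps above can be sketched as follows (a paraphrase of the private helper in the functions object, not a verbatim copy of the Spark sources):

```scala
import org.apache.spark.sql.Column
import org.apache.spark.sql.catalyst.expressions.aggregate.AggregateFunction

// Sketch: wrap an AggregateFunction in an AggregateExpression
// and expose it as a Column
private def withAggregateFunction(
    func: AggregateFunction,
    isDistinct: Boolean = false): Column = {
  Column(func.toAggregateExpression(isDistinct))
}
```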