BoundReference¶
`BoundReference` is a leaf expression that evaluates to the value (of the given data type) at the specified position in the given `InternalRow`.
Creating Instance¶
`BoundReference` takes the following to be created:

- Position (`ordinal`)
- Data type of values
- `nullable` flag
`BoundReference` is created when:

- `Encoders` utility is used to create a generic `ExpressionEncoder`
- `ScalaReflection` utility is used to `serializerForType`
- `ExpressionEncoder` utility is used to `tuple`
- `RowEncoder` utility is used to create a `RowEncoder`
- `BindReferences` utility is used to bind an `AttributeReference`
- `UnsafeProjection` utility is used to create an `UnsafeProjection`
- others
Code-Generated Expression Evaluation¶
```scala
doGenCode(
  ctx: CodegenContext,
  ev: ExprCode): ExprCode
```
`doGenCode` is part of the Expression abstraction.
`doGenCode` generates Java source code that reads the value at the position (`ordinal`) from the input `InternalRow`, using the accessor that matches the data type and guarding the access with a null check when the expression is `nullable`.
```scala
import org.apache.spark.sql.catalyst.expressions.BoundReference
import org.apache.spark.sql.types.LongType

val boundRef = BoundReference(ordinal = 0, dataType = LongType, nullable = true)

// doGenCode is used when Expression.genCode is executed
import org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext
val ctx = new CodegenContext
val code = boundRef.genCode(ctx).code

scala> println(code)
boolean isNull_0 = i.isNullAt(0);
long value_0 = isNull_0 ?
-1L : (i.getLong(0));
```
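The shape of the generated snippet above can be sketched in plain Scala. The following is a simplified model with names of my own (`genJava`, `defaultValue` are illustrative, not Spark's implementation): choose a Java accessor by data type, and wrap the access in a null check only when the reference is nullable.

```scala
// Simplified model (not Spark's code) of what doGenCode emits for a BoundReference.
// A nullable reference gets an isNullAt guard plus a type-appropriate default;
// a non-nullable one reads the slot directly.
def defaultValue(javaType: String): String = javaType match {
  case "long"    => "-1L"
  case "int"     => "-1"
  case "boolean" => "false"
  case _         => "null"
}

def genJava(ordinal: Int, javaType: String, accessor: String, nullable: Boolean): String =
  if (nullable)
    s"""boolean isNull_0 = i.isNullAt($ordinal);
       |$javaType value_0 = isNull_0 ?
       |${defaultValue(javaType)} : (i.$accessor($ordinal));""".stripMargin
  else
    s"$javaType value_0 = i.$accessor($ordinal);"

// A nullable long at ordinal 0 matches the shape of the output shown above.
println(genJava(0, "long", "getLong", nullable = true))
```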
Interpreted Expression Evaluation¶
```scala
eval(
  input: InternalRow): Any
```
`eval` is part of the Expression abstraction.
`eval` gives the value at the position from the given `InternalRow`.

`eval` returns `null` if the value at the position is `null`. Otherwise, `eval` uses the data type-specific methods of `InternalRow` to access the value.
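The contract above can be sketched as a self-contained model. None of the classes below are Spark's (`Row`, `DataType`, `BoundRef` are minimal stand-ins of my own); the point is the null check followed by a dispatch on the data type to pick the right accessor.

```scala
// Minimal stand-ins (not Spark's classes) modeling BoundReference.eval:
// return null when the slot is null, otherwise read the value at the
// ordinal with the accessor matching the data type.
sealed trait DataType
case object LongType extends DataType
case object StringType extends DataType

final case class Row(values: Any*) {
  def isNullAt(i: Int): Boolean = values(i) == null
  def getLong(i: Int): Long = values(i).asInstanceOf[Long]
  def getString(i: Int): String = values(i).asInstanceOf[String]
}

final case class BoundRef(ordinal: Int, dataType: DataType, nullable: Boolean) {
  def eval(input: Row): Any =
    if (nullable && input.isNullAt(ordinal)) null
    else dataType match {
      case LongType   => input.getLong(ordinal)
      case StringType => input.getString(ordinal)
    }
}

val row = Row(1L, null)
assert(BoundRef(0, LongType, nullable = false).eval(row) == 1L)
assert(BoundRef(1, StringType, nullable = true).eval(row) == null)
```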
String Representation¶
```scala
toString: String
```

`toString` is part of the Expression abstraction.

`toString` is the following text:

```text
input[[ordinal], [dataType], [nullable]]
```
Catalyst DSL¶
Catalyst DSL's `at` can be used to create a `BoundReference`.
```scala
import org.apache.spark.sql.catalyst.dsl.expressions._
val boundRef = 'id.string.at(4)

import org.apache.spark.sql.catalyst.expressions.BoundReference
assert(boundRef.isInstanceOf[BoundReference])
```
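The DSL trick above (a `Symbol` gaining `string` and then `at`) can be sketched with a plain implicit class. The following is a hand-rolled illustration of the pattern, not Spark's actual DSL code, and all the class names in it are my own:

```scala
// Hand-rolled sketch of the Catalyst DSL pattern (not Spark's implementation):
// an implicit class gives a Symbol a data type, and the typed attribute can
// then be bound to an ordinal with `at`.
sealed trait DataType
case object StringType extends DataType

final case class Bound(ordinal: Int, dataType: DataType)

final case class Attr(name: String, dataType: DataType) {
  // Binding an attribute to a position yields a bound reference.
  def at(ordinal: Int): Bound = Bound(ordinal, dataType)
}

implicit class SymbolOps(val s: Symbol) {
  def string: Attr = Attr(s.name, StringType)
}

// Symbol("id") rather than the 'id literal, which Scala 3 removed.
val boundRef = Symbol("id").string.at(4)
assert(boundRef == Bound(4, StringType))
```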
Demo¶
```scala
import org.apache.spark.sql.catalyst.expressions.BoundReference
import org.apache.spark.sql.types.LongType

val boundRef = BoundReference(ordinal = 0, dataType = LongType, nullable = true)

scala> println(boundRef)
input[0, bigint, true]

import org.apache.spark.sql.catalyst.InternalRow
val row = InternalRow(1L, "hello")
val value = boundRef.eval(row).asInstanceOf[Long]
assert(value == 1L)
```