# DeltaOptions
`DeltaOptions` is a type-safe abstraction of the supported write and read options. `DeltaOptions` is used to create the `WriteIntoDelta` command, `DeltaSink` and `DeltaSource`.
```scala
import org.apache.spark.sql.delta.DeltaOptions

assert(DeltaOptions.OVERWRITE_SCHEMA_OPTION == "overwriteSchema")
```

```scala
val options = new DeltaOptions(Map.empty[String, String], spark.sessionState.conf)
assert(options.failOnDataLoss, "failOnDataLoss should be enabled by default")
```

```scala
val options = new DeltaOptions(
  Map(DeltaOptions.OVERWRITE_SCHEMA_OPTION -> true.toString),
  spark.sessionState.conf)
assert(
  options.canOverwriteSchema,
  s"${DeltaOptions.OVERWRITE_SCHEMA_OPTION} should be enabled")
```
## Creating Instance
`DeltaOptions` takes the following to be created:

- Case-insensitive options (`CaseInsensitiveMap[String]`)
- `SQLConf` (Spark SQL)

When created, `DeltaOptions` verifies the options.
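The options are wrapped in a `CaseInsensitiveMap`, so the casing of option keys does not matter. A minimal sketch (assuming an active `SparkSession` named `spark`):

```scala
import org.apache.spark.sql.delta.DeltaOptions

// Mixed-case key on purpose: DeltaOptions stores options case-insensitively,
// so "OVERWRITESCHEMA" resolves to the overwriteSchema write option.
val options = new DeltaOptions(
  Map("OVERWRITESCHEMA" -> "true"),
  spark.sessionState.conf)
assert(options.canOverwriteSchema)
```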
## Verifying Options
```scala
verifyOptions(
  options: CaseInsensitiveMap[String]): Unit
```

`verifyOptions` finds invalid options among the input `options`.
Note

In the open-source version, `verifyOptions` effectively does nothing: the underlying objects (`recordDeltaEvent` and the others) are no-ops.
`verifyOptions` is used when:

- `DeltaOptions` is created
- `DeltaDataSource` is requested for a relation (to load data in batch queries)
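As a sketch of the second case (assuming an active `SparkSession` named `spark` and a Delta table at the illustrative path `/tmp/delta/events`), a plain batch read resolves a relation through `DeltaDataSource`, which creates a `DeltaOptions` and so runs `verifyOptions`:

```scala
// Loading a Delta table in a batch query goes through DeltaDataSource,
// which builds a DeltaOptions from the reader options under the covers.
val events = spark.read
  .format("delta")
  .load("/tmp/delta/events")
```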
## Serializable

`DeltaOptions` is `Serializable` (Java) so it can be used in Spark tasks.
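A quick sketch (assuming an active `SparkSession` named `spark`) to confirm the marker interface:

```scala
import org.apache.spark.sql.delta.DeltaOptions

val options = new DeltaOptions(Map.empty[String, String], spark.sessionState.conf)
// Being java.io.Serializable, DeltaOptions instances can be captured in
// task closures that Spark ships to executors.
assert(options.isInstanceOf[java.io.Serializable])
```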