DeltaCatalog¶
DeltaCatalog is a DelegatingCatalogExtension (Spark SQL) and a StagingTableCatalog (Spark SQL).

DeltaCatalog is registered using the spark.sql.catalog.spark_catalog configuration property (while creating a SparkSession in a Spark application).
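The following is a minimal sketch of registering DeltaCatalog while building a SparkSession (the application name and master are placeholders; spark.sql.extensions is the companion setting of a standard Delta Lake setup):

```scala
import org.apache.spark.sql.SparkSession

// Register DeltaCatalog as the session catalog (spark_catalog)
// together with the Delta SQL extensions.
val spark = SparkSession.builder()
  .appName("delta-catalog-demo") // placeholder name
  .master("local[*]")            // placeholder master
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .getOrCreate()
```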
Altering Table¶
```scala
alterTable(
  ident: Identifier,
  changes: TableChange*): Table
```
alterTable ...FIXME

alterTable is part of the TableCatalog (Spark SQL) abstraction.
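As a sketch (assuming DeltaCatalog is registered as the session catalog and a delta table exists; the table name is a placeholder), an ALTER TABLE statement is one way to reach alterTable:

```scala
// Altering a delta table's properties goes through the TableCatalog.alterTable API
// of the session catalog, i.e. DeltaCatalog.
// `demo_users` is a placeholder table name; `delta.appendOnly` is a Delta table property.
spark.sql("ALTER TABLE demo_users SET TBLPROPERTIES ('delta.appendOnly' = 'true')")
```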
Creating Table¶
```scala
createTable(
  ident: Identifier,
  schema: StructType,
  partitions: Array[Transform],
  properties: util.Map[String, String]): Table
```
createTable ...FIXME

createTable is part of the TableCatalog (Spark SQL) abstraction.
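As a sketch (the table name and schema are placeholders), a CREATE TABLE statement with the delta provider is one way to reach createTable:

```scala
// With DeltaCatalog registered as spark_catalog, a plain CREATE TABLE
// with the delta provider is handled by the session catalog, i.e. DeltaCatalog.
spark.sql("""
  CREATE TABLE demo_users (id LONG, name STRING, country STRING)
  USING delta
  PARTITIONED BY (country)
""")
```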
Loading Table¶
```scala
loadTable(
  ident: Identifier): Table
```
loadTable loads a table by the given identifier from a catalog.

If found and the table is a delta table (Spark SQL's V1Table with the delta provider), loadTable creates a DeltaTableV2.

loadTable is part of the TableCatalog (Spark SQL) abstraction.
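A sketch of requesting loadTable directly, e.g. in spark-shell. It relies on the internal spark.sessionState.catalogManager API (subject to change across Spark versions) and assumes the placeholder demo_users table from the earlier example:

```scala
import org.apache.spark.sql.connector.catalog.{Identifier, TableCatalog}
import org.apache.spark.sql.delta.catalog.DeltaTableV2

// The current catalog is the session catalog (spark_catalog), i.e. DeltaCatalog.
val catalog = spark.sessionState.catalogManager.currentCatalog.asInstanceOf[TableCatalog]

// Loading a delta table gives a DeltaTableV2.
val table = catalog.loadTable(Identifier.of(Array("default"), "demo_users"))
assert(table.isInstanceOf[DeltaTableV2])
```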
Creating Delta Table¶
```scala
createDeltaTable(
  ident: Identifier,
  schema: StructType,
  partitions: Array[Transform],
  properties: util.Map[String, String],
  sourceQuery: Option[LogicalPlan],
  operation: TableCreationModes.CreationMode): Table
```
createDeltaTable ...FIXME

createDeltaTable is used when:

* DeltaCatalog is requested to createTable
* StagedDeltaTableV2 is requested to commitStagedChanges
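A sketch of the staging path, assuming that CREATE OR REPLACE TABLE with the delta provider goes through the StagingTableCatalog side of DeltaCatalog (the table name is a placeholder):

```scala
// CREATE OR REPLACE TABLE with the delta provider is staged by DeltaCatalog
// (a StagedDeltaTableV2 that eventually commits its staged changes),
// which ends up in createDeltaTable.
spark.sql("""
  CREATE OR REPLACE TABLE demo_users (id LONG, name STRING, country STRING)
  USING delta
""")
```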