
DeltaCatalog

DeltaCatalog is a DelegatingCatalogExtension (Spark SQL) and a StagingTableCatalog (Spark SQL).

DeltaCatalog is registered using the spark.sql.catalog.spark_catalog configuration property (while creating a SparkSession in a Spark application).
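
A minimal SparkSession setup that registers DeltaCatalog (together with the Delta SQL extensions) could look as follows:

import org.apache.spark.sql.SparkSession

// Register DeltaCatalog as the spark_catalog session catalog
// and enable the Delta SQL extensions
val spark = SparkSession.builder()
  .appName("DeltaCatalogDemo")
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config(
    "spark.sql.catalog.spark_catalog",
    "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .getOrCreate()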

Altering Table

alterTable(
  ident: Identifier,
  changes: TableChange*): Table

alterTable is part of the TableCatalog (Spark SQL) abstraction.

alterTable loads the table and continues only when it is a DeltaTableV2. For any other table, alterTable delegates to the parent TableCatalog.

alterTable...FIXME
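
For illustration only (demo_table is a hypothetical delta table registered in the session catalog), ALTER TABLE statements on a delta table are resolved through alterTable:

// ALTER TABLE on a delta table goes through DeltaCatalog.alterTable
// (demo_table is a hypothetical delta table)
spark.sql("ALTER TABLE demo_table ADD COLUMNS (extra_column STRING)")
spark.sql("ALTER TABLE demo_table SET TBLPROPERTIES ('delta.appendOnly' = 'false')")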

Creating Table

createTable(
  ident: Identifier,
  schema: StructType,
  partitions: Array[Transform],
  properties: util.Map[String, String]): Table

createTable is part of the TableCatalog (Spark SQL) abstraction.

createTable...FIXME
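
For illustration only (demo_table and its schema are hypothetical), a CREATE TABLE statement with the delta provider is handled by createTable:

// CREATE TABLE with the delta provider is handled by DeltaCatalog.createTable
spark.sql("""
  CREATE TABLE demo_table (id LONG, name STRING)
  USING delta
  PARTITIONED BY (name)
""")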

Loading Table

loadTable(
  ident: Identifier): Table

loadTable is part of the TableCatalog (Spark SQL) abstraction.

loadTable loads a table by the given identifier from the catalog.

If found and the table is a delta table (a Spark SQL V1Table with the delta provider), loadTable creates a DeltaTableV2.
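
For illustration only (demo_table is a hypothetical delta table), any command that resolves a table identifier against the session catalog goes through loadTable, e.g. a simple scan:

// Resolving the table identifier goes through DeltaCatalog.loadTable,
// which returns a DeltaTableV2 for a delta table
spark.table("demo_table").show()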

Creating Delta Table

createDeltaTable(
  ident: Identifier,
  schema: StructType,
  partitions: Array[Transform],
  properties: util.Map[String, String],
  sourceQuery: Option[LogicalPlan],
  operation: TableCreationModes.CreationMode): Table

createDeltaTable...FIXME
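
As a rough sketch (demo_table and demo_copy are hypothetical names), a CREATE TABLE ... AS SELECT statement is one path that eventually reaches createDeltaTable with the query as sourceQuery:

// CTAS on a delta table carries the source query
// that createDeltaTable receives as sourceQuery
spark.sql("""
  CREATE TABLE demo_copy
  USING delta
  AS SELECT * FROM demo_table
""")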

createDeltaTable is used when:

DeltaCatalog is requested to create a table
StagedDeltaTableV2 is requested to commit staged changes
