# InsertIntoHiveTable Logical Command
InsertIntoHiveTable is a SaveAsHiveFile.md[logical command] that writes the result of executing a structured query to a Hive table.

InsertIntoHiveTable is created when:

* HiveAnalysis.md[HiveAnalysis] logical resolution rule is executed and resolves an ../InsertIntoTable.md[InsertIntoTable] logical operator with a HiveTableRelation.md[Hive table]
* CreateHiveTableAsSelectCommand.md[CreateHiveTableAsSelectCommand] logical command is executed
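As a rough sketch (using stub case classes, not Spark's actual operator classes), the resolution step works like a pattern-match rewrite: an InsertIntoTable operator whose target is a Hive table relation is replaced with the Hive-specific command.

```scala
// Stub types standing in for Spark's logical operators (illustration only).
sealed trait LogicalPlan
case class HiveTableRelation(tableName: String) extends LogicalPlan
case class InsertIntoTable(table: LogicalPlan, overwrite: Boolean) extends LogicalPlan
case class InsertIntoHiveTable(tableName: String, overwrite: Boolean) extends LogicalPlan

// A HiveAnalysis-like resolution rule: rewrite an insert into a Hive table
// relation into the Hive-specific data-writing command; leave other plans as-is.
def resolveHiveInsert(plan: LogicalPlan): LogicalPlan = plan match {
  case InsertIntoTable(HiveTableRelation(name), overwrite) =>
    InsertIntoHiveTable(name, overwrite)
  case other => other
}
```

The real rule works on full logical plans (with partitioning, output columns, and more), but the shape of the rewrite is the same.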
## Creating Instance
InsertIntoHiveTable takes the following to be created:
* Hive table metadata (CatalogTable)
* Partition keys with optional values (Map[String, Option[String]])
* Structured query (LogicalPlan)
* overwrite flag
* ifPartitionNotExists flag
* Output column names
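The Option[String] values in the partition keys map distinguish static from dynamic partitions: a defined value is known up front, while None means the partition value is resolved from the data being written. A plain-Scala sketch (the names are illustrative, not Spark API):

```scala
// Illustration only: a partition spec of type Map[String, Option[String]].
// Some(value) = static partition (value fixed in the INSERT statement),
// None        = dynamic partition (value computed per row at write time).
val partitionSpec: Map[String, Option[String]] = Map(
  "year"  -> Some("2019"),  // static partition value
  "month" -> None           // dynamic partition value
)

// Split the spec into static and dynamic partition keys.
val (staticPart, dynamicPart) = partitionSpec.partition { case (_, v) => v.isDefined }
```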
## run Method

```scala
run(
  sparkSession: SparkSession,
  child: SparkPlan): Seq[Row]
```

run is part of the ../DataWritingCommand.md#run[DataWritingCommand] contract.

run requests the input ../SparkSession.md[SparkSession] for the ../SparkSession.md#sharedState[SharedState] that is in turn requested for the ../SharedState.md#externalCatalog[ExternalCatalog].

run requests the ../SessionState.md[SessionState] for a new ../SessionState.md#newHadoopConf[Hadoop Configuration].

run HiveClientImpl.md#toHiveTable[converts the CatalogTable metadata to Hive's].

run SaveAsHiveFile.md#getExternalTmpPath[gets the external temporary path] for writing.

run then processInsert (the actual data writing).

run requests the input ../SparkSession.md[SparkSession] for the ../SparkSession.md#catalog[Catalog] that is requested to uncache the table.

run requests the input ../SparkSession.md[SparkSession] for the ../SparkSession.md#sessionState[SessionState], and then the SessionState for the ../SessionState.md#catalog[SessionCatalog] that is requested to invalidate the cache for the table.

In the end, run updates the table statistics.

## processInsert Internal Method

```scala
processInsert(
  sparkSession: SparkSession,
  externalCatalog: ExternalCatalog,
  hadoopConf: Configuration,
  tableDesc: TableDesc,
  tmpLocation: Path,
  child: SparkPlan): Unit
```

processInsert...FIXME

processInsert is used when the InsertIntoHiveTable logical command is executed.
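The post-write housekeeping that run performs can be sketched as an ordered sequence of steps. The sketch below uses stub functions (not Spark's classes) purely to make the ordering explicit: write first, then uncache, then invalidate the catalog cache, and update statistics last.

```scala
// Stub sketch of run's orchestration order (not Spark code).
val steps = scala.collection.mutable.Buffer[String]()

def processInsert(): Unit    = steps += "processInsert"    // write query result to the Hive table
def uncacheTable(): Unit     = steps += "uncacheTable"     // drop any cached data for the table
def invalidateCache(): Unit  = steps += "invalidateCache"  // invalidate the SessionCatalog cache entry
def updateTableStats(): Unit = steps += "updateStats"      // in the end, refresh table statistics

def run(): Unit = {
  processInsert()
  uncacheTable()
  invalidateCache()
  updateTableStats()
}

run()
```

The ordering matters: the table is written before any cached copies are dropped, so readers never see stale cached data after the command completes.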