Use :paste -raw to paste the following code in spark-shell:

```scala
// BEGIN
package org.apache.spark

import org.apache.spark.sql.hive.HiveUtils

object opener {
  def CONVERT_METASTORE_PARQUET = HiveUtils.CONVERT_METASTORE_PARQUET
}
// END
```
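Once pasted, the otherwise package-private configuration entry becomes reachable from the REPL. As an example of what you could do next (the .key accessor is a ConfigEntry member; the printed value below reflects how this entry is defined in HiveUtils):

```scala
scala> opener.CONVERT_METASTORE_PARQUET.key
// spark.sql.hive.convertMetastoreParquet
```

The trick works because opener is declared in the org.apache.spark package, which satisfies the private[spark] access modifier on HiveUtils internals.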
You should see one of the following INFO messages in the logs:

```text
Initializing HiveMetastoreConnection version [hiveMetastoreVersion] using Spark classes.
Initializing HiveMetastoreConnection version [hiveMetastoreVersion] using maven.
Initializing HiveMetastoreConnection version [hiveMetastoreVersion] using [jars]
```
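Which of the three messages you see follows from the spark.sql.hive.metastore.jars configuration property. A quick way to check it in spark-shell (a sketch, assuming a running SparkSession named spark):

```scala
// Requires an active SparkSession (as in spark-shell)
spark.conf.get("spark.sql.hive.metastore.jars")
// "builtin" (the default) => "using Spark classes."
// "maven"                 => "using maven."
// a classpath             => "using [jars]"
```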
In the end, newClientForMetadata requests the IsolatedClientLoader to IsolatedClientLoader.md#createClient[create a HiveClient].
newClientForMetadata is used when HiveExternalCatalog is requested for a HiveClient.
withHiveExternalCatalog sets the ../StaticSQLConf.md#spark.sql.catalogImplementation[spark.sql.catalogImplementation] configuration property to hive on the conf of the input SparkContext and returns the SparkContext.
NOTE: withHiveExternalCatalog is used when the deprecated HiveContext is created.
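The effect can be sketched as follows (a minimal sketch, not the actual implementation; the real method lives in org.apache.spark.sql.hive.HiveUtils, and SparkContext.conf is private[spark], so this only compiles from inside an org.apache.spark package, as with the opener trick above):

```scala
import org.apache.spark.SparkContext

// Sketch: mutate the SparkContext's conf before any SparkSession is built,
// so that the session picks the Hive-aware external catalog.
def withHiveExternalCatalog(sc: SparkContext): SparkContext = {
  sc.conf.set("spark.sql.catalogImplementation", "hive")
  sc
}
```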