```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder
  .appName("My Spark Application") // optional and will be autogenerated if not specified
  .master("local[*]")              // only for demo and testing purposes, use spark-submit instead
  .enableHiveSupport()             // self-explanatory, isn't it?
  .config("spark.sql.warehouse.dir", "target/spark-warehouse")
  .withExtensions { extensions =>
    extensions.injectResolutionRule { session => ... }
    extensions.injectOptimizerRule { session => ... }
  }
  .getOrCreate
```
You do not need any existing Hive installation to use Spark's Hive support. SparkSession automatically creates metastore_db in the current directory of the Spark application, as well as the directory configured by the spark.sql.warehouse.dir configuration property.
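A minimal sketch of this behavior (the application and table names are illustrative): running the snippet below in a project with the spark-hive dependency on the classpath creates metastore_db (an embedded Derby metastore) in the working directory and stores table data under the configured warehouse directory, with no Hive installation present.

```scala
import org.apache.spark.sql.SparkSession

object HiveSupportDemo extends App {
  val spark = SparkSession.builder
    .appName("Hive Support Demo") // hypothetical name
    .master("local[*]")           // demo only
    .enableHiveSupport()
    .config("spark.sql.warehouse.dir", "target/spark-warehouse")
    .getOrCreate()

  // Creating a managed table exercises both the metastore (metastore_db)
  // and the warehouse directory (target/spark-warehouse).
  spark.sql("CREATE TABLE IF NOT EXISTS demo (id INT) USING parquet")
  spark.sql("SHOW TABLES").show()

  spark.stop()
}
```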
hiveClassesArePresent returns true when the required Hive classes could be loaded, and false otherwise (i.e., when loading failed with a ClassNotFoundException or NoClassDefFoundError).
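A sketch of how such a presence check can be implemented, assuming a representative Hive class is probed reflectively (the class name below is one that Spark's Hive integration depends on; treat it as illustrative):

```scala
// Returns true if Hive classes are on the classpath; any linkage
// failure (missing class or failed initialization) yields false.
def hiveClassesOnClasspath: Boolean =
  try {
    Class.forName("org.apache.hadoop.hive.conf.HiveConf")
    true
  } catch {
    case _: ClassNotFoundException | _: NoClassDefFoundError => false
  }
```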