= SaveIntoDataSourceCommand Logical Command

SaveIntoDataSourceCommand is a runnable logical command (RunnableCommand) that, when executed, saves the rows of a structured query (a DataFrame) into a CreatableRelationProvider data source.

SaveIntoDataSourceCommand is created when DataSource is requested to create a logical command for writing (to a CreatableRelationProvider data source).

[[innerChildren]] SaveIntoDataSourceCommand returns the logical query plan when requested for the inner nodes (that should be shown as an inner nested tree of this node).

[source, scala]
----
// DEMO: Example with inner nodes that should be shown as an inner nested tree of this node
val lines = Seq("SaveIntoDataSourceCommand").toDF("line")

// NOTE: There are two CreatableRelationProviders: jdbc and kafka
// jdbc is simpler to use in spark-shell as it does not need --packages
val url = "jdbc:derby:memory:;databaseName=/tmp/test;create=true"
val requiredOpts = Map("url" -> url, "dbtable" -> "lines")

// Use overwrite SaveMode to make the demo reproducible
import org.apache.spark.sql.SaveMode.Overwrite
lines.write.options(requiredOpts).format("jdbc").mode(Overwrite).save

// Go to web UI's SQL tab and see the last executed query
----

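If you would rather not switch to the web UI, one alternative (an addition to the demo, not part of the original) is to register a QueryExecutionListener and print the analyzed plan of the save; SaveIntoDataSourceCommand should appear at the top with the query shown as its inner nested tree.

[source, scala]
----
// Sketch only: print the analyzed plan of a write instead of checking the web UI
import org.apache.spark.sql.execution.QueryExecution
import org.apache.spark.sql.util.QueryExecutionListener

val planPrinter = new QueryExecutionListener {
  override def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit =
    println(s"$funcName:\n${qe.analyzed.numberedTreeString}")
  override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit = ()
}
spark.listenerManager.register(planPrinter)

// Re-run the write above; the plan is printed once the save finishes
lines.write.options(requiredOpts).format("jdbc").mode(Overwrite).save
----
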
[[simpleString]] SaveIntoDataSourceCommand redacts the options for the simple description (the text representation of the command):

----
SaveIntoDataSourceCommand [dataSource], [redacted], [mode]
----

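The redaction relies on Spark's option-redaction facility. As a rough sketch (SQLConf is an internal API, so the exact method signature and the default redaction pattern may differ between Spark versions), options whose keys match spark.sql.redaction.options.regex, such as url, have their values replaced:

[source, scala]
----
// Sketch: redact option values the same way the command's text representation does
import org.apache.spark.sql.internal.SQLConf

val opts = Map(
  "url" -> "jdbc:derby:memory:;databaseName=/tmp/test;create=true",
  "dbtable" -> "lines")

// Values of keys matching spark.sql.redaction.options.regex come back redacted
println(SQLConf.get.redactOptions(opts))
----
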
=== [[run]] Executing Logical Command -- run Method

[source, scala]
----
run(sparkSession: SparkSession): Seq[Row]
----

NOTE: run is part of the RunnableCommand contract to execute (run) a logical command.

run simply requests the CreatableRelationProvider data source to save the rows of the structured query (as a DataFrame).

In the end, run returns an empty Seq[Row] (just to follow the signature and please the Scala compiler).

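As a rough illustration of that contract, the hypothetical helper below (the name saveIntoDataSource and the DataFrame parameter are simplifications for this sketch; the real command rebuilds the DataFrame from its logical query plan) shows the shape of the work run does.

[source, scala]
----
import org.apache.spark.sql.{DataFrame, Row, SaveMode, SparkSession}
import org.apache.spark.sql.sources.CreatableRelationProvider

// Hypothetical helper mirroring run: hand the rows over to the data source,
// then return an empty result
def saveIntoDataSource(
    spark: SparkSession,
    data: DataFrame,
    dataSource: CreatableRelationProvider,
    options: Map[String, String],
    mode: SaveMode): Seq[Row] = {
  // The data source decides how to honour the SaveMode and the options
  dataSource.createRelation(spark.sqlContext, mode, options, data)
  // Empty result, matching the Seq[Row] return type of RunnableCommand.run
  Seq.empty[Row]
}
----
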
=== Creating Instance

SaveIntoDataSourceCommand takes the following when created:

- [[query]] Logical query plan
- [[dataSource]] CreatableRelationProvider data source
- [[options]] Options (as Map[String, String])
- [[mode]] SaveMode

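For reference, a CreatableRelationProvider (the dataSource argument) can be as small as the following sketch; the class name NoopSourceProvider is made up for illustration and the relation it returns does nothing beyond echoing the schema.

[source, scala]
----
import org.apache.spark.sql.{DataFrame, SQLContext, SaveMode}
import org.apache.spark.sql.sources.{BaseRelation, CreatableRelationProvider}
import org.apache.spark.sql.types.StructType

// Hypothetical no-op provider; SaveIntoDataSourceCommand requests it to createRelation
class NoopSourceProvider extends CreatableRelationProvider {
  override def createRelation(
      sqlContext: SQLContext,
      mode: SaveMode,
      parameters: Map[String, String],
      data: DataFrame): BaseRelation = {
    println(s"save requested: mode=$mode options=$parameters schema=${data.schema.simpleString}")
    val ctx = sqlContext
    val outSchema = data.schema
    new BaseRelation {
      override def sqlContext: SQLContext = ctx
      override def schema: StructType = outSchema
    }
  }
}
----

Writing with format set to the provider's fully-qualified class name, e.g. df.write.format("some.package.NoopSourceProvider").mode("overwrite").save, should then be planned as a SaveIntoDataSourceCommand with this provider as the dataSource.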