= SaveIntoDataSourceCommand Logical Command

SaveIntoDataSourceCommand is a logical runnable command.

SaveIntoDataSourceCommand is <<creating-instance, created>> exclusively when DataSource is requested to create a logical command for writing (to a CreatableRelationProvider data source).

[[innerChildren]] SaveIntoDataSourceCommand returns the <<query, logical query plan>> when requested for the inner nodes (that should be shown as an inner nested tree of this node).

[source, scala]
----
// DEMO Example with inner nodes that should be shown as an inner nested tree of this node
val lines = Seq("SaveIntoDataSourceCommand").toDF("line")

// NOTE: There are two CreatableRelationProviders: jdbc and kafka
// jdbc is simpler to use in spark-shell as it does not need --packages
val url = "jdbc:derby:memory:;databaseName=/tmp/test;create=true"
val requiredOpts = Map("url" -> url, "dbtable" -> "lines")

// Use overwrite SaveMode to make the demo reproducible
import org.apache.spark.sql.SaveMode.Overwrite
lines.write.options(requiredOpts).format("jdbc").mode(Overwrite).save

// Go to web UI's SQL tab and see the last executed query
----

[[simpleString]] SaveIntoDataSourceCommand redacts the <<options, options>> in its simple (text) description:

----
SaveIntoDataSourceCommand [dataSource], [redacted], [mode]
----
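The following is a minimal, self-contained sketch of what redacting option values can look like. It is an illustration only, not the code SaveIntoDataSourceCommand itself uses; the `redactOptions` helper and the key pattern are hypothetical, loosely mirroring the spirit of the spark.redaction.regex configuration property.

[source, scala]
----
// Hypothetical helper (not part of Spark's API): masks values of options
// whose keys look sensitive, similar in spirit to Spark's redaction support
val sensitiveKey = "(?i)secret|password|token".r

def redactOptions(options: Map[String, String]): Map[String, String] =
  options.map { case (key, value) =>
    if (sensitiveKey.findFirstIn(key).isDefined) key -> "*********(redacted)"
    else key -> value
  }

redactOptions(Map(
  "url" -> "jdbc:derby:memory:;databaseName=/tmp/test;create=true",
  "password" -> "super-secret"))
// the password value is masked as *********(redacted), the url is left as-is
----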

=== [[run]] Executing Logical Command -- run Method

[source, scala]
----
run(
  sparkSession: SparkSession): Seq[Row]
----

NOTE: run is part of the RunnableCommand contract to execute (run) a logical command.

run simply requests the <<dataSource, CreatableRelationProvider data source>> to save the rows of a structured query (a DataFrame).

In the end, run returns an empty Seq[Row] (just to follow the signature and please the Scala compiler).
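As a rough illustration, the gist of run can be sketched as follows. This is a simplified standalone sketch, not the exact Spark source; the `runSketch` name is made up and it assumes the command's structured query is already available as a DataFrame.

[source, scala]
----
import org.apache.spark.sql.{DataFrame, Row, SaveMode, SparkSession}
import org.apache.spark.sql.sources.CreatableRelationProvider

// Sketch of what run boils down to: ask the CreatableRelationProvider to
// save the rows of the structured query and return an empty result
def runSketch(
    sparkSession: SparkSession,
    dataSource: CreatableRelationProvider,
    mode: SaveMode,
    options: Map[String, String],
    data: DataFrame): Seq[Row] = {
  dataSource.createRelation(sparkSession.sqlContext, mode, options, data)
  Seq.empty[Row] // just to follow the signature
}
----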

=== [[creating-instance]] Creating Instance

SaveIntoDataSourceCommand takes the following when created:

* [[query]] Logical query plan (LogicalPlan) of the structured query to save
* [[dataSource]] CreatableRelationProvider data source
* [[options]] Options (Map[String, String])
* [[mode]] SaveMode
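For illustration, the command is roughly of the following shape. This is a simplified sketch based on the parameters above, not necessarily the exact Spark source.

[source, scala]
----
import org.apache.spark.sql.SaveMode
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.sources.CreatableRelationProvider

// Simplified sketch of the constructor parameters (the real command also
// extends RunnableCommand, which is internal to Spark SQL)
case class SaveIntoDataSourceCommand(
    query: LogicalPlan,
    dataSource: CreatableRelationProvider,
    options: Map[String, String],
    mode: SaveMode)
----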