
PySpark: Dataframe To DB

This tutorial explains how to write data from a Spark dataframe into various types of databases (such as MySQL, SingleStore, and Teradata) using a JDBC connection.


Write To MySQL: This section explains (with examples) how to write a dataframe into a MySQL database using a JDBC connection. The "mysql-connector-java-8.0.11.jar" jar must be present in the Spark library path to write data to a MySQL database over JDBC. This jar can be downloaded from the MySQL website.

Replace the attributes below in the examples with values for your environment:
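The original example code is not reproduced on this page, so the following is a minimal sketch of a JDBC write to MySQL. The hostname, database, table, and credentials are placeholder assumptions; replace them with your own values.

```python
# Hypothetical connection details -- replace with values for your environment.
host = "mysql-host.example.com"
port = 3306
database = "testdb"

# Standard MySQL JDBC URL format: jdbc:mysql://<host>:<port>/<database>
jdbc_url = f"jdbc:mysql://{host}:{port}/{database}"

connection_properties = {
    "user": "testuser",          # placeholder username
    "password": "testpassword",  # placeholder password
    "driver": "com.mysql.cj.jdbc.Driver",  # driver class shipped in mysql-connector-java 8.x
}

def write_to_mysql(df, table, mode="append"):
    """Write a Spark dataframe to a MySQL table over JDBC.

    `mode` can be "append", "overwrite", "ignore", or "error".
    """
    df.write.jdbc(url=jdbc_url, table=table, mode=mode, properties=connection_properties)
```

For the driver class to be found, launch pyspark/spark-submit with `--jars mysql-connector-java-8.0.11.jar`, or place the jar in Spark's jars directory.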
Write To SingleStore: This section explains (with examples) how to write a dataframe into a SingleStore / MemSQL database using a JDBC connection.

Replace the attributes below in the examples with values for your environment:
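As with MySQL, the original example code is missing here, so this is a hedged sketch with placeholder connection details. SingleStore (formerly MemSQL) is compatible with the MySQL wire protocol, so the MySQL Connector/J driver can be used for the JDBC write.

```python
# Hypothetical SingleStore connection details -- replace with your own.
jdbc_url = "jdbc:mysql://singlestore-host.example.com:3306/testdb"

def write_to_singlestore(df, table, mode="overwrite"):
    """Write a Spark dataframe to SingleStore via the DataFrameWriter JDBC source."""
    (df.write.format("jdbc")
        .option("url", jdbc_url)
        .option("dbtable", table)            # target table name
        .option("user", "testuser")          # placeholder credentials
        .option("password", "testpassword")
        .option("driver", "com.mysql.cj.jdbc.Driver")  # MySQL driver works for SingleStore
        .mode(mode)
        .save())
```

This uses the `format("jdbc")` / `option(...)` style of the DataFrameWriter; it is equivalent to the `df.write.jdbc(...)` shorthand shown for MySQL.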
Write To Teradata: This section explains (with examples) how to write a dataframe into a Teradata database using a JDBC connection. The "terajdbc4.jar" jar must be present in the Spark library path to write data to Teradata over JDBC. This jar can be downloaded from the Teradata website.

Replace the attributes below in the examples with values for your environment:
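Again, the original example code is not included on this page; the sketch below assumes placeholder connection details. Teradata JDBC URLs take the form `jdbc:teradata://<host>/DATABASE=<db>`, and the driver class `com.teradata.jdbc.TeraDriver` comes from terajdbc4.jar.

```python
# Hypothetical Teradata connection details -- replace with your own.
jdbc_url = "jdbc:teradata://teradata-host.example.com/DATABASE=testdb"

connection_properties = {
    "user": "testuser",          # placeholder credentials
    "password": "testpassword",
    "driver": "com.teradata.jdbc.TeraDriver",  # driver class inside terajdbc4.jar
}

def write_to_teradata(df, table, mode="append"):
    """Write a Spark dataframe to a Teradata table over JDBC."""
    df.write.jdbc(url=jdbc_url, table=table, mode=mode, properties=connection_properties)
```

As with MySQL, pass terajdbc4.jar to Spark via `--jars` (or put it in Spark's jars directory) so the driver class is on the classpath.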
Write To Hive: This section explains (with examples) how to write data into a Hive table. If a Hive setup is present for Spark, the saveAsTable() function can be used to write data into a Hive table.
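A minimal sketch of the saveAsTable() approach, assuming a Hive-enabled Spark session already exists; the table name is a placeholder.

```python
def write_to_hive(df, table, mode="overwrite"):
    """Persist a Spark dataframe as a Hive table.

    saveAsTable() creates (or, with mode="overwrite", replaces) a managed
    table in the Hive metastore. The SparkSession must have Hive support
    enabled, e.g. SparkSession.builder.enableHiveSupport().getOrCreate().
    """
    df.write.mode(mode).saveAsTable(table)
```

Usage would be, for example, `write_to_hive(df, "default.employees", mode="append")` to append into an existing table in the default database.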