Databricks — read/write to a SQL table

Michal Molka
2 min read · Sep 16, 2022

It’s no surprise that a developer working with Databricks needs to reference external data sources, e.g. files stored in a data lake, SQL databases, and so forth. Today we will look at the second kind of source: a SQL database.

The first example: reading data from an Azure SQL Database.

Here is the SQL table that will be read from Databricks:

A link to a code repository is placed at the end of the article.

Here are connection variables:
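A minimal sketch of what those variables might look like. The server, database, user, and password values here are placeholders, not the article’s originals; in a real notebook the credentials should come from a Databricks secret scope rather than plain text.

```python
# Connection variables for an Azure SQL Database (illustrative values;
# the host, database, and credential names are assumptions).
jdbc_hostname = "myserver.database.windows.net"
jdbc_port = 1433
jdbc_database = "mydatabase"

# Standard SQL Server JDBC URL format.
jdbc_url = f"jdbc:sqlserver://{jdbc_hostname}:{jdbc_port};database={jdbc_database}"

connection_properties = {
    "user": "sql_user",          # placeholder
    "password": "sql_password",  # in practice: dbutils.secrets.get(...)
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}
```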

The following code creates an iowa_liquor_sales data frame using the JDBC connector, retrieving the first 5 rows from the SQL table.
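A sketch of that read, assuming the connection variables above and a table named `dbo.iowa_liquor_sales` (the table name is inferred from the data frame name in the article). The `query` option pushes the `TOP 5` down to SQL Server so only those rows cross the wire.

```python
# Read the first 5 rows of the SQL table into a Spark data frame via JDBC.
# `jdbc_url` and the credentials follow the earlier sketch (assumed values);
# `spark` is the session predefined in a Databricks notebook.
iowa_liquor_sales = (
    spark.read
    .format("jdbc")
    .option("url", jdbc_url)
    .option("query", "SELECT TOP 5 * FROM dbo.iowa_liquor_sales")
    .option("user", "sql_user")
    .option("password", "sql_password")
    .load()
)

iowa_liquor_sales.show()
```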

How do we write data to the SQL table? The process is pretty similar. Before writing, we will change the records a bit.

And there you have it: we can write the data frame back to the SQL table.
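A sketch of that write, again assuming the connection variables and table name from above. `mode("append")` adds the rows to the existing table; `overwrite` would replace it.

```python
# Write the modified data frame back to the SQL table over JDBC.
# `changed`, `jdbc_url`, and the credentials follow the earlier sketches
# (assumed names); the target table name is an assumption as well.
(
    changed.write
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.iowa_liquor_sales")
    .option("user", "sql_user")
    .option("password", "sql_password")
    .mode("append")
    .save()
)
```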

Here you can take a look at how to keep a SQL table as a Databricks entity: Databricks — reference a SQL table

Here is the source code:

spark_notebooks/sql-server-table.py at main · michalmolka/spark_notebooks (github.com)
