Introduction

In this tutorial, we will explore how to read a Delta table into a PySpark DataFrame.

Goal

A Delta table stored in a lakehouse should be read into a PySpark DataFrame.

Prerequisites

☑️ Notebook created

We have already created the notebook "dlnerds_notebook". If you want to know how to create a notebook, check out the following post:

How to create a Notebook in Microsoft Fabric: A Step-by-Step Guide

☑️ Lakehouse created

We have already created the lakehouse "dlnerds_lakehouse". If you want to know how to create a lakehouse, check out the following post:

How to create a Lakehouse in Microsoft Fabric: A Step-by-Step Guide

☑️ Lakehouse and Notebook connected

We have already established a connection between the notebook and the lakehouse. If you want to know how to add a lakehouse to a notebook, check out the following post:

How to connect a Lakehouse and a Notebook in Microsoft Fabric

Step 1: View Delta Table

First, let's have a look at the Delta table "framework" stored in the lakehouse "dlnerds_lakehouse".

Step 2: Open Notebook

Open the created notebook.
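Once the notebook is open with the lakehouse attached, you can confirm from code that the Delta table from Step 1 is visible. This is a minimal sketch, assuming the Fabric notebook's preconfigured `spark` session and "dlnerds_lakehouse" attached as the default lakehouse:

```python
# In a Microsoft Fabric notebook, a SparkSession named `spark` is preconfigured,
# and the tables of the attached default lakehouse are visible in its catalog.

# List the lakehouse tables to confirm that "framework" exists:
for table in spark.catalog.listTables():
    print(table.name, table.tableType)

# Show column names and data types of the Delta table "framework":
spark.sql("DESCRIBE TABLE framework").show()
```

`spark.catalog.listTables()` is plain Spark catalog API, so the same check works in any Spark environment, not just Fabric.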

Step 3: Read Delta Table into PySpark DataFrame

Let's read the Delta table "framework" into a PySpark DataFrame.
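A minimal sketch of the read, assuming the Fabric notebook's built-in `spark` session and "dlnerds_lakehouse" attached as the default lakehouse (so the table name resolves directly):

```python
# Read the Delta table "framework" from the attached lakehouse by name.
# `spark` is the SparkSession preconfigured in the Fabric notebook.
df = spark.read.table("framework")

# Alternatively, read the Delta files via the table's relative path
# (this path assumes the default managed-table layout of the lakehouse):
# df = spark.read.format("delta").load("Tables/framework")

df.show()         # display the first rows of the DataFrame
df.printSchema()  # inspect the schema taken from the Delta log
```

Both variants return a regular PySpark DataFrame; the name-based read goes through the lakehouse catalog, while the path-based read targets the Delta files directly.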
