HANDS-ON-LAB

Real Estate Ingestion and Dashboarding with Snowflake

Problem Statement

This hands-on Snowflake storage integration with Azure lab aims to build a continuous ingestion pipeline: real estate transaction data uploaded to Azure Blob Storage is loaded automatically into a Snowflake table through a Snowflake pipe, and the ingested data is then visualized on a Snowflake Dashboard.

The real estate transaction data (the JSON files) is placed in the Azure Blob Storage container:

azure://<storage_account>.blob.core.windows.net/<container_name>/

Tasks

  1. Create a container in Azure Blob Storage and configure an Azure Storage Queue to receive blob-created event notifications.

  2. Set up a Snowflake pipe to read data from the Azure Blob Storage location (a SQL sketch of this setup follows the task list).

  3. Create a new table in Snowflake to store the real estate transactions.

  4. Manually upload the JSON data files into the Azure Blob Storage location.

  5. Verify that each uploaded file triggers the creation of a new record in the Snowflake table.

  6. Create a Snowflake Dashboard and configure appropriate queries for the following graphs:

    1.  Graph showing the prices of the five houses.

    2.  Graph showing the distribution of house types (Residential / Condo).
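A minimal SQL sketch of the setup behind tasks 1-3 and 5 is shown below. It is illustrative rather than the lab's exact solution: all object names (azure_re_int, azure_re_notif, re_stage, real_estate_txns, re_pipe), the table schema, and the <tenant_id>, <storage_account>, <container_name>, and <queue_name> placeholders are assumptions to be replaced with your own values.

-- Storage integration so Snowflake can authenticate to Azure Blob Storage.
CREATE OR REPLACE STORAGE INTEGRATION azure_re_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'AZURE'
  ENABLED = TRUE
  AZURE_TENANT_ID = '<tenant_id>'
  STORAGE_ALLOWED_LOCATIONS = ('azure://<storage_account>.blob.core.windows.net/<container_name>/');

-- Notification integration backed by the Azure Storage Queue, so the pipe
-- is triggered whenever a new blob lands in the container.
CREATE OR REPLACE NOTIFICATION INTEGRATION azure_re_notif
  ENABLED = TRUE
  TYPE = QUEUE
  NOTIFICATION_PROVIDER = AZURE_STORAGE_QUEUE
  AZURE_TENANT_ID = '<tenant_id>'
  AZURE_STORAGE_QUEUE_PRIMARY_URI = 'https://<storage_account>.queue.core.windows.net/<queue_name>';

-- Note: DESC STORAGE INTEGRATION / DESC NOTIFICATION INTEGRATION return the
-- consent URLs an Azure admin must approve before the pipe can run.

-- External stage over the container, parsing uploaded files as JSON.
CREATE OR REPLACE STAGE re_stage
  URL = 'azure://<storage_account>.blob.core.windows.net/<container_name>/'
  STORAGE_INTEGRATION = azure_re_int
  FILE_FORMAT = (TYPE = 'JSON');

-- Target table for the real estate transactions (hypothetical schema).
CREATE OR REPLACE TABLE real_estate_txns (
  street    STRING,
  city      STRING,
  type      STRING,        -- e.g. 'Residential' or 'Condo'
  price     NUMBER(12, 2),
  sale_date DATE
);

-- Snowpipe that auto-loads each new JSON file from the stage.
CREATE OR REPLACE PIPE re_pipe
  AUTO_INGEST = TRUE
  INTEGRATION = 'AZURE_RE_NOTIF'
AS
  COPY INTO real_estate_txns
  FROM (
    SELECT $1:street::STRING,
           $1:city::STRING,
           $1:type::STRING,
           $1:price::NUMBER(12, 2),
           $1:sale_date::DATE
    FROM @re_stage
  );

-- After uploading a file (task 4), verify the ingestion (task 5).
SELECT SYSTEM$PIPE_STATUS('re_pipe');
SELECT COUNT(*) FROM real_estate_txns;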


Unlock the power of real-time data analysis with Snowflake and Azure Blob Storage. Start the lab now and become proficient in data ingestion and dashboarding.

Learnings

  • Understanding the process of ingesting data from Azure Blob Storage into Snowflake using Azure Queues and Snowflake pipes.

  • Creating and managing tables in Snowflake to store data.

  • Hands-on experience with Snowflake Dashboard for data visualization.

  • Writing queries to extract specific data for plotting graphs on the Dashboard (example queries are sketched below).
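For example, the two dashboard graphs in task 6 can be driven by queries like the following, again assuming the hypothetical real_estate_txns table from the setup sketch above:

-- Graph 1: prices of the five houses (one bar per street address).
SELECT street, price
FROM real_estate_txns
ORDER BY price DESC
LIMIT 5;

-- Graph 2: distribution of house types (Residential / Condo).
SELECT type, COUNT(*) AS num_houses
FROM real_estate_txns
GROUP BY type;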

FAQs

Q1. What is Snowflake?

Snowflake is a cloud-based data warehousing platform that allows for the efficient storage, processing, and analysis of large volumes of data. It provides scalability, elasticity, and ease of use for data engineers and analysts.


Q2. What are the key tasks in this lab exercise?

The lab involves setting up a container in Azure Blob Storage, configuring a Snowflake pipe to read data from that location, creating a Snowflake table to store the real estate transactions, manually uploading the JSON data files, verifying that the data is ingested into Snowflake, and building a Snowflake Dashboard for data visualization.


Q3. What will I learn from this exercise?

By completing this exercise, you will gain hands-on experience in ingesting data from Azure Blob Storage into Snowflake using Azure Queues and Snowflake pipes. You will also learn how to create tables in Snowflake, upload data files, and visualize data with a Snowflake Dashboard.