Building a Data Warehouse Dimensional Model using Azure Synapse Analytics SQL Serverless

Science & Technology

The Serverless SQL Pools service within Azure Synapse Analytics allows querying CSV, JSON and Parquet data in Azure Storage, Data Lake Gen1/2 and Cosmos DB. With this functionality we are able to create a Logical Data Warehouse over data stored in these systems without moving and loading the data. However, the source data may not be in the best possible format for analytical workloads...
In this session we'll be looking at using Azure Synapse Analytics SQL Serverless Pools to create a Data Warehouse using the Dimensional Modelling technique to create a set of Dimensions and Facts and store this data in a more appropriate structure and file format.
All data will be stored in an Azure Data Lake Gen2 account with processing and serving performed by the SQL Serverless Pools engine.
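As a rough illustration of the pattern the session describes (an editorial sketch, not from the video itself: the data source, file format, folder paths, and column names are invented for the example), a CETAS statement in Serverless SQL Pools can read the source CSV files and write the result back to the data lake as Parquet:

```sql
-- Hypothetical sketch: assumes an external data source (DataLakeGen2) and an
-- external file format (ParquetFormat) have already been created in the database.
-- CREATE EXTERNAL TABLE AS SELECT (CETAS) reads the raw CSV files and
-- materialises the result as Parquet files in the data lake.
CREATE EXTERNAL TABLE dbo.DimCustomer
WITH (
    LOCATION    = 'conformed/dimcustomer/',   -- output folder in the data lake
    DATA_SOURCE = DataLakeGen2,
    FILE_FORMAT = ParquetFormat
)
AS
SELECT CustomerID,
       CustomerName,
       Country
FROM OPENROWSET(
        BULK 'raw/customers/*.csv',
        DATA_SOURCE    = 'DataLakeGen2',
        FORMAT         = 'CSV',
        PARSER_VERSION = '2.0',
        HEADER_ROW     = TRUE
     ) AS src;
```

The dimension is then queryable like any table, while the underlying data stays in the lake as Parquet.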

Comments: 25

  • @BaGua79 · 1 year ago

    13 min in, and it's already obvious this is a fantastic video. Thanks for doing this!

  • @CloudLunchLearn · 1 year ago

    Thank you @BaGua79. We are happy you enjoyed it. Please help us promote it by sharing it with your colleagues and friends on your social accounts. Have a great day.

  • @MrArsalan1988 · 2 years ago

    Very informative session, sir.

  • @CloudLunchLearn · 1 year ago

    Thank you @Arsalan. We are happy you enjoyed it. Please help us promote it by sharing it with your colleagues and friends on your social accounts. Have a great day.

  • @user-du5rg4em6i · 9 months ago

    I enjoyed watching this video, but please give us 1080p or 4K going forward. THANKS

  • @kenpoken1 · 2 years ago

    Great video, thanks for sharing.

  • @CloudLunchLearn · 1 year ago

    Thank you @Ken. We are happy you enjoyed it. Please help us promote it by sharing it with your colleagues and friends on your social accounts. Have a great day.

  • @artus198 · 8 months ago

    How do you take care of SCD 1, SCD 2, etc.? Coming from a traditional SQL Server background, this all sounds messed up to me. They talk about external tables, then views, then CSV files 😯 ?? What is going on?

  • @smw999 · 7 months ago

    Very helpful and clear. Thanks for sharing this.

  • @AHMEDALDAFAAE1 · 1 month ago

    Thank you for this amazing video!!

  • @CloudLunchLearn · 1 month ago

    Thank you for your support, Ahmed. We appreciate it. If you have a few minutes, please share the session on your socials so your friends and contacts can watch it as well. And you can tag the Cloud Lunch and Learn group. Have a great day!

  • @AHMEDALDAFAAE1 · 1 month ago

    @CloudLunchLearn I did already. Thank you 😊

  • @geehaf · 2 years ago

    Great session - really like your "show it for real" demo style. So the use of external tables in this session is primarily to easily transform CSV to Parquet format? Also, for adding new partitions to the view, it relies on the Parquet file remaining after a Drop External table has been issued?

  • @CloudLunchLearn · 1 year ago

    Thank you @George. We are happy you enjoyed it. Please help us promote it by sharing it with your colleagues and friends on your social accounts. Have a great day.

  • @joshuaandresblancojerez6455 · 2 years ago

    Nice video, a shame about the resolution.

  • @CloudLunchLearn · 1 year ago

    Thank you @Joshua. We are happy you enjoyed it. Please help us promote it by sharing it with your colleagues and friends on your social accounts. Have a great day.

  • @RodrigoBocanegraCruz · 2 years ago

    Great, thanks for sharing! I was trying to implement a Data Vault logical DWH but there is no HASHBYTES :| I hope this feature will be supported in the future.

  • @DatahaiBI · 2 years ago

    Hi, the HASHBYTES function is now supported in Serverless SQL Pools.
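For context (an editorial sketch, not from the video: the table and column names are invented), HASHBYTES can be used in Serverless SQL Pools to generate the Data Vault style hash keys the commenter was after:

```sql
-- Editorial sketch: derive a hash key from the business key columns.
-- CONCAT_WS joins the columns with a delimiter so 'ab'+'c' and 'a'+'bc'
-- do not collide; CONVERT style 2 renders the hash as a hex string.
SELECT CustomerID,
       CONVERT(VARCHAR(64),
               HASHBYTES('SHA2_256', CONCAT_WS('||', CustomerID, CustomerName)),
               2) AS CustomerHashKey
FROM dbo.DimCustomer;
```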

  • @CloudLunchLearn · 1 year ago

    Thank you @Rodrigo. We are happy you enjoyed it. Please help us promote it by sharing it with your colleagues and friends on your social accounts. Have a great day.

  • @valentinloghin4004 · 10 months ago

    Super nice!! Is there any way to get the scripts and source files? Thank you!!

  • @nandarajm880 · 19 days ago

    Can you please share the CSV files?

  • @Rothbardo · 2 years ago

    Would looping through a series of dates be synchronous? Is there a way to do that asynchronously?

  • @DatahaiBI · 2 years ago

    This could be done using Pipelines/Data Factory to iterate and trigger the stored procedure asynchronously. The external table name would need to be unique (dynamic SQL to generate the table creation syntax).
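The dynamic SQL approach mentioned in that reply might look roughly like this (a hypothetical sketch: the view, table, data source, and file format names are invented, and it assumes the same pre-created external data source and file format as elsewhere in the session):

```sql
-- Hypothetical sketch: build a unique external table name per run date,
-- so parallel pipeline iterations do not collide on the same table.
DECLARE @RunDate VARCHAR(8) = '20240101';   -- would be passed in by the pipeline
DECLARE @Sql NVARCHAR(MAX) =
    N'CREATE EXTERNAL TABLE dbo.FactSales_' + @RunDate + N'
      WITH (
          LOCATION    = ''facts/sales/' + @RunDate + N'/'',
          DATA_SOURCE = DataLakeGen2,
          FILE_FORMAT = ParquetFormat
      )
      AS SELECT * FROM dbo.vw_SalesStaging
         WHERE SaleDate = ''' + @RunDate + N''';';

EXEC sp_executesql @Sql;   -- each invocation creates a distinct external table
```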

  • @sanishthomas2858 · 9 months ago

    I'm surprised I can't see anything.

  • @Mac-vn5rf · 2 years ago

    Your intro is too long.
