Landing data with Dataflows Gen2 in Microsoft Fabric

Science and technology

Pipelines are cool in Microsoft Fabric, but how could we use Dataflows to get data into our Data Warehouse? Patrick shows another way to move your data with just a few clicks!
📢 Become a member: guyinacu.be/membership
*******************
Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.
🎓 Guy in a Cube courses: guyinacu.be/courses
*******************
LET'S CONNECT!
*******************
-- / guyinacube
-- / awsaxton
-- / patrickdba
-- / guyinacube
-- / guyinacube
-- guyinacube.com
**Gear**
🛠 Check out my Tools page - guyinacube.com/tools/
#MicrosoftFabric #Dataflows #GuyInACube

Comments: 32

  • @TimothyGraupmann
    9 months ago

    Merge needs a nice search feature so you can quickly filter the columns to find the field you need.

  • @user-og4uv2tm5u
    6 months ago

    Great video. 04:20 Pipelines also have an option to append or replace data at the destination.

  • @alexanderbarclay6286
    9 months ago

    Q: wish y’all would start declaring in the description what licensing/capacity is required to replicate whatever it is you’re doing in your videos. Often, I’ll see one of your tutorials, get excited about implementing a version into our own workspace, and then realize it requires Premium capacity, PPU, or some other licensing we don’t have (at least not at the moment).

  • @betrayedslinky
    9 months ago

    Q: Great video, and now I see there’s a general data warehouse option, but there’s also a KQL warehouse and a lakehouse with a SQL endpoint. What makes sense to architect beyond the bronze layer, where a lakehouse intuitively makes sense? What are the advantages/disadvantages of each if a team handles a medallion structure end to end?

  • @youssefmejri4728
    18 days ago

    Hi Patrick @guyinacube, can you please make a video about streaming dataflows and streaming datasets in Microsoft Fabric?

  • @user-ll7gi5qu9z
    9 months ago

    Before running a downstream ETL, I want to make sure the upstream ETL tables have complete information for the previous day/hour. How do I create dependencies in a pipeline so that I am not running ETL on the same old data? Thanks!

  • @MrSparkefrostie
    9 months ago

    I have mentioned that Dataflows (Gen1) do not seem to be able to link to another query with load enabled (I did drop the info using the contact form on your website). Visually, looking at Gen2 dataflows, it seems Gen2 can do this. Edit: a Power BI dataset would be an awesome destination, though I see it doesn’t do incremental refresh, so it seems Gen1 still has a use case.

  • @noahhadro8213
    9 months ago

    Can you trigger a pipeline with an endpoint URL call, like you can with Power Automate? There, an HTTP request trigger gives you a URL to call to start the flow.
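
    For illustration, a rough sketch of what that could look like for a Fabric pipeline. It assumes the Fabric REST API’s “run on demand item job” endpoint and a valid Azure AD token; the IDs and token below are placeholders, so verify the exact route against the current Fabric REST docs.

        # Sketch: kick off a Fabric pipeline on demand over REST.
        # Assumes the "run on demand item job" endpoint; IDs/token are placeholders.
        import requests

        WORKSPACE_ID = "<workspace-guid>"
        PIPELINE_ID = "<pipeline-item-guid>"
        TOKEN = "<aad-access-token>"  # e.g. acquired via MSAL or azure-identity

        url = (
            f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
            f"/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline"
        )

        resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
        resp.raise_for_status()
        # A 202 means the run was accepted; the Location header points at the job instance.
        print(resp.status_code, resp.headers.get("Location"))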

  • @DavidZebrowitz
    9 months ago

    I see that Dataflows Gen2 has the ability to mark columns as a "key" and was hopeful that we would be able to do an upsert type of operation when writing to a target. For example, this could be a huge help when building a dimension, adding an "index" column as a surrogate key, and then doing an upsert into the target. Instead it currently just appends all records to the existing records or wipes and replaces.

  • @pratik2998
    8 months ago

    Exactly. I was curious about updates (upserts)
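
    For what it’s worth, a minimal sketch of one common workaround: land the data with the dataflow, then upsert into the dimension from a Fabric notebook using Delta Lake’s MERGE. This only applies to lakehouse tables, and the table and column names below are made up for illustration.

        # Hypothetical upsert workaround in a notebook, since the dataflow
        # destination only appends or wipes-and-replaces. `spark` is the
        # session a Fabric notebook provides; names are illustrative.
        from delta.tables import DeltaTable

        staged = spark.read.table("staging_customers")         # rows landed by the dataflow
        target = DeltaTable.forName(spark, "dim_customers")    # existing dimension table

        (target.alias("t")
            .merge(staged.alias("s"), "t.CustomerKey = s.CustomerKey")
            .whenMatchedUpdateAll()
            .whenNotMatchedInsertAll()
            .execute())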

  • @ghamper1
    9 months ago

    Hi Patrick! Is Analysis Services supported with these? I think it was deprecated recently in the last Dataflows…

  • @EmmanuelAguilar
    9 months ago

    What about upsert, or merging the information? Is it a good idea to delete and reload all the information every time?

  • @sergzador
    9 months ago

    Q: Will it have the same risk of referenced queries running multiple times?

  • @Sevententh
    9 months ago

    Hi, how can I pass a parameter/variable to a dataflow from the pipeline?

  • @quiosaevaristo7746
    5 months ago

    Thanks for the beautiful content. I saw that whenever you load data to the data warehouse you delete the data already in the DW. How is incremental loading done in Fabric?

  • @melikagerami
    17 days ago

    That's also something I'm searching for. An incremental refresh schedule is not available for Dataflows Gen2, and I'm looking for a way to incrementally refresh the tables in the warehouse.
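
    One pattern that can stand in for incremental refresh is a watermark load from a notebook: only rows newer than the latest timestamp already in the target get appended. A minimal sketch, with purely illustrative table and column names:

        # Hypothetical watermark-based incremental load; `spark` is the
        # Fabric notebook session, table/column names are placeholders.
        from pyspark.sql import functions as F

        target = "sales_orders"

        # Newest timestamp already loaded (the watermark).
        watermark = (spark.read.table(target)
                          .agg(F.max("ModifiedDate").alias("wm"))
                          .collect()[0]["wm"])

        # Keep only source rows newer than the watermark.
        source = spark.read.table("staging_sales_orders")
        new_rows = source if watermark is None else source.filter(F.col("ModifiedDate") > F.lit(watermark))

        # Append just the new rows instead of wiping and reloading everything.
        new_rows.write.mode("append").saveAsTable(target)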

  • @prakash8522
    8 months ago

    This is the same as a datamart. What is the difference between Gen2 and a datamart?

  • @derekwilliams4557
    9 months ago

    Can Dataflows Gen2 handle high-volume data like Spark? Or for large data, should I continue to use Spark?

  • @paulmaksimovic9235
    9 months ago

    I wouldn't recommend it - they are painfully slow

  • @joseangelmartinez308
    2 months ago

    What happens to all the pipelines and dataflows I already have in Azure Data Factory? Can they be migrated?

  • @anjalisingh1588
    7 months ago

    Can we set one data destination for all tables that we load in Dataflows Gen2? Is it possible? Please help.

  • @cargouvu
    2 months ago

    How much does it cost on the Fabric trial? I am scared that I will run up the costs.

  • @vicentiustefan4057
    9 months ago

    With this new capability of Fabric, can we make a JOIN between two Power BI cubes? A JOIN with a cardinality type, not like a merge: a JOIN of a PK with an FK.

  • @Sandroider
    9 months ago

    After 1:33 I had to click View > Diagram view to get the same view as Patrick.

  • @franciscoclaudio4818
    4 months ago

    Gen2 dataflows tend to be slower. Why?

  • @paulmaksimovic9235
    9 months ago

    You can use these for transformations but NEVER to move data - they are painfully slow. Use a copy activity in a pipeline or a notebook to load the data. You can do transformations after the data has loaded. In fairness, if you use a notebook to load the data, you might as well use it to transform the data too.
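
    As a rough illustration of that "load first, transform after" pattern in a notebook (paths and table names are placeholders, not from the video):

        # Land the raw files as-is, then transform in a second step.
        # `spark` is the Fabric notebook session; names are illustrative.
        from pyspark.sql import functions as F

        raw = (spark.read
                    .format("csv")
                    .option("header", "true")
                    .load("Files/landing/orders/*.csv"))   # placeholder lakehouse path

        # Land the raw data first...
        raw.write.mode("overwrite").saveAsTable("bronze_orders")

        # ...then do the transformations in the same notebook.
        (spark.read.table("bronze_orders")
              .withColumn("OrderDate", F.to_date("OrderDate"))
              .filter(F.col("Status") != "Cancelled")
              .write.mode("overwrite")
              .saveAsTable("silver_orders"))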

  • @Fernando_Calero
    9 months ago

    Great video Patrick, thank you very much! (Although it took you 13 days to publish it.)

  • @jonathanfernandezferrer3470
    3 months ago

    Hi guys, sorry, I have a question from the company I work for. We are trying to create a dataflow in a workspace by connecting to a table of a dataflow in another workspace. We can connect to the table, but the new dataflow can only replicate that table: it cannot be saved after applying even a simple filter or transformation to the original table (Power BI shows a message referencing “linked tables”). Thank you.

  • @danb6908
    9 months ago

    Looking forward to the day when I can try this with on-premises SQL Server (i.e. "my own") data. For now, all I get is: "There was a problem refreshing the dataflow. Please try again later."

  • @jason.campbell474
    9 months ago

    You should be able to do this if your gateway version is >= July 2023. I can load a dataflow, but it will not write to a Lakehouse or Warehouse. Checking gateway firewall settings to see if that's where the problem is.

  • @quantum_field
    9 months ago

    @jason.campbell474 I am seeing the same behaviour, let me know what you find.

  • @IrishJimmyPA
    9 months ago

    Q: Is any of this scriptable? Clicking tabs, gears, checkboxes, etc. has to be the slowest way to develop and maintain data pipelines.
