Advancing Fabric - Orchestration with Data Factory
The Data Factory experience in Microsoft Fabric has two main parts: Data Factory pipelines, which will be familiar to Azure Data Factory and Synapse users alike, and Dataflows, which fans of Power BI Dataflows will recognise immediately. But where do you start?
In this video, Simon and our Chief Fabricator Craig dive into Data Pipelines within the Data Factory experience inside Microsoft Fabric. They take a quick look at dataflows and where they are used, before talking orchestration with Data Factory.
If you're looking to get started on your Microsoft Fabric journey, give Advancing Analytics a call to arrange a POC today!
Comments: 8
How does one run these copies in parallel rather than sequentially? The only way I can think of is to orchestrate them externally.
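One way, assuming Fabric pipelines keep Azure Data Factory's activity model (they largely do): wrap the Copy activity in a ForEach with `isSequential` set to `false`. The pipeline name, parameter name, and source/sink types below are illustrative placeholders, not a tested pipeline. A minimal sketch:

```json
{
  "name": "CopyTablesInParallel",
  "properties": {
    "parameters": { "tableList": { "type": "Array" } },
    "activities": [
      {
        "name": "ForEachTable",
        "type": "ForEach",
        "typeProperties": {
          "items": {
            "value": "@pipeline().parameters.tableList",
            "type": "Expression"
          },
          "isSequential": false,
          "batchCount": 10,
          "activities": [
            {
              "name": "CopyOneTable",
              "type": "Copy",
              "typeProperties": {
                "source": { "type": "LakehouseTableSource" },
                "sink": { "type": "LakehouseTableSink" }
              }
            }
          ]
        }
      }
    ]
  }
}
```

With `isSequential` false, iterations run concurrently and `batchCount` caps how many run at once (in ADF the default is 20 and the maximum is 50; Fabric's limits may differ).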
Love your Fabric videos, we need more videos like this 👏
In Pipelines we can't invoke HDInsight or Databricks jobs as we do in ADF. It looks like they want to ensure that the only kind of 'custom activity' we can call is Fabric's Spark notebook, or maybe Azure Functions/Batch. Even the KQL/Data Explorer activity is missing. From a UI perspective, I can't arrange my notebooks into folders (the way Databricks allows), and that's a big miss and a poor UI experience.
It would be interesting to know the upper limits for processing daily changing data on high-volume tables.
Is it possible to copy files from SharePoint to Blob Storage using ADF?
There's still a lot of key functionality missing before we can seriously consider transitioning from our current Azure implementations with Synapse and Data Factory. Definitely a good time to start migrating purely Power BI-based reporting efforts, though.
@AdvancingAnalytics
11 months ago
Yep. It's an "if everything you're doing resides in Fabric, you can probably make it work" scenario currently. If you need outside integrations and orchestration, that's not baked in yet.
@NeumsFor9
11 months ago
You know Microsoft... wait for the 2nd or 3rd release while heroes like Simon sweat it out and fill in the gaps.