I help beginner and experienced data professionals learn Microsoft Fabric.
How?
✅I publish videos here on KZread,
✅ I manage the amazing, free, online Fabric community here: skool.com/microsoft-fabric
What do I cover?
My focus for 2024 is to build a collection of lessons that help you build a really strong foundation in concepts that are important in the Microsoft Fabric world: so data [warehousing, pipelines, engineering and science] - and to have fun doing it!
Who am I?
I'm Will 👋 I've been working in data since 2016 when I built my first Power BI dashboard.
Since then, I've worked in data science, real-time analytics, data engineering and solution architecture. All within the Microsoft stack.
I'm passionate about the transformative impact Microsoft Fabric can have on your business, and your career.
In 2024, I quit my consulting job to focus 100% on learning and teaching Microsoft Fabric, both here on KZread, and in my Skool community (link below).
Comments
I have one doubt: in Fabric, data pipelines and dataflows appear in both Data Factory and Data Engineering, and they look the same. What is the difference between them, and why are they present in both experiences? Can anyone explain?
Yes they will be the same. Some items appear in more than one 'experience' because they are useful to different personas (also the notebook is available in Data Engineering and Data Science experiences)
@@LearnMicrosoftFabric thanks man
👍
Good video and a good summary.🔥
Thanks for the amazing series, Will!! It has really helped me solidify my Fabric knowledge. I'm sure it's the same for many others. Let's hope it's enough to pass the exam. Thanks again and all the best!
Will, your content and its quality always leaves me speechless! Thank you so much!
amazing content! Love it!
Hey will, I have an azure student subscription credits to my personal account with around 40$ left in it. Now when I create an EntraID for fabric within my personal account, Do I need to have a subscription for newly created EntraID account to create Fabric Capacity in Azure Portal? Or is there a way to use from those left credits? Thanks in Advance. Great content by the way.
Not 100% sure on how the student subscription works to be honest, but I would guess it still works..? Give it a go and find out!
3:21:54 Data aggregation means collecting data and presenting it in a summarized form to facilitate statistical analysis. There are several types of it, like roll-up, drill-down, pivot, and slice & dice.
Thanks for your comment. Yes, my point was that the term 'data aggregation' is ambiguous: it can mean many different things in analytics engineering, and it isn't clear exactly what Microsoft expects, especially when it's combined with 'disaggregation'. But yes, I believe they are looking for a Power BI view on data aggregation 👍
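To make the aggregation types from the comment above concrete, here is a small pandas sketch using hypothetical sales data (the DataFrame and column names are made up for illustration):

```python
import pandas as pd

# Hypothetical sales data to illustrate the aggregation types mentioned above
df = pd.DataFrame({
    "region":  ["North", "North", "South", "South"],
    "product": ["A", "B", "A", "B"],
    "sales":   [100, 150, 200, 50],
})

# Roll-up: summarize from (region, product) detail up to region level
rollup = df.groupby("region")["sales"].sum()

# Pivot: rotate the product dimension into columns for a cross-tab view
pivot = df.pivot_table(index="region", columns="product",
                       values="sales", aggfunc="sum")

# Slice: restrict the data to a single value of one dimension
north_slice = df[df["region"] == "North"]

print(rollup["North"])  # 250
```

Drill-down is just the reverse direction of the roll-up: going from the `region` summary back to the `(region, product)` detail rows.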
Is there any option that we can take exam with free voucher?
participate in microsoft challenges
I don't think you will get 100% off; that was only available around April. Check out the Microsoft Challenges for 50% off though. The most recent one was the AI Skills Challenge (50% off), but I'm not sure if that's still available.
@@BasitAIi may I get more details on it please?
Hi Will, first I really would like to thank you for making these videos available for free. At around 25 min you start to show how Fabric works with Azure DevOps. It's a little bit confusing because the way you configured your environment isn't the best way. When you have main branch protection enabled, you are supposed to have one workspace for the main branch, and each developer is supposed to have their own workspace linked to their specific feature branch. Once a developer finishes their implementation, they go to Azure DevOps and create a new pull request. If you don't have main branch protection enabled, you can keep just one workspace where every developer works.
Hi Hugo, thanks for the comment - indeed there are many ways to configure Azure DevOps and version control within Fabric. The purpose of the video (in the context of studying for the DP-600) was just to walk through the concepts of Git, version control, branching etc. In the future, I will definitely go into more detail about different architectures and approaches and the pros/ cons of each 👍
❤🔥❤🔥❤🔥
Sir, how can we build a JDBC/pyodbc connection between a Fabric Data Warehouse and a Fabric notebook? I have been searching for a long time, but without success.
Check the Fabric DWH documentation - there is a page which mentions JDBC connectivity
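As a rough sketch of the pyodbc route (not the exact documented recipe - check the Fabric docs the reply points to): the warehouse's SQL endpoint speaks TDS, so from a Fabric notebook you can pass an Entra ID access token to pyodbc via the `SQL_COPT_SS_ACCESS_TOKEN` connection attribute. The endpoint and table names below are placeholders, and the `notebookutils` token call assumes you're running inside a Fabric notebook session:

```python
import struct

def build_token_struct(token_str: str) -> bytes:
    """Pack an Entra ID access token in the length-prefixed UTF-16-LE
    format that pyodbc expects for SQL_COPT_SS_ACCESS_TOKEN."""
    token = token_str.encode("utf-16-le")
    return struct.pack(f"<I{len(token)}s", len(token), token)

def connect_to_warehouse(server: str, database: str):
    # These imports only resolve inside a Fabric notebook session
    import pyodbc
    from notebookutils import mssparkutils
    token = mssparkutils.credentials.getToken(
        "https://analysis.windows.net/powerbi/api"
    )
    return pyodbc.connect(
        f"Driver={{ODBC Driver 18 for SQL Server}};"
        f"Server={server};Database={database};Encrypt=yes;",
        attrs_before={1256: build_token_struct(token)},  # 1256 = SQL_COPT_SS_ACCESS_TOKEN
    )

# Usage inside a Fabric notebook (endpoint/database/table names are placeholders):
# conn = connect_to_warehouse(
#     "yourendpoint.datawarehouse.fabric.microsoft.com", "MyWarehouse")
# rows = conn.cursor().execute("SELECT TOP 5 * FROM dbo.MyTable").fetchall()
```

The same SQL endpoint should also accept JDBC connections with an Entra token, which is what the documentation page mentioned above covers.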
After signing up for the portal, it asks me to upload a picture. After uploading, it keeps on loading and never goes to the next step.
I believe you have now entered, correct?
@@LearnMicrosoftFabric yes I do thanks
Hi, I'm eager to begin. There is a 3.5-hour video here, but also a 9-video series from a couple of months before that seems to cover the same chapters. Which is best for a Power BI-but-no-Fabric person like me? And is there any point in watching both?
This video combines the whole series into one video 👍
Hello, and thank you for your efforts. I want to ask: is this full course the same as the previous series of 12 vids, combined into one video?
Yes, it combines all previous videos into one 👍
👏
Will, thanks for this wonderful learning material. Can you bring us some real-life end-to-end projects with Fabric? That would help us gain a more practical understanding.
Absolutely, yes I'm planning more hands-on learning in the future 🙌
Thanks! I've passed the exam watching your tutorials.
It's a great achievement - congratulations!! And thank you so much 🙌🙌🙌
thanks mate, well explained.
Can i use Power Automate to run a pipeline as I have done on adf before?
I haven't done this personally, but I have seen others talk about it, so I think it's possible!
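One way this could work (a sketch, not a confirmed recipe): a Power Automate HTTP action can call the Fabric REST API's on-demand job endpoint for a pipeline item. The helper below builds and calls that endpoint; the workspace/item IDs and the token acquisition are placeholders you would supply yourself:

```python
def run_pipeline_url(workspace_id: str, item_id: str) -> str:
    """Build the Fabric REST API endpoint for triggering a pipeline
    run on demand - the same URL a Power Automate HTTP action would call."""
    return (
        "https://api.fabric.microsoft.com/v1/"
        f"workspaces/{workspace_id}/items/{item_id}/jobInstances?jobType=Pipeline"
    )

def trigger_pipeline(workspace_id: str, item_id: str, access_token: str):
    import requests  # third-party; available wherever you run this
    resp = requests.post(
        run_pipeline_url(workspace_id, item_id),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    resp.raise_for_status()
    # A successful call returns 202 Accepted; the Location header
    # points at the created job instance so you can poll its status.
    return resp

# Usage (IDs and token are placeholders):
# trigger_pipeline("<workspace-guid>", "<pipeline-item-guid>", token)
```

In Power Automate itself you'd put the same URL and bearer token into an HTTP action rather than running Python.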
Hey Will. Longtime viewer and love your stuff. I'm sure you've got a lot of great stuff in the works keeping you busy, but I had something I wanted to run by you. I've been a Power BI dev for 3 years. Like a lot of devs, I came from the business side (finance), so I don't really have a foundational background in things like CS or data engineering. However, over the past year or so I've been delving more into the data engineering realm, trying to add that to my tool chest. And now with Fabric, it seems like the perfect opportunity. I've passed the DP-600, but I know that I need to get a lot more practice with Fabric in order to be truly competent in it. So I've been working on a side project that incorporates a lot of the data engineering aspects (data pipelines, PySpark notebooks, task flows, DevOps integration, semantic link, etc.) as well as a Power BI report that will use Direct Lake. It's already taught me a lot, but I know that even a lot of the stuff I've gotten to work is sub-optimal at best, and there are a lot of best practices and knowledge I'm missing. I wanted to put a video series out about what I've put together so that more experienced data folks could critique what I've done and give advice on what could be improved. I'm not trying to make a YT channel out of it or anything like that; I want to put them out there so I can learn from my mistakes and become more of an all-around BI engineer. And not just for my benefit: I know there are a lot of Power BI devs in the same situation who could also benefit from this. My thought was that it would be neat if you reviewed these on your channel and tore them apart with your knowledge of how the whole set-up/process could be better; that could benefit a lot of your viewers who probably have the same misunderstandings and knowledge gaps that I do. Let me know if this might be something you would be interested in. At the moment, my goal is to start making the videos a month from now.
I recently passed my exam, and I can confidently say that this course surpasses all other paid courses available. It's the only course I used, along with the practice tests from SkillsCertPro. I diligently practiced all the sets and thoroughly reviewed the explanations to grasp the concepts. About 80% of those practice test questions were mirrored in my main exam, which really made it easy.
Amazing Video. Thank you Will☺
❤🔥Great intro! Thank you!
🔥Best Fabric intro video
Hi, great content! Do you know how similar the practice exam is to the real exam? As in are they very watered down or quite similar to what we should expect?
The content is fairly similar, but normally the exam questions are longer/ with more context. I definitely recommend going through the Microsoft Practice Assessment DP-600 a few times to understand the types of questions you can expect 👍
🔥
Can one create a semantic model based on tables from external sources, such as an on-prem database, Azure SQL, or Synapse workspaces?
Great video
🔥 Thanks for this- very helpful!
🔥
Thank you Will for this amazing course. I passed my DP 600 today with a score of 826. Your videos are my main preparation material. I have shared your videos to everyone I know, preparing for the exam.
That's amazing - well done!! Thank you for sharing the videos also 🙂
Congrats on passing the exam. Is this course alone enough to pass the exam?
@@BasitAIi Some people have, yes, but I always recommend using a variety of materials to prepare, all of which I link to throughout the study notes in Skool
If I failed an exam, can I retake it? Will there be extra charges?
See here for retake policy: learn.microsoft.com/en-us/credentials/support/retake-policy#microsoft-certification-exam-retake-policy-for-role-based-specialty-and-fundamentals-exams
Thank you for all the effort put into creating this course. It is truly incredible. The material is presented very clearly and is straight to the point, making it easy to understand and follow. Also, having the main-character scenario makes it very engaging; Camilla's problems helped a lot in understanding Fabric :)
Amazing course! Thanks a lot for sharing it! ❤
Thanks for watching!
I have a Copy Data activity in the development workspace. It pulls data from my Azure Blob and then copies the exact file into my Lakehouse. In Azure Blob, I have three containers (wf-dev, wf-stage, wf-prod), and in the three workspaces I have assigned different containers as the source of the Copy Data activity. Now, as soon as I create a deployment pipeline and try to deploy dev to stage to prod, all of my Copy Data activity sources are overwritten by the dev container. I know there is a selective deployment option, but isn't there a way to provide a parameter dynamically? I tried copying the files from Azure Blob to the Lakehouse using just a notebook, but the notebooks also get overwritten. I tried making it dynamic by passing parameters from the pipeline, but those parameters are also overwritten too.
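One workaround worth sketching for the question above: instead of hard-coding the container in each workspace's items, derive the source path from a single runtime parameter (e.g. a notebook parameter cell, or a value looked up from the workspace name), so the deployed copy behaves correctly even when the item itself is overwritten. The stage-to-container mapping and storage account name below are hypothetical:

```python
# Hypothetical mapping from deployment stage to Blob container, so one
# parameterized notebook works unchanged across dev/stage/prod workspaces
CONTAINERS = {
    "dev":   "wf-dev",
    "stage": "wf-stage",
    "prod":  "wf-prod",
}

def source_path(environment: str, storage_account: str = "mystorageacct") -> str:
    """Build the abfss:// source path for the given stage; fail loudly
    on unknown stage names instead of silently reading the wrong data."""
    try:
        container = CONTAINERS[environment]
    except KeyError:
        raise ValueError(f"Unknown environment: {environment!r}")
    return f"abfss://{container}@{storage_account}.dfs.core.windows.net/"

# In the notebook, only the 'environment' parameter differs per workspace:
print(source_path("prod"))  # abfss://wf-prod@mystorageacct.dfs.core.windows.net/
```

The remaining piece is supplying `environment` per workspace; a deployment rule or a pipeline parameter set at the trigger level are the usual candidates, though (as the question notes) support for overriding these per stage has been a moving target in Fabric.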
thank you so much for your support!
Thanks for watching!!
Great video and extremely informative. I just have two questions regarding this content: 1. Is it possible to create stored procedures and functions in the SQL endpoint in a lakehouse? In your example, you use a data warehouse but could the same work be done in the SQL endpoint? 2. Regarding the scheduling of data pipelines, there is a new action called "Trigger" in preview. Would that be something one can get tested on since you didn't mention it in the video?
1. Yes (but only for select statements/ read-only) 2. Highly unlikely
Thank you. I love the way you explained each concept. ⏳
Hey Will, I'd like to thank you for making all of your videos freely available on KZread. I'm preparing for the exam, and your videos help me a lot to focus only on the exam content. Next week I'll take the test, and I'll let you know if I passed. Best regards
Greetings from Guatemala, Central America. Thanks, I'm learning a lot with your videos ⌛
Thank you very much Will. Following this series harmonized my learnings altogether. Thank you very much for your time
🔥 i like how it's so easy to understand! great job!
Great course so far! I have some doubts regarding the practice questions:

1) 01:09:00 - I tested this case by creating a separate workspace as a Fabric admin. Inside the workspace I added a Lakehouse 1 with sample Covid data loaded, and a Notebook 1 that simply imports the table as a dataframe, adds a column, and overwrites the original table. Then I shared the items with another user who has no access to the workspace: I gave him ReadAll for Lakehouse 1 and Edit permission for Notebook 1. There's a message that you have to grant the Run permission to a user with Edit permission, but it was still possible to grant the Edit permission only. Not surprisingly, this user was able to open Notebook 1 but couldn't run it; granting the Run permission fixed this, so I think it should be mentioned in the answer to the question as well. But there's more: I thought that ReadAll grants READ permission to Spark data only, but apparently this user was able to run the notebook from start to end, overwriting the table (or even creating a new one after code updates) without the Write permission on the lakehouse. Is that the intended behaviour? What's more, the user could even create shortcuts and delete tables from the Lakehouse explorer, so what's the point of the Write permission granted with the Contributor role? Funny enough, the user couldn't see the contents of the lakehouse from the notebook level (with a message that permission was denied), but could see everything from the lakehouse explorer.

2) 2:18:00 - I also tested this, and in my case I needed Contributor access to both workspaces. With Viewer in Workspace B I was able to see the lakehouse but not the tables, so I was unable to create a shortcut.

I'd be grateful for help!
Hey, thanks for your comment! I will go through them and respond shortly 👍
Very helpful video! especially loved the easy manner in which you explained the differences between the different ETL/ELT methods. Do you have a video in which you go over how to implement file partitioning?
Will's material is the best material I have found so far. Thanks for all the effort.
In this video: kzread.info/dash/bejne/pHyV05uyo6nWnqg.html
❤🔥
Hi Will, I have some complex scalar user-defined functions defined in MySQL, and I have to migrate them to Fabric, but as of now Fabric doesn't support creating scalar user-defined functions in the warehouse. In this scenario, please let me know what alternative options I can use. Thanks
Hi, how to unlock Skool community ?
Create an account and join the community 👍
I did, but it's still locked :( Any advice here? I cannot upgrade to level 1.
Thanks!
Thanks a lot for your support 😃🙏