The Hybrid Virtual Group covers Cloud and On-premises technology stacks including hybrid deployments with a focus on the Microsoft Cloud and Data Platform. We host bi-weekly webinars with industry experts and seek to grow your skills in these areas for free!
Comments
Thank you for putting this together, it is informative and useful.
Awesome tutorial! I got my OAuth2 flow working as well! The only question I have is: what sink did you configure in the 'Copy Xero Invoices' copy activity? In my case I want to send data from system A to system B, but I think I just have to use two Web activities (GET and POST/PUT) instead of a Copy activity, because I don't think POST/PUT is supported in the sink of a Copy activity.
Thank you very much for this video. I have a question: How can I install TensorFlow in SQL Server 2019?
Thank you for this wonderful introduction and overview of the Power BI Activity logs and methods of collecting that data.
Where will I get the Auth URL and Access Token URL? Please help.
So is Postman or another third-party tool required to configure OAuth between ADF and your example cloud app?
great video
Great video and very nicely explained. I liked how you used Azure Key Vault to securely save the token/URL etc. for the API calls. Is any of the code published in a Git repo? If yes, please share the link. Thanks in advance.
new sub here! thanks for this awesome content! keep it coming 😊
Really appreciate the multi-language presentation. There is now ARULESPY package for exploring association rules and frequent Itemsets in Python. Would love to see you do a video on that.
Can I preview sample data of the source like I do in the ADF copy activity? And how can downstream consumers use the Purview output, like the data catalog etc.?
Great content, thanks for sharing!
Very well explained
Thank you
Great session, thank you for contributing to the KQL learning community!
Nice way to measure adoption. On the other hand, some of the formulas you are using here are pretty expensive, with the FILTER function taking the full table as its first argument.
I hope you keep on doing what you are doing. 🙂
Thanks for sharing. Interesting.
Hi, where can I go for legitimate DevOps training?
Thank you for the detailed explanation. I had followed all the steps as explained here. However, when I create the web activity in ADF to refresh the Xero access token, I keep getting the error "invalid_client" with error code 2108. It works perfectly fine when in Postman and when I look into the input of the web activity, the Authorization and grant_type are exactly in sync with that of Postman. Tried searching for the same error on the internet but wasn't successful.
Hello and thank you for this video. I tried to follow your steps for Power BI Auto ML, but the only difference is that I'm using a SharePoint document. Unfortunately, I have encountered multiple problems which I managed to correct except for this one: "DataSource.Error: An error occurred while reading data from the provider: 'Client attempted to make a remote call from environment PowerQueryOnline, but remote calls are disabled for this workload.' Details: DataSourceKind = AIFunctions, DataSourcePath = AIFunctions." Does anyone have an idea on how to solve this?
what is the "GetSnapshots" activity
This is invaluable, clear and very professional
Great Stuff, thank you for this!
is there a place that shows what each operation means?
Amazing Pres! Thanks a lot
cool content....
I see you have implemented the refresh grant type in ADF. Do I need to run the Auth code flow in Postman first to get the refresh token, and then use that same refresh token in ADF with the grant type set to refresh?
Yes, you need to go through the Auth flow outside of ADF first in order to get your first set of tokens. After that, you can use those tokens in ADF to get a new refresh token.
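The two-step flow described in this reply can be sketched as follows. This is a minimal illustration only: the credentials and token values are placeholders, and in practice the secret and refresh token would live in Key Vault and the request would be made by an ADF Web activity, not by code.

```python
import base64

# Hypothetical placeholder values for illustration only.
client_id = "my-client-id"
client_secret = "my-client-secret"
# Obtained once via the interactive auth code flow (e.g. in Postman).
refresh_token = "token-obtained-from-postman"

# OAuth2 refresh_token grant: most token endpoints expect the client
# credentials as an HTTP Basic Authorization header (base64 of
# "client_id:client_secret") plus a form-encoded body.
basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()

headers = {
    "Authorization": f"Basic {basic}",
    "Content-Type": "application/x-www-form-urlencoded",
}
body = {
    "grant_type": "refresh_token",
    "refresh_token": refresh_token,
}
# An ADF Web activity would POST this body with these headers to the
# vendor's token endpoint and receive a fresh access_token and
# refresh_token in the JSON response.
```

A mismatch in exactly this Basic header (wrong encoding, stray whitespace, or an outdated secret) is a common cause of "invalid_client" errors when the same request works in Postman.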
Great video! Question: I see that the refresh token is updated in Key Vault with a PUT operation, but where does the access token get saved? Does it get saved as a secret in Key Vault as well? I am working on a QuickBooks Online integration that requires a "client secret" and "refresh token" in the linked service.
Thank you! Yes, I like to save the access token, refresh token and client secret all in Key Vault. The access token is saved in a similar fashion to the refresh token. Because of the fact that the client secret doesn't change, you can add that to the Key Vault manually and don't have to do it as part of the pipeline. A word of caution regarding the QuickBooks linked service, as I am dealing with it on another project...don't use it. Use the native REST linked service for REST APIs...the built-in linked services usually don't do a good job of dealing with the nuances the different vendors like to implement.
@@MartinSchoombee Thank you for the heads up! Its always a risk when trying to use Preview features as it is. I will try out the REST API connector instead. I appreciate all of your help and great explanations!
@@MartinSchoombee Is there any chance that you may have a tutorial of the QuickBooks Online integration that you are working on as well?
@@tbuck51 Not specific to QuickBooks, but have a look at my blog series on the topic...the logic should be virtually identical to what I have there.
@@MartinSchoombee Will do! Thank you!
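The Key Vault update discussed in this thread can be sketched as follows. This is a hedged illustration of building the Key Vault "Set Secret" REST request (PUT to `https://{vault}.vault.azure.net/secrets/{name}?api-version=...`); the vault name, secret name, and token value are placeholders.

```python
import json

# Hypothetical placeholder names for illustration only.
vault_name = "my-key-vault"
secret_name = "xero-refresh-token"
new_refresh_token = "new-token-from-the-oauth-response"

# Azure Key Vault "Set Secret" operation: PUT the new value to the
# vault's /secrets/{secret-name} endpoint with an api-version query.
url = (
    f"https://{vault_name}.vault.azure.net"
    f"/secrets/{secret_name}?api-version=7.4"
)
# The request body is JSON with the secret value under "value".
body = json.dumps({"value": new_refresh_token})

# In ADF this would be a Web activity using method PUT, this body, and
# managed identity authentication against resource
# https://vault.azure.net; the same pattern works for the access token,
# saved under its own secret name.
```

The design choice here is that only the tokens that rotate (access and refresh) need a pipeline step; the client secret never changes, so it can be added to the vault once by hand.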
Posts such as this are the lifeblood of data engineers! People such as Martin should be congratulated for going to the trouble to share this type of content. Brilliantly explained. Incredibly useful. Thank you.
Thank you for your kind feedback. We agree, Martin is a superstar.
Thank you very much for the kind words, John. Comments like these help motivate us to keep going, knowing that we were able to help some folks.
Thanks... a lot of my confusion is resolved now. Many thanks again!
Use the following timings to jump to your content of interest:
00:00 Introduction
02:10 Paginated Reports just got easier!
07:44 New TSQL functions in SQL Server 2022
20:18 The 10 commandments of ETL
30:41 Top 3 Tips for Managing Security in Power BI
41:24 Quiz Bowl 2022
This really helped me get started with composer
Thanks for this information! Very well explained
Great video! So helpful, and great job explaining everything. That being said, I believe you forgot the step where you have to add an "Access Policy" that includes the Key Permissions for the Data Factory.
Yes, good catch thank you :-)
What exactly do you mean by that? I am getting an error that says "AKV10000: Request is missing a Bearer or PoP token.", which I think is related to creating an access policy; however, when I click on Access Policy it says that everything is handled on the Access Control (IAM) page.
Great video. Do you have a link to the Excel framework you demonstrated?
Hi, I'm getting an error that says something like "can't connect to the Azure Cognitive Services with anonymous access". Please help me figure out how to fix it.
This is amazing, I liked how you used the Azure Key Vault to securely save the token/url etc to use for the API calls.
Use the following timings to jump to your content of interest:
00:00 Introduction
02:22 What is OAuth
03:44 The Authorization Flow
11:20 Testing APIs (with Postman)
36:08 Azure Data Factory - Linked Services, Datasets & Pipelines
48:09 Questions
Thanks a million, this content is second to none! Very well explained and easy to follow. This is the nth time I am revisiting this course in full. Great content. Awesome. I couldn't find this explained so simply anywhere else. "Great teachers are hard to find." Grade: A++
Thanks Wolfgang.
Use the following timings to jump to your content of interest:
00:00 Introduction
02:14 The problem and/or opportunity for techies
04:16 Developer Velocity
17:38 Microsoft increased DV through DevOps
24:02 Actions
54:27 Questions
Thanks Wolfgang. Good material and easily explained
Sir, I need training for Azure Data Engineer. Please suggest 🙏
Use the following timings to jump to your content of interest:
00:00 Introduction
03:41 What is Azure Data Explorer
12:00 Architecture
16:00 Azure Data Explorer output plug-in for Telegraf
17:22 Getting Started
19:45 Demo
50:33 Questions
Use the following timings to jump to your content of interest:
00:00 Introduction
00:44 Why is this relevant?
03:54 Main presentation
52:45 Summary
53:52 Questions
Use the following timings to jump to your content of interest:
00:00 Introduction
01:27 Demo 1: Creating Synapse workspace
07:08 What's new in Azure Synapse Analytics?
09:45 Demo 2: A quick tour of Synapse Pools (including Data Explorer Pools)
20:27 Demo 3: Synapse Workspace configuration
22:35 Demo 4: SQL Pools and Synapse Studio
29:06 Demo 5: Using SQL Server Management Studio
30:37 Demo 7: Monitoring and Visualisation
37:01 Demo 8: Spark Pool
39:36 Demo 9: Synapse Pipelines
42:12 Demo 10: Other bits and pieces (OPENROWSET, Templates, Release notes)
47:54 Questions
Where did you get your test data?
Use the following timings to jump to your content of interest:
00:00 Introduction
00:44 Why Azure Arc
11:47 Demo
37:09 Wrap up and links
37:48 Questions