Managed by Vishal Bulbule, a fully certified Google Cloud Professional.
Google Cloud Champion Innovator
Google Cloud Certified Cloud Digital Leader
Google Cloud Certified Associate Cloud Engineer
Google Cloud Certified Professional Cloud Architect
Google Cloud Certified Professional Data Engineer
Google Cloud Certified Professional Cloud Network Engineer
Google Cloud Certified Professional Cloud Security Engineer
Google Cloud Certified Professional Cloud DevOps Engineer
Google Cloud Certified Professional Workspace Administrator
Google Cloud Certified Professional Database Engineer
Google Cloud Certified Professional Cloud Developer
Google Cloud Certified Professional Machine Learning Engineer
Microsoft Certified: Azure Fundamentals
HashiCorp Certified Terraform Associate
Email - [email protected]
Comments
Is this tutorial also for beginners?
Good content! Keep up the good work.
I did the same thing you explained, but I'm not able to see the Cloud Run and Cloud Functions metrics. Is there any way to see those serverless metrics?
Sir, while executing the code in VS Code it says quota limit exceeded.
This is great! I followed your video step-by-step, and now it's time for me to do a project of my own based on your stuff! Will use something more European though, like soccer or basketball haha :D Thanks!!!
True...better for you not to use Cricket 😅😅
Very simple and well explained, thanks!
Your videos on GCP data engineering are outstanding! You explain complex concepts with such clarity and ease, making them accessible for everyone. Your step-by-step approach from the basics to advanced topics is incredibly helpful. Thanks for sharing your knowledge and making learning GCP so enjoyable and effective. Keep up the great work and post more videos! I have already SUBSCRIBED :)
Thanks for the kind words 🎉
You could do away with Dataflow here. A simple Python job using load_table_from_uri with schema auto-detect enabled, called from the trigger function, would do the work.
Yes, a single Python job would definitely work. This project is meant to cover different services in GCP.
Search and Conversation at 2:10 is now called Agent Builder.
I'm not getting the mask data option in Wrangler.
I am not able to create a Composer environment.
Very informative, brother. Can we also include uploading the files in the frontend and ask the model to compare them with the previous dataset? That would be really helpful.
How do you create the 3 separate CSV files?
I first downloaded the CSV file from Kaggle and then copy-pasted a few desired records into another CSV.
Thanks Vishal for the detailed pipeline design and development video. Great job.
Thank you so much for this video. It helped me a lot.
Happy to know it helped you
Hello, sir! Great video. If we need to implement CDC or append new data to a table, do we have to extract the data date-wise and load it to GCS? And how do we append that data to an existing table in BigQuery? Cloud Composer: Extract data from an API and load it to GCS. Cloud Function: Trigger the event to load a new CSV file to BigQuery using Dataflow. So where do we need to write the logic to append the new data to an existing table in BigQuery?
How great is this stuff, thanks for your time!!!
Superb.... !!!
excellent and easy to understand.🤩
Informative video. Could you please make more videos on Datadog? Thank you!
Sure, I will.
We have one server on Google Cloud that everyone can access over RDP. We need to restrict access to specific people from our network.
What permissions were granted to the other GCP account, i.e. "mytrapture"? This doesn't appear in your video; something is missing between 06:22 and 06:29.
Hi, thank you for a very useful tutorial! When I set the language to Polish, the chat cannot answer my questions, but it works well in English. My training data is in Polish. Is it possible to set a language for the chatbot?
Very good explanation! Thanks!
Thank you very much. Very useful and practical.
Subscribing!
Too quiet
Hi, I liked your explanation. Could you please let me know how I can deploy HSM in KMS in GCP through pipelines? I would really appreciate it if you could reply. TIA
Is Private Service Access the same as Private Service Connect?
On the Cloud SQL side, do we need to enable the proxy server?
I am guessing Workbench is only on Windows and not on Linux?
Super 😍
The voice is very low, bro. Please record it properly next time; it's hard to hear.
Thank you for the video. How do we identify all the required attribute mappings between Google and our IdP (e.g. AWS, OIDC, SAML)?
Brilliant !
Do you have anything that explains extracting data from BQ and writing it to Cloud SQL?
How can I assign the Organization Policy Administrator role to a user?
Why don't you make a video about Cloud Run, which now has a direct VPC access feature that doesn't work with Shared VPC (Google is trying to solve this)? It would be good. Thank you and congratulations!
Amazing video! Please create a video on Skaffold as well.
I have received my digital badge (on Credly) and claimed the swag, but I haven't received the certificate yet. Should I just wait longer? I have checked all my email and confirmed there's no link or anything related to the certificate...
The certificate is usually available on google.accredible.com.
What is the Cloud Run monthly pricing for this app?
Thanks, this video helped me a lot
"You do not have the required 'resourcemanager.projects.create' permission to create projects in this location." How do I fix this issue?
To create a project, you must have the resourcemanager.projects.create permission. This permission is included in roles like the Project Creator role (roles/resourcemanager.projectCreator). The Project Creator role is granted by default to the entire domain of a new organization resource and to free trial users. For information on how to grant individuals the role and limit organization-resource wide access, see the Managing Default Organization Roles page.
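The fix described in the reply above can be sketched as a single gcloud command. This is a CLI fragment under stated assumptions: the organization ID 123456789012 and the user email are placeholder values, and you need org-level IAM admin rights to run it:

```shell
# Grant the Project Creator role at the organization level so the user
# gains resourcemanager.projects.create across the organization.
gcloud organizations add-iam-policy-binding 123456789012 \
  --member="user:someone@example.com" \
  --role="roles/resourcemanager.projectCreator"
```

To limit scope rather than grant it organization-wide, the same binding can instead be applied on a specific folder with `gcloud resource-manager folders add-iam-policy-binding`.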
Can we have the slides for this course?
Very helpful. Could you please also do it for GitLab?
I will do it soon
Thanks, I just wanted the OIDC setup :)
Hi, thanks for sharing this video. Is there a way to export the trained model to my own vector database so it can run locally?
Great video, thanks :) I want to create a chatbot that lets my customers ask about products from my suppliers. Do you know how I might return links behind the supplier names, or links behind product names? So when a supplier name is shown in the chat, it's actually a url to the supplier profile on my website?
Data Fusion is not parsing the salary and many other fields, although they are in the CSV.
Amazing explanation! The video was really informative. One suggestion for improvement: could you please consider zooming in a bit more next time? It would help make the content even clearer. Thanks!