Creating an ETL Data Pipeline on Google Cloud with Cloud Data Fusion & Airflow - Part 1
Science & Technology
Part 2 - • Creating an ETL Data P...
Source Code - github.com/vishal-bulbule/etl...
Explore the magic of building an ETL pipeline in Google Cloud with this comprehensive tutorial. Learn how to craft a seamless process for extracting, transforming, and loading data into BigQuery, then visualize it effortlessly in Looker Studio.
Step 1: Begin by extracting dummy employee data using the Python Faker library, seamlessly storing it in a designated Google Cloud Storage (GCS) bucket.
Step 2: Dive into the creation of a Cloud Data Fusion instance, laying the groundwork for your data pipeline journey.
Step 3: Unveil the magic of Data Fusion as you craft a robust pipeline. Witness the transformation of data while ensuring sensitive information remains masked, ultimately loading it into BigQuery for further analysis.
Step 4: Elevate your data visualization game as you harness the power of Looker Studio, bringing your insights to life in a visually compelling manner.
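In Step 3 the masking itself happens inside Data Fusion's Wrangler UI, not in code. As a rough illustration of the effect, here is a hypothetical Python equivalent (the function name and format are assumptions, not part of the actual pipeline) that hides all but the last four digits of an SSN:

```python
def mask_ssn(ssn: str, visible: int = 4) -> str:
    """Mask an SSN, keeping only the last `visible` digits.

    Illustrative stand-in for Data Fusion Wrangler's mask-data
    transform; separators like '-' are preserved.
    """
    masked = []
    digits_seen = 0
    # Walk from the right so the trailing digits stay visible.
    for ch in reversed(ssn):
        if ch.isdigit():
            masked.append(ch if digits_seen < visible else "x")
            digits_seen += 1
        else:
            masked.append(ch)  # keep separators
    return "".join(reversed(masked))


print(mask_ssn("123-45-6789"))  # -> xxx-xx-6789
```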
Join me on this illuminating journey through the intricacies of ETL pipelines, empowering you to master data orchestration and visualization in the Google Cloud ecosystem.
Looking to get in touch?
Drop me a line at vishal.bulbule@gmail.com, or schedule a meeting using the provided link topmate.io/vishal_bulbule
Playlists
Associate Cloud Engineer - Complete Free Course
Google Cloud Data Engineer Certification Course
Google Cloud Platform (GCP) Tutorials
Generative AI
Getting Started with Duet AI
Google Cloud Projects
Python For GCP
Terraform Tutorials
Linkedin
/ vishal-bulbule
Medium Blog
/ vishalbulbule
Github
Source Code
github.com/vishal-bulbule
#googlecloud #gcp #airflow #dataengineeringessentials #dataengineering #bigquery #dataengineeringprojects
Comments: 35
Thanks Vishal for the detailed pipeline design and development video. Great job.
Very simple and well explained, thanks!
Thank You Vishal for doing this. It will be definitely a great help! Kudos to you!
Thank you for the help
thank you!!
Great video as always! Can you make timestamps for this video?
I am getting environment errors while connecting Data Fusion, and the Python code has errors.
Not getting the mask data option in Wrangler.
Nice video! Can you create a pipeline using server/serverless Dataproc?
Awesome video! Can you create a complete Composer/Airflow video for this one?
@techtrapture
2 months ago
There's a separate playlist for Composer: Cloud Composer - Airflow on GCP: kzread.info/head/PLLrA_pU9-Gz22Zml5mxcszG4A9ecqWtd4
I am not able to create a Composer environment.
In place of Airflow, I want to use Mage AI.
The Cloud Composer environment is showing an error, and the image version is not showing while creating the environment manually. Is there any update?
@adityajoshi2797
4 months ago
Please reply to that.
How do I use gcloud in VS Code? Error: "gcloud : The term 'gcloud' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again."
@techtrapture
3 months ago
Install the Google Cloud SDK on your system. Use the link below: cloud.google.com/sdk/docs/install#windows
Data Fusion is not parsing the salary and many other fields, although they are in the CSV.
Composer shows "This environment has errors".
I got these errors "Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.hdfs.web.HftpFileSystem not found. Can not load the default value of `spark.yarn.isHadoopProvided` from `org/apache/spark/deploy/yarn/config.properties` with error, java.lang.NullPointerException. Using `false` as a default value." Any clues on how to fix it?
@figh761
3 months ago
did you fix this
@akshaymantena6699
19 days ago
I'm also getting the same error. Did you fix it?
Amazing video! Unfortunately I'm having problems creating my Cloud Composer environment, maybe because I am on a free trial. I get this error after creating the environment: "CREATE operation on this environment failed 49 minutes ago with the following error message: Some of the GKE pods failed to become healthy. Please check the GKE logs for details, and retry the operation."
@Abracadanz00
1 month ago
I'm having the same issue. Any idea how to resolve it?
@lmarwarl
1 month ago
@@Abracadanz00 Nothing yet, but after searching a lot I read a post from Google saying you have to activate your billing account in GCP before creating the Cloud Composer environment.
@paranoya733
15 days ago
@@Abracadanz00 If you want a shorter free pipeline: at 14:57, cut out Cloud Composer, Cloud Storage, Cloud Data Fusion, and BigQuery, and replace them with a free short pipeline: Google Sheets (data) -> Looker Studio. If you extract API data, add the "API Connector" extension in Google Sheets, configure it (search on YouTube), then connect it to Looker Studio.
Kindly make this kind of ETL pipeline video with GCS --> (Composer + Dataflow) --> BigQuery.
@techtrapture
5 months ago
It's already there: kzread.info/dash/bejne/h4x-2sWQl9vdpZM.html
@VthePeople4156
4 months ago
Please explain the whole project in 3-5 sentences for interview purposes: what is the flow of the project, which GCP services were used, and how you developed the different modules using the different GCP services...
@Rajdeep6452
4 months ago
@@VthePeople4156 Can't you see and tell? Does he have to spoon-feed you now? Do your parents still wash your ass?
@VthePeople4156
4 months ago
@@Rajdeep6452 yes
@Rajdeep6452
4 months ago
@@VthePeople4156 idiot xD
It says gcloud is not an executable, so your login steps don't work for everyone, and you did things beforehand without showing them in the video. Please, next time show everything from scratch, for real: not just saying it, but actually doing it.
@techtrapture
2 months ago
Apologies if I missed that. You need to install gcloud (the Cloud SDK) first to execute your command.