Airflow with DBT tutorial - The best way!
🚨 Cosmos is still under (very) active development and in an alpha version. Expect possible breaking changes in the near future.
There are different ways to integrate dbt into Airflow.
If you use the BashOperator to run dbt commands, forget about that.
It's time to discover Cosmos, the open-source framework that parses and renders dbt projects in Airflow within seconds!
📖 Materials: robust-dinosaur-2ef.notion.si...
📚 Cosmos Doc: astronomer.github.io/astronom...
🏆 BECOME A PRO: www.udemy.com/course/the-comp...
👍 Smash the like button to become an Airflow Super Hero!
❤️ Subscribe to my channel to become a master of Airflow
🚨 My Patreon: / marclamberti
Enjoy ❤️
Comments: 99
To anyone following the video now: the DbtDeps module has been deprecated. Dependencies are installed automatically if they are listed in the packages.yml file inside your dbt project. Follow the official docs.
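For anyone wondering what that file looks like: a minimal packages.yml sketch at the root of a dbt project (dbt_utils and its version number are just an example; list whatever your project actually uses):

```yaml
# packages.yml — lives at the root of the dbt project.
# dbt installs everything listed here when `dbt deps` runs.
packages:
  - package: dbt-labs/dbt_utils
    version: 1.1.1
```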
@jonasl3683
11 months ago
Does that mean I have to put gcc and python3 inside packages.yml, or can I just delete the packages.txt file in the Astro folder?
@user-ck9zd4gj7y
9 months ago
Is there a tutorial available for this? Please provide a link or further explanation.
@johnnote7
7 months ago
Works with astronomer-cosmos[dbt.all]==0.6.0.
This is awesome! Thanks for sharing! Subscribed. 👍🏼
Cool videos. For dbt Cloud you can define the job and then use a POST request to trigger it via Airflow. You can also set dependencies between jobs.
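A minimal sketch of that approach, using only the standard library. The account ID, job ID, and token are placeholders, and the endpoint shape follows dbt Cloud's v2 "trigger job run" API; verify it against the current dbt Cloud docs before relying on it:

```python
import json
import urllib.request


def build_job_run_url(account_id: int, job_id: int,
                      base_url: str = "https://cloud.getdbt.com") -> str:
    # dbt Cloud v2 "trigger job run" endpoint
    return f"{base_url}/api/v2/accounts/{account_id}/jobs/{job_id}/run/"


def trigger_dbt_cloud_job(account_id: int, job_id: int, token: str) -> dict:
    """POST to dbt Cloud to queue a run of the given job."""
    payload = json.dumps({"cause": "Triggered from Airflow"}).encode()
    req = urllib.request.Request(
        build_job_run_url(account_id, job_id),
        data=payload,
        headers={"Authorization": f"Token {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

Inside Airflow this could be wrapped in a PythonOperator, or you could reach for the official dbt Cloud provider's DbtCloudRunJobOperator, which does the same thing with retries and run polling built in.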
Great video! Would you have any example of how to run only a specific model, or any other command, instead of the whole project? I couldn't find it in the docs!
Great video, very informative! One question: does Cosmos allow us to run a specific model, or a specific tag, in the dbt project?
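Whatever the integration, model- and tag-level runs come down to dbt's own node-selection syntax. A small sketch that only builds the command lists (run them with subprocess from inside the dbt project directory); whether your Cosmos version also exposes an equivalent select option on its render config is worth checking in the docs for that exact version:

```python
def dbt_run_command(select: str) -> list[str]:
    """Build a `dbt run` restricted to a selector: a model name, tag:..., path:..., etc."""
    return ["dbt", "run", "--select", select]


single_model = dbt_run_command("stg_customers")   # one model
tagged = dbt_run_command("tag:nightly")           # every model carrying the tag
downstream = dbt_run_command("stg_customers+")    # a model plus everything downstream of it
```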
Thank you so much for sharing this with us on YouTube
@MarcLamberti
a year ago
my pleasure ❤️
Great content!! I followed along and about 95% of it works. Just one additional setting in case you face a problem with the module "pytz" (I got a "module pytz not found" error while trying to run the DAG): add pytz to the requirements.txt file and it will work perfectly.
@amansharma-gj7eu
6 months ago
Did you get the error below during execution of the jaffle_shop DAG? improper relation name (too many dotted names): public.***.public.customers__dbt_backup
@rattaponinsawangwong5482
6 months ago
No, I didn't. But I worked through this tutorial 8 months ago, so maybe it was updated with something I never tried. Based on the error message, I think it's about the naming of some parameters. You might cross-check that yours match the tutorial.
@khrs2077
17 days ago
Hi, I got this error too. I added pytz==2022 but it doesn't work for me.
I was actually learning from the best of the best on Udemy. I had no idea. I am enjoying your teaching as well.
@MarcLamberti
a year ago
You’re the best 🫶
@zahabkhan6832
22 days ago
@@MarcLamberti Where can I find the code files you mentioned you would put in the description?
Excellent !! Thanks a lot!
Great video. There are some changes I had to make to get this example working, but in the end it helped me a lot. Thank you :)
@MarcLamberti
8 months ago
Thank you! Could you tell me which ones so I can pin that in a comment?
@IWasBoredSo
8 months ago
In your Notion there is a definition for a jaffle_shop DAG that, in its current state, throws errors on import (I took the code from the Notion page provided in the description):

TypeError: DAG.__init__() got an unexpected keyword argument 'dbt_executable_path' #1
TypeError: DAG.__init__() got an unexpected keyword argument 'conn_id' #2
TypeError: DbtToAirflowConverter.__init__() missing 1 required positional argument: 'profile_config' #3
TypeError: DbtToAirflowConverter.__init__() missing 1 required positional argument: 'project_config' #4

So instead of passing conn_id and dbt_executable_path when creating the DbtDag, it should be done this way, for example:

from airflow.datasets import Dataset
from datetime import datetime
from cosmos import DbtDag, ProjectConfig, ProfileConfig, ExecutionConfig
from cosmos.profiles import PostgresUserPasswordProfileMapping

profile_config = ProfileConfig(
    profile_name="demo_dbt",
    target_name="dev",
    profile_mapping=PostgresUserPasswordProfileMapping(
        conn_id="postgres",
        profile_args={"schema": "public"},
    ),
)

config = ProjectConfig("/usr/local/airflow/dbt/my_project")
exec_config = ExecutionConfig(dbt_executable_path="/usr/local/airflow/dbt_venv/bin/dbt")

dbt_model = DbtDag(
    dag_id="dbt_model",
    start_date=datetime(2023, 1, 1),
    schedule=[Dataset("SEED://seed_dataset")],
    profile_config=profile_config,
    project_config=config,
    execution_config=exec_config,  # default exec mode is ExecutionMode.LOCAL
)

I define ProjectConfig, ProfileConfig, and ExecutionConfig separately and then pass all the necessary config to DbtDag. I did the same in the part with seeds, but there is no problem with values passed straight into DbtRunOperationOperator and DbtSeedOperator, so no change to the tutorial is needed there right now :)
@IWasBoredSo
8 months ago
I have different names for the profile and dataset, etc., but the logic is the same as on the Notion site
Hi Marc, thanks for sharing this
Thanks for this walkthrough, it's very helpful. With the BashOperator I could specify the threads and run multiple models in parallel. When I use the Cosmos package and DbtTaskGroup, there doesn't seem to be any such config to run models in parallel, which increases our run times. Am I missing some config to run in parallel?
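One thing worth noting on the parallelism question: because Cosmos renders each model as its own Airflow task, concurrency is governed by Airflow's scheduler settings rather than by dbt's threads. A sketch of the relevant Airflow 2.x knobs in airflow.cfg (the values here are only illustrative):

```ini
[core]
# max task instances running at once across the whole Airflow deployment
parallelism = 32
# max concurrent task instances per DAG (Airflow >= 2.2; formerly dag_concurrency)
max_active_tasks_per_dag = 16
```

The same per-DAG limit can also be set in code via the max_active_tasks argument to DAG().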
Thank you for the video! I have Airflow in production on a Kubernetes cluster (deployed using the official Helm charts). Is there any straightforward way to integrate Cosmos with git-sync?
Great videos, thanks. One question: what if I have to use the KubernetesExecutor? In that case, `dbt deps` has to run before every dbt task (because each container task in a pod will lose the initial `dbt deps` context). How can I handle this?
Nice and well-explained video! Do you plan to do a dbt + Dagster integration video? It could be interesting :)
@MarcLamberti
a year ago
I didn’t try Dagster yet but why not 🤓
Great read. Has anyone installed the cosmos package without the Astro CLI and get the dbt dags working?
What version of astronomer-cosmos were you using while creating this tutorial? The module is actively developed and keeps changing, so I can't follow it thoroughly.
Please make a video on Dataform and Airflow.
When will this dbt-core integration be supported as a standard in Airflow?
Hey Marc, thanks for the great tutorial! :) But I can't really get it to work. I get the error "ModuleNotFoundError: No module named 'cosmos.providers'" when trying to import the DAG. Which package should I install, and in which configuration file should I put it (packages, requirements, dbt-requirements, or the Dockerfile)? I'm kind of confused about why there are two requirements files...
@dffffffawsefdsfgvsef
11 months ago
I am getting the same error. Any solution found for this ?
@carolinabtt
8 months ago
I got the same error. If anyone knows how to fix it let us know! thanks
@renatomoratti5947
7 months ago
Same error here, did you find a solution?
When is support for Clickhouse expected?
Hi! Could someone answer a question for me? Is it possible to do a full refresh with the Cosmos package?
Amazing! now make one for cloud instead of local? :D
Thank you very much, Marc, for your generous initiative. One small point: the Udemy link in the video details returns an error.
@MarcLamberti
a year ago
Hi Luciano, Where? In the email I sent?
@farisazhan6428
a year ago
@@MarcLamberti In this video's description. The Udemy link doesn't work.
@elteixeiras
a year ago
@@MarcLamberti Where it says BECOME A PRO:
@MarcLamberti
a year ago
Fixed! Thank you guys ❤️
@user-by8um8bk1w
a year ago
@@MarcLamberti When is support for Clickhouse expected?
"ModuleNotFoundError: No module named 'cosmos.providers'" when trying to import the DAG. Which package should I install, and in which configuration file should I put it (packages, requirements, dbt-requirements, or the Dockerfile)?
@dffffffawsefdsfgvsef
11 months ago
I am getting the same error. Any solution found for this ?
@renatomoratti5947
7 months ago
Same here, did you find the solution already?
@johnnote7
7 months ago
In requirements.txt, change the pin to astronomer-cosmos[dbt.all]==0.6.0.
I have an ETL process in place in ADF. Our team wants to implement the table and view transformations with dbt Core, and we were wondering if we could orchestrate dbt with Azure. If so, how? One approach I could think of was to use an Azure Managed Airflow instance, but will it allow us to install astronomer-cosmos? I have never implemented dbt this way before, so I'd like to know whether this is the right approach or whether you would suggest anything else.
@sridharstreakssri
14 days ago
I guess dbt doesn't have something specific for Azure. But if you have access to Fabric, you could take a look at it, as it offers a complete analytics platform. If you're looking to make SQL dynamic the way dbt does with Jinja templating, though, I don't know.
To anyone: can we do this setup using the AWS managed Airflow service (MWAA), where we don't have access to the command line? Any ideas? Please share your thoughts.
This is mind-blowing, man... But amazing as it is... we still need to execute dbt commands one by one 😅... But again, great video.
@MarcLamberti
a year ago
No you won't 🥹 Cosmos translates your dbt project into a DAG with tasks corresponding to your models, tests, etc. It's a much better integration than running, indeed, one command at a time with the BashOperator. Thank you for your kind words 🙏
@maximilianopadula5470
a year ago
@@MarcLamberti Thanks a lot for the video. My question is: can you run tasks on different schedules? I.e., I'd like my stg models to run every 5 minutes but my intermediate models every day. I couldn't find an answer in the Cosmos documentation. Many thanks.
Hey Marc, it seems the API for this package has changed quite a bit recently, and I'm having a really hard time figuring out the execution modes given the lack of a proper example that uses the most current version of Cosmos. Is there any chance you could do a deep dive on how to configure the latest version of Cosmos with the Docker / K8s executors?
@MarcLamberti
9 months ago
Yes! I will make an updated video. What execution modes are you referring to?
@user-mm1mh8wk8o
9 months ago
@@MarcLamberti Thanks for getting back to me. I'm specifically referring to ExecutionMode.DOCKER and ExecutionMode.KUBERNETES. My company generally prefers keeping its Airflow instances as clean as possible and running everything on K8s where possible.
Cool integration. But can someone please explain to me: is it possible to generate dbt docs somehow using this approach?
@MarcLamberti
10 months ago
Here astronomer.github.io/astronomer-cosmos/configuration/generating-docs.html 🫶
If I want to add Cosmos to my existing Airflow, is that possible? How?
I did an integration like this before (but I built my own dbt loader), and it ran into a memory error in Airflow because too many dbt models ran concurrently. What do you suggest to tweak it?
@MarcLamberti
a year ago
Is that an issue you have with Cosmos or is it with your own dbt loader?
@as_sulthoni
a year ago
@@MarcLamberti I use my own dbt loader, so technically my Airflow (Cloud Composer) crashed because the RAM and CPU usage spiked. Ideally I could increase the RAM and CPU, but unfortunately that wasn't possible due to cost limitations on my side. So my current solution is to deploy a standalone dbt to an on-prem server (Google CE). The integration looks like the Cloud Run integration.
So it's like legianires? Air flow and database...
Will this be added to your Airflow course on Udemy?
@MarcLamberti
a year ago
Yes
Does not work
Hi, I got an error like this in the Airflow UI. Any ideas about this error? ModuleNotFoundError: No module named 'cosmos.providers'
Good
Is this a full replacement for dbt cloud?
@MarcLamberti
a year ago
Nope, but it helps to integrate dbt Core in Airflow :)
Can this work with Airflow in AWS MWAA?
@maximilianopadula5470
a year ago
Interested in this too. I imagine it can? I'm mostly curious about the CI/CD part, which I guess will be a Cosmos build to S3.
Hi, the tutorial looks good but it doesn't work anymore. Can you please share the versions you are using in it? Thanks a lot!
magic
@MarcLamberti
11 months ago
🪄🪄🪄🪄🪄
Unfortunately outdated & useless
@MarcLamberti
6 months ago
How useless? Doesn’t work anymore?
@CarbonsHDTuts
6 months ago
@@MarcLamberti Any updates?
Cosmos has very poor documentation. I do not recommend it to anyone.
@MarcLamberti
a month ago
Anything you were looking for specifically?
@samsonleul7667
a month ago
@@MarcLamberti These imports do not work on the latest version of Cosmos, and I couldn't find their alternatives:

from cosmos.providers.dbt.core.operators import (
    DbtDepsOperator,
    DbtRunOperationOperator,
    DbtSeedOperator,
)
I'm getting this error:

Broken DAG: [/usr/local/airflow/dags/import-seeds.py]
Traceback (most recent call last):
  File "", line 241, in _call_with_frames_removed
  File "/usr/local/airflow/dags/import-seeds.py", line 7, in
    from cosmos.providers.dbt.core.operators import (
ModuleNotFoundError: No module named 'cosmos.providers'
Broken DAG: [/usr/local/airflow/dags/import-seeds.py]
Traceback (most recent call last):
  File "", line 241, in _call_with_frames_removed
  File "/usr/local/airflow/dags/import-seeds.py", line 6, in
    from cosmos.providers.dbt.core.operators import (
ModuleNotFoundError: No module named 'cosmos.providers'
@kartheekgummaluri7430
2 months ago
I'm also getting the same error:

Broken DAG: [/usr/local/airflow/dags/import-seeds.py]
Traceback (most recent call last):
  File "", line 241, in _call_with_frames_removed
  File "/usr/local/airflow/dags/import-seeds.py", line 7, in
    from cosmos.providers.dbt.core.operators import (
ModuleNotFoundError: No module named 'cosmos.providers'
@kartheekgummaluri7430
2 months ago
@marclamberti please help