There is a good chance that you have multiple Airflow environments. In addition to having a different URL for each environment, you could leverage another configuration setting to change the Airflow DAG UI header, to better distinguish each environment.
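As a sketch, assuming Airflow 2.1+ where the `instance_name` setting exists under `[webserver]`, the header shown in the DAG UI can be customized per environment:

```ini
# airflow.cfg — give each environment its own value
[webserver]
# Displayed in the UI header, e.g. "DAGs - Production"
instance_name = Production
```

The same setting can be supplied through the `AIRFLOW__WEBSERVER__INSTANCE_NAME` environment variable, which is often more convenient when each environment is deployed from the same image.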
Airflow automatically masks the value of a variable in the task logs and in the UI if its key contains one of these keywords: 'password', 'secret', 'passwd', 'authorization', 'api_key', 'apikey', 'access_token'. Idem if one of the JSON keys in the extra field of a connection contains one of those keywords. Then, in your DAG:

```python
from datetime import datetime

from airflow import DAG

def my_very_unsecure_task():
    ...
```
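To make the rule concrete, here is an illustrative sketch (not Airflow's actual implementation) of how such keyword matching could decide which values get masked; the function names and the `***` placeholder are assumptions for the example:

```python
# Keywords from the article; a key containing any of them is treated as sensitive.
SENSITIVE_KEYWORDS = {
    "password", "secret", "passwd", "authorization",
    "api_key", "apikey", "access_token",
}

def should_mask(key: str) -> bool:
    """Return True if the key contains any sensitive keyword (case-insensitive)."""
    key = key.lower()
    return any(keyword in key for keyword in SENSITIVE_KEYWORDS)

def mask_extra(extra: dict) -> dict:
    """Replace values of sensitive JSON keys in a connection's extra field."""
    return {k: ("***" if should_mask(k) else v) for k, v in extra.items()}
```

For example, `should_mask("MY_API_KEY")` is true because the key contains `api_key`, while `should_mask("username")` is false, so only the former's value would be hidden in the logs.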
AIRFLOW 2.0 HOW TO
Creating DAG dependencies is a really common use case. You might want a DAG that triggers one or multiple DAGs, or a DAG that waits for another DAG to complete, and so on. But having a lot of DAG dependencies can become a nightmare to deal with. How to remember those dependencies? Lost with all of your DAG dependencies? Hard to remember which DAG waits for which DAG? Not anymore! That's where the new DAG Dependencies view comes in: on this view, you can quickly see your DAG dependencies.
Group your tasks with TaskGroup

TaskGroup is the new way of grouping your tasks in your DAGs. Whereas SubDAGs are complex and will be deprecated, TaskGroups do the same job in a faster and easier way.

Running tasks in a Python virtual environment with the PythonVirtualenv decorator

This is the easiest way of instantiating the PythonVirtualenvOperator. Your tasks run in a Python virtual environment so that they can have their own dependencies without altering the "host" system:

```python
import json  # You must make your imports within the function

data_string = ''
order_data_dict = json.loads(data_string)
```