Airflow Architecture
Apache Airflow is a workflow management system that makes dependency graphing straightforward through the use of Directed Acyclic Graphs (DAGs). A DAG is typically written as a flow path along which data moves reliably through a system, ensuring the integrity of every import task. Airflow's DAG management tooling excels at error handling and process management.
A DAG collates individual tasks into an execution flow. When developing raw data sourcing and integrations, we prefer to build a separate DAG for each data source, whether that source is an API call, a database query, a blob download (such as a text file), or something else.
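As a minimal sketch of what such a per-source DAG might look like, the example below defines one DAG for a single hypothetical source (an orders API); the DAG id, callables, and schedule are placeholders, and it assumes Airflow 2.4+ where the schedule parameter replaces schedule_interval.

```python
# A minimal sketch of one DAG per data source (hypothetical names throughout).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders_api(**context):
    """Pull raw order data from the (hypothetical) orders API."""
    ...


def load_to_lake(**context):
    """Land the extracted payload in the data lake, unmodified."""
    ...


with DAG(
    dag_id="import_orders_api",   # one DAG per data source
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders_api)
    load = PythonOperator(task_id="load", python_callable=load_to_lake)

    extract >> load               # the dependency graph for this source
```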
Import Flows
Airflow manages the execution of the import flows that feed the data lake, and those flows define the lake's structure. The data lake should be a product of your data flows, not the other way around. The data lake's architectural philosophy defines the categorization and collation (metadata) of the data, but not the data itself.

Data flows should be segregated by type (API calls, data transfers, blob/text file movements, etc.). This is not technically necessary, but we find that Airflow behaves better this way, making debugging, maintenance, and upgrades easier. The DAGs execute along a common pathway for a given type of data movement while remaining extensible via source systems rather than through source processes; this is akin to horizontal integration rather than vertical. It streamlines coding strategies, facilitates task collation, looping, and dynamic DAG generation, and eases the work of Airflow's task management and processing systems.
Segregating by process rather than by system also makes debugging easier because related pathway flows are collated together. If a task is failing due to a process failure, all of the related errors appear in the same DAG, which minimizes the search effort. It also simplifies management of complex but related dependencies: if one task is failing, it can be disabled without disrupting the remaining tasks.
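To illustrate the looping and dynamic-generation approach, the sketch below generates one DAG per source system for a single movement type (API imports), so every source follows the same pathway. The source list and the fetch/land callables are hypothetical, and it again assumes Airflow 2.4+.

```python
# A sketch of dynamic DAG generation: one movement type (API imports)
# extended horizontally across source systems. Names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

API_SOURCES = ["crm", "billing", "inventory"]  # hypothetical source systems


def fetch_from_api(source: str, **context):
    """Call the source system's API and stage the raw response."""
    ...


def land_in_lake(source: str, **context):
    """Write the staged payload to the data lake's raw zone."""
    ...


for source in API_SOURCES:
    with DAG(
        dag_id=f"api_import_{source}",
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",
        catchup=False,
    ) as dag:
        fetch = PythonOperator(
            task_id="fetch",
            python_callable=fetch_from_api,
            op_kwargs={"source": source},
        )
        land = PythonOperator(
            task_id="land",
            python_callable=land_in_lake,
            op_kwargs={"source": source},
        )
        fetch >> land

    # Register each generated DAG in the module namespace so the
    # Airflow DAG parser can discover it.
    globals()[dag.dag_id] = dag
```

Because every source shares the same two-task pathway, a process-level failure (for example, the landing step) shows up consistently across these DAGs, and any one source's DAG can be paused without affecting the others.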
Airflow DAG and Task Organization
Airflow can help organize your data flows, but it must itself be well organized and managed. It is important to have clearly delineated process flows and organized substructures within your overall Airflow architecture. By being methodical and purposeful with your DAG organization you become more efficient, produce fewer errors, and reduce technical debt. We have found that effective DAG management pays dividends over time and contributes to a more sustainable architecture.
It is often a good idea to run more than one Airflow instance to handle different execution environments. It is entirely reasonable to have one Airflow instance for incoming text files, one for API flows, one for database extractions, and so on. This keeps each execution environment cleaner and allows internal Airflow components to be tuned for similar workloads. Additionally, tools such as Cloud Composer allow highly customizable hardware configurations to suit different execution strategies, which aids experimentation and facilitates more efficient workflow execution.