The DAGs view gives you an at-a-glance summary of successes, failures, and the number of tasks currently running.
Select one of the DAGs to drill into it.
Tree View
A tree representation of the DAG that spans across time. If a pipeline is delayed, you can quickly see where the different steps are and identify the ones blocking progress.
Graph View
The graph view is perhaps the most comprehensive. It visualizes your DAG's dependencies and the current status of a specific run.
Task duration chart
The duration of your different tasks over the past N runs. This view lets you quickly find outliers and understand where time is spent across the many runs of your DAG.
Gantt chart
The Gantt chart lets you analyze task durations and overlap. You can quickly identify bottlenecks and see where a specific DAG run spends most of its time.
Code View
Transparency is everything. While your pipeline code is under source control, this view provides quick access to the code that generates the DAG and gives you more context.
Task instance context menu
From the pages above (tree view, graph view, Gantt chart, ...), you can always click on a task instance to open this rich context menu, which takes you to more detailed metadata and lets you perform certain actions.
Task instance details
Task instances
View log
Schedule interval
All task instances
All jobs
A record of all runs of the DAG
Pools
Some systems can get overwhelmed when too many processes hit them at the same time. Airflow pools can be used to limit the execution parallelism on arbitrary sets of tasks. Pools are managed in the UI by giving them a name and assigning them a number of worker slots.
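As a sketch of how a task is associated with a pool via the pool argument (the DAG, task, and pool names below are made up for illustration; the pool is assumed to have been created beforehand in the UI with some number of worker slots):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG("pool_example", start_date=datetime(2020, 1, 1), schedule_interval=None)

# "ep_data_pipeline" is an assumed pool name, created beforehand in the UI
# by giving it a name and a number of worker slots.
aggregate = BashOperator(
    task_id="aggregate_db_messages",
    bash_command="echo aggregating",
    pool="ep_data_pipeline",  # tasks in this pool share its limited slots
    dag=dag,
)
```

Only as many tasks as the pool has slots will run concurrently; the rest queue until a slot frees up.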
Connections
Connection information to external systems is stored in the Airflow metadata database and managed in the UI ( Menu -> Admin -> Connections
). A conn_id
is defined there, with hostname / login / password / schema information attached to it. Airflow pipelines can simply refer to the centrally managed conn_id
without having to hard-code any of this information anywhere.
Many connections with the same conn_id
can be defined; in that case, when hooks fetch a connection with the get_connection
method of BaseHook
, Airflow chooses one of the connections at random, allowing for some basic load balancing and fault tolerance when used in conjunction with retries.
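This selection behavior can be pictured with a small stand-in sketch (not Airflow's actual code; the conn_id and host addresses are purely illustrative): with several connections registered under one conn_id, each lookup draws one at random, so repeated lookups spread load across hosts.

```python
import random

# Stand-in for several connections sharing the conn_id "postgres_master";
# the host addresses here are purely illustrative.
connections = {
    "postgres_master": ["10.0.0.1:5432", "10.0.0.2:5432", "10.0.0.3:5432"],
}

def get_connection(conn_id):
    # Mimics the random pick described above: each call may return a
    # different host, spreading load; combined with task retries, a dead
    # host is likely to be avoided on the next attempt.
    return random.choice(connections[conn_id])

host = get_connection("postgres_master")
```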
Airflow can also reference connections via environment variables from the operating system, but it only supports the URI format. If you need to specify extra
information for the connection, use the web UI.
If connections with the same conn_id
are defined in both the Airflow metadata database and an environment variable, Airflow will reference only the connection from the environment variable (e.g., given the conn_id
postgres_master
, Airflow will first search for the environment variable AIRFLOW_CONN_POSTGRES_MASTER
and reference it directly, before searching the metadata database).
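The lookup convention is simply the conn_id upper-cased with an AIRFLOW_CONN_ prefix, and the value must be a connection URI. A quick sketch (the URI shown is an illustrative placeholder, not a real credential):

```python
import os

def conn_env_var(conn_id: str) -> str:
    # Airflow looks for AIRFLOW_CONN_<CONN_ID, upper-cased> in the environment.
    return "AIRFLOW_CONN_" + conn_id.upper()

# Environment-variable connections must use the URI format, e.g.:
os.environ[conn_env_var("postgres_master")] = (
    "postgres://user:password@db.example.com:5432/mydb"  # illustrative URI
)
```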
Many hooks have a default conn_id
, so operators using that hook do not need an explicit connection ID. For example, the default conn_id
of PostgresHook
is postgres_default
.
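For instance, a sketch of the two ways of instantiating the hook (this assumes the conn_ids actually exist in Admin -> Connections or as environment variables, so it is not runnable without a configured Airflow installation; "postgres_master" is a made-up conn_id):

```python
from airflow.hooks.postgres_hook import PostgresHook

# With no argument, the hook falls back to its default conn_id, postgres_default.
default_hook = PostgresHook()

# An explicit conn_id overrides the default; "postgres_master" is assumed
# to be defined in Admin -> Connections (or as AIRFLOW_CONN_POSTGRES_MASTER).
master_hook = PostgresHook(postgres_conn_id="postgres_master")
```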
XComs
XComs let tasks exchange messages, allowing more nuanced forms of control and shared state. The name is an abbreviation of "cross-communication". XComs are principally defined by a key, a value, and a timestamp,
but they also track attributes such as the task/DAG that created the XCom and when it should become visible. Any object that can be pickled can be used as an XCom value, so users should make sure to use objects of appropriate size.
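A minimal sketch of the push/pull pattern (the DAG, task, and key names are made up for illustration):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

dag = DAG("xcom_example", start_date=datetime(2020, 1, 1), schedule_interval=None)

def push(**context):
    # Store a picklable value under an explicit key.
    context["ti"].xcom_push(key="row_count", value=42)

def pull(**context):
    # Retrieve the value pushed by the upstream task.
    row_count = context["ti"].xcom_pull(task_ids="push_task", key="row_count")
    print("row_count =", row_count)

push_task = PythonOperator(
    task_id="push_task", python_callable=push, provide_context=True, dag=dag
)
pull_task = PythonOperator(
    task_id="pull_task", python_callable=pull, provide_context=True, dag=dag
)
push_task >> pull_task
```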
Variables
Variables are a generic way to store and retrieve arbitrary content or settings as a simple key-value store within Airflow. Variables can be listed,
created, updated, and deleted from the UI ( Admin -> Variables
), code, or the CLI. In addition, JSON settings files can be bulk uploaded through the UI. While your pipeline definition code and most of your constants and variables should be defined in code and kept in source control, it can be useful to have some variables or configuration items accessible and modifiable through the UI.
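For example, setting and getting variables from code (the keys and values here are placeholders; Variable.get/set talk to the metadata database, so this assumes a configured Airflow installation):

```python
from airflow.models import Variable

# Store a plain string and a JSON-serialized value (keys are illustrative).
Variable.set("data_bucket", "s3://my-bucket/raw")
Variable.set("etl_config", {"batch_size": 500}, serialize_json=True)

# Retrieve them later, e.g. from a task or DAG file.
bucket = Variable.get("data_bucket")
config = Variable.get("etl_config", deserialize_json=True)  # back to a dict
```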