Understanding Apache Airflow

Daniel Azevedo

Posted on October 1, 2024

Hi devs,

In the world of data engineering, orchestration tools are essential for managing complex workflows. One of the most popular tools in this space is Apache Airflow. But what exactly is it, and how can you get started with it? Let's break it down.

What is Apache Airflow?

Apache Airflow is an open-source platform to programmatically author, schedule, and monitor workflows. It allows you to define your data pipelines in Python code, making it easy to create complex workflows and manage dependencies.

Key Features of Airflow:

  • Dynamic: Workflows are defined as code, allowing for dynamic generation of tasks and workflows.
  • Extensible: You can create custom operators and integrate with various tools and services (see the short sketch after this list).
  • Rich User Interface: Airflow provides a web-based UI to visualize your workflows and monitor their progress.
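
To make the "Extensible" point concrete, here is a minimal sketch of a custom operator. GreetOperator and its name parameter are hypothetical names chosen for illustration; only BaseOperator and the execute() hook come from Airflow itself.

from airflow.models.baseoperator import BaseOperator

class GreetOperator(BaseOperator):
    """A hypothetical operator that greets someone when its task runs."""

    def __init__(self, name: str, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def execute(self, context):
        # execute() is the method Airflow calls when the task instance runs.
        print(f"Hello, {self.name}!")

You could then use GreetOperator(task_id='greet', name='devs', dag=dag) in a DAG just like any built-in operator.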

Why Use Apache Airflow?

With Airflow, you can automate repetitive tasks, manage dependencies, and ensure that your data pipelines run smoothly. It's particularly useful in scenarios where you need to run tasks on a schedule, such as ETL processes, machine learning model training, or any workflow that involves data processing.
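
As a rough sketch of that scheduling use case, here is what a small daily ETL-style DAG could look like. The dag_id, the task names, and the extract/transform/load callables are placeholders made up for illustration; the Airflow classes and parameters are the real ones.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from a source")

def transform():
    print("reshape the data")

def load():
    print("write the data to a destination")

with DAG(
    dag_id='daily_etl_sketch',
    start_date=datetime(2024, 1, 1),
    schedule_interval='@daily',  # run once per day
    catchup=False,               # don't backfill runs for past dates
) as dag:
    extract_task = PythonOperator(task_id='extract', python_callable=extract)
    transform_task = PythonOperator(task_id='transform', python_callable=transform)
    load_task = PythonOperator(task_id='load', python_callable=load)

    # extract first, then transform, then load
    extract_task >> transform_task >> load_task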

Getting Started with a Basic Example

To illustrate how Airflow works, let’s set up a simple workflow that prints "Hello, World!" and then sleeps for 5 seconds.

Step 1: Installation

First, you need to install Apache Airflow. You can do this using pip (for production setups the Airflow docs recommend pinning versions with a constraints file, but a plain install is fine for experimenting locally):

pip install apache-airflow

Step 2: Define a DAG

Once Airflow is installed, you can create a Directed Acyclic Graph (DAG) to define your workflow. Create a file called hello_world.py in the dags folder of your Airflow home directory (by default, ~/airflow/dags) with the following code:

from datetime import datetime
import time

from airflow import DAG
from airflow.operators.empty import EmptyOperator      # modern replacement for the deprecated DummyOperator
from airflow.operators.python import PythonOperator    # moved here from the old airflow.operators.python_operator path

def print_hello():
    print("Hello, World!")
    time.sleep(5)

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2023, 1, 1),
}

# '@once' tells the scheduler to run this DAG a single time.
dag = DAG('hello_world_dag', default_args=default_args, schedule_interval='@once')

# A no-op task that marks the start of the workflow.
start_task = EmptyOperator(task_id='start', dag=dag)

# Runs the print_hello function defined above.
hello_task = PythonOperator(task_id='hello_task', python_callable=print_hello, dag=dag)

# start_task runs first, then hello_task.
start_task >> hello_task

Breakdown of the Code:

  • DAG: The DAG object is created with a unique identifier (hello_world_dag). The default_args parameter supplies default settings for the tasks, and schedule_interval='@once' tells the scheduler to run the workflow a single time.
  • Tasks: Two tasks are defined: an EmptyOperator (the current name for what older tutorials call DummyOperator) that marks the start of the workflow, and a PythonOperator that calls the print_hello function.
  • Task Dependencies: The >> operator is used to set the order in which tasks should run.
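
The >> syntax is only one way to express that ordering. Reusing the start_task and hello_task objects from the DAG above, these lines all declare the same dependency:

from airflow.models.baseoperator import chain

start_task >> hello_task                # bit-shift syntax, as in the example
hello_task << start_task                # the same dependency, written from the other side
start_task.set_downstream(hello_task)   # explicit method call
chain(start_task, hello_task)           # handy for longer linear chains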

Step 3: Running Airflow

To start Airflow, you need to initialize the database and run the web server. Run the following commands:

airflow db init
airflow webserver --port 8080

Depending on your Airflow version, you may also need to create a user account for the web UI (see airflow users create --help) before you can log in. Then, in a new terminal, start the scheduler:

airflow scheduler

Step 4: Triggering the DAG

  1. Open your web browser and navigate to http://localhost:8080.
  2. You should see the Airflow UI. Find hello_world_dag in the DAGs list, unpause it with the toggle (new DAGs are paused by default), and trigger it manually.
  3. You can monitor the progress and see the logs for each task in the UI.

Conclusion

Apache Airflow is a powerful tool for orchestrating workflows in data engineering. With its easy-to-use interface and flexibility, it allows you to manage complex workflows efficiently. The example provided is just the tip of the iceberg; as you become more familiar with Airflow, you can create more intricate workflows tailored to your specific needs.
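
For example, one direction worth exploring once the basics feel comfortable is the TaskFlow API introduced in Airflow 2.0, which lets you declare tasks with decorators instead of explicit operator objects. A rough sketch of the same hello-world idea (the function names here are made up for illustration):

from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule_interval='@once', start_date=datetime(2023, 1, 1), catchup=False)
def hello_world_taskflow():

    @task
    def hello():
        print("Hello, World!")

    hello()

# Calling the decorated function is what registers the DAG with Airflow.
hello_world_taskflow()

Same behavior as the classic operator style above, with less boilerplate.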
