Commit 54c74ce

Merge pull request #19 from aelzeiny/github-actions
Update ReadMe & Yaml to include github actions badges
2 parents: 4459665 + 3e7ba55

File tree

2 files changed: +8 −37 lines
.github/workflows/python-package.yml → .github/workflows/main.yml (renamed)

+4 −4
```diff
@@ -1,7 +1,7 @@
 # This workflow will install Python dependencies, run tests and lint with a variety of Python versions
 # For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions
 
-name: Python package
+name: AWS Airflow Executor Tester
 
 on:
   push:
@@ -28,11 +28,11 @@ jobs:
       run: |
         python -m pip install --upgrade pip
         python -m pip install pytest apache-airflow boto3 pylint isort marshmallow
-    - name: Lint with flake8
+    - name: Lint with Pylint
       run: |
-        # Pylint files
         pylint --fail-under=9 ./airflow_aws_executors
-        # ISort files
+    - name: Lint with ISort
+      run: |
         isort -c -rc ./airflow_aws_executors
     - name: Test with pytest
       run: |
```

readme.md

+4 −33
````diff
@@ -1,5 +1,7 @@
 # Apache Airflow: Native AWS Executors
-[![Build Status](https://travis-ci.com/aelzeiny/airflow-aws-executors.svg?branch=master)](https://travis-ci.com/aelzeiny/airflow-aws-executors)
+
+![Build Status](https://github.com/aelzeiny/airflow-aws-executors/actions/workflows/main.yml/badge.svg)
+
 
 This is an AWS Executor that delegates every task to a scheduled container on either AWS Batch, AWS Fargate, or AWS ECS.
 
@@ -12,37 +14,6 @@ For `AWS Batch`: [Getting Started with AWS Batch ReadMe](getting_started_batch.m
 
 For `AWS ECS/Fargate`: [Getting Started with AWS ECS/Fargate ReadMe](getting_started_ecs_fargate.md)
 
-
-## How Airflow Executors Work
-Every time Apache Airflow wants to run a task, the Scheduler generates a shell command that needs to be executed **somewhere**.
-Under the hood this command will run Python code, and it looks something like this:
-```bash
-airflow run <DAG_ID> <TASK_ID> <EXECUTION_DATE>
-```
-What people refer to as the Executor, is actually an API and not a thread/process. The process that runs is the Airflow
-Scheduler, and the Executor API is part of that process. When the Scheduler process generates the shell command above,
-it is the executor that decides how this is ran. Here's how what different Executors handle this command:
-* **LocalExecutor**
-  * `execute_async()` Uses Python's Subprocess library to spin up a new process with the shell command
-  * `sync()` Monitors exit code on every heartbeat
-* **CeleryExecutor**
-  * `execute_async()` Uses the Celery library to put this shell command in a message queue
-  * `sync()` Monitors the Celery Backend to determine task completion
-* **KubernetesExecutor**
-  * `execute_async()` Launches K8 Pod with the shell command
-  * `sync()` Monitors K8 Pod
-
-So, how do these executors work? Well, on the highest level, it just calls Boto3 APIs to schedule containers onto
-compute clusters.
-
-* **AwsEcsFargateExecutor**
-  * `execute_async()` Uses the Boto3 library to call [ECS.run_task()][run_task], and launches a container with the shell command onto a EC2 or Serverless cluster
-  * `sync()` Uses the Boto3 library to call ECS.describe_tasks() to monitor the exit-code of the Airflow Container.
-* **AwsBatchExecutor**
-  * `executor_async()` Uses the Boto3 library to call [Batch.submit_job()][submit_job], and puts this message in a job-queue
-  * `sync()` Uses the Boto3 library to call Batch.describe_jobs() to monitor the status of the Airflow Container
-
-
 ## But Why?
 There's so much to unpack here.
 
````
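The `execute_async()`/`sync()` pattern described in the removed readme section above can be sketched in a few lines of Python. `BatchExecutorSketch` is a hypothetical illustration, not the package's actual executor class; `submit_job` and `describe_jobs` are the real AWS Batch APIs named in the text, and the client is injected so the sketch can run without AWS credentials.

```python
class BatchExecutorSketch:
    """Hypothetical sketch of the execute_async()/sync() executor pattern.

    `client` is any object exposing Batch's submit_job/describe_jobs APIs,
    e.g. boto3.client("batch") in a real deployment.
    """

    def __init__(self, client, job_queue, job_definition):
        self.client = client
        self.job_queue = job_queue
        self.job_definition = job_definition
        self.active_jobs = {}  # maps Batch jobId -> Airflow task key

    def execute_async(self, key, command):
        # Batch.submit_job puts the Airflow shell command on a job queue
        response = self.client.submit_job(
            jobName="airflow-task",
            jobQueue=self.job_queue,
            jobDefinition=self.job_definition,
            containerOverrides={"command": command},
        )
        self.active_jobs[response["jobId"]] = key

    def sync(self):
        # Batch.describe_jobs reports container status on every heartbeat;
        # finished tasks are removed from the active set and reported back
        done = {}
        if not self.active_jobs:
            return done
        response = self.client.describe_jobs(jobs=list(self.active_jobs))
        for job in response["jobs"]:
            if job["status"] in ("SUCCEEDED", "FAILED"):
                done[self.active_jobs.pop(job["jobId"])] = job["status"]
        return done
```

With real credentials, `boto3.client("batch")` satisfies the same interface; per the removed section, the AwsEcsFargateExecutor follows the same pattern using `run_task`/`describe_tasks` instead.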
```diff
@@ -271,7 +242,7 @@ Please file a ticket in GitHub for issues. Be persistent and be polite.
 
 
 ## Contribution & Development
-This repository uses Travis-CI for CI, pytest for Integration/Unit tests, and isort+pylint for code-style.
+This repository uses Github Actions for CI, pytest for Integration/Unit tests, and isort+pylint for code-style.
 Pythonic Type-Hinting is encouraged. From the bottom of my heart, thank you to everyone who has contributed
 to making Airflow better.
 
```