Pass SchedulerJob.subdir to Dagbag (apache#13291)
Because `SchedulerJob.subdir` was not passed through to `DagBag` in Airflow 2.0, every `SchedulerJob()` initialization would serialize all the DAGs under `settings.DAG_FOLDER` to the DB, regardless of the `subdir` the job was created with.

```
root@b11b273fdffb:/opt/airflow# pytest tests/jobs/test_scheduler_job.py -k test_dag_file_processor_process_task_instances --durations=0

Before:

 9 passed, 120 deselected, 2 warnings in 22.11s

After:

 9 passed, 120 deselected, 2 warnings in 10.56s

```
kaxil authored Dec 24, 2020
1 parent e2bfac9 commit 3f52f1a
Showing 3 changed files with 47 additions and 48 deletions.
airflow/jobs/scheduler_job.py (1 addition & 1 deletion)
```diff
@@ -734,7 +734,7 @@ def __init__(
         self.max_tis_per_query: int = conf.getint('scheduler', 'max_tis_per_query')
         self.processor_agent: Optional[DagFileProcessorAgent] = None

-        self.dagbag = DagBag(read_dags_from_db=True)
+        self.dagbag = DagBag(dag_folder=self.subdir, read_dags_from_db=True)

     def register_signals(self) -> None:
         """Register signals that stop child processes"""
```
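The shape of the fix can be sketched without Airflow itself. The stub below is illustrative only (not Airflow's real `DagBag` or `SchedulerJob`), assuming that when `dag_folder` is omitted, `DagBag` falls back to a global DAG-folder setting, which is what made the scheduler scan everything before this change:

```python
DAG_FOLDER = "/opt/airflow/dags"  # stand-in for settings.DAG_FOLDER


class DagBag:
    """Stub mirroring the fallback behavior described in the commit message."""

    def __init__(self, dag_folder=None, read_dags_from_db=False):
        # Without an explicit dag_folder, everything under DAG_FOLDER is in scope.
        self.dag_folder = dag_folder if dag_folder is not None else DAG_FOLDER
        self.read_dags_from_db = read_dags_from_db


class SchedulerJob:
    """Stub of the scheduler job after the fix."""

    def __init__(self, subdir=DAG_FOLDER):
        self.subdir = subdir
        # Before the fix: DagBag(read_dags_from_db=True), so the job's subdir
        # was ignored and the whole DAG_FOLDER was scanned.
        # After the fix: subdir is forwarded, narrowing the scan.
        self.dagbag = DagBag(dag_folder=self.subdir, read_dags_from_db=True)


job = SchedulerJob(subdir="/opt/airflow/dags/team_a")
print(job.dagbag.dag_folder)  # the narrower path, not the global default
```

In tests this matters because many fixtures construct `SchedulerJob` with a small `subdir`; forwarding it avoids serializing unrelated DAGs, which is where the ~2x speedup in the pytest timings above comes from.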
