Commit

[AIRFLOW-548] Load DAGs immediately & continually
A recent commit changed the scheduler behavior so that it now always stops
after a specified period of time. The operational scripts (systemd etc.) have
not been updated for this behavior, and many users actually prefer to run the
scheduler continuously.

Secondly, the default behavior was changed to not pick up new DAGs
immediately, which has led to confusion among users.

Closes apache#1823 from bolkedebruin/fix_duration
bolkedebruin authored and r39132 committed Oct 6, 2016
1 parent 4d567f4 commit fe5eaab
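
With these defaults, the [scheduler] section of a user's airflow.cfg ends up
looking roughly like the sketch below (values copied from the
configuration.py diff further down; surrounding keys abbreviated):

[scheduler]
scheduler_heartbeat_sec = 5
# -1 indicates to run continuously (see also num_runs)
run_duration = -1
# pick up new DAG files from the filesystem immediately
min_file_process_interval = 0
dag_dir_list_interval = 300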
Showing 2 changed files with 10 additions and 3 deletions.
airflow/configuration.py (11 changes: 9 additions & 2 deletions)
@@ -319,10 +319,17 @@ def run_command(command):
 # how often the scheduler should run (in seconds).
 scheduler_heartbeat_sec = 5
-run_duration = 1800
+# after how much time should the scheduler terminate in seconds
+# -1 indicates to run continuously (see also num_runs)
+run_duration = -1
+# after how much time a new DAGs should be picked up from the filesystem
+min_file_process_interval = 0
 dag_dir_list_interval = 300
 # How often should stats be printed to the logs
 print_stats_interval = 30
-min_file_process_interval = 180
 child_process_log_directory = /tmp/airflow/scheduler/logs
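
For context, these scheduler defaults are read at runtime through Airflow's
configuration accessors; a minimal sketch, assuming the module-level
configuration.getint helper of that era (the variable names below are
illustrative, not taken from this commit):

from airflow import configuration as conf

# -1 (the new run_duration default) means "no time limit"; see the jobs.py
# change below. min_file_process_interval = 0 makes new DAG files eligible
# for scheduling immediately.
run_duration = conf.getint('scheduler', 'run_duration')
min_file_process_interval = conf.getint('scheduler', 'min_file_process_interval')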
airflow/jobs.py (2 changes: 1 addition & 1 deletion)
@@ -1324,7 +1324,7 @@ def _execute_helper(self, processor_manager):

         # For the execute duration, parse and schedule DAGs
         while (datetime.now() - execute_start_time).total_seconds() < \
-                self.run_duration:
+                self.run_duration or self.run_duration < 0:
             self.logger.debug("Starting Loop...")
             loop_start_time = time.time()

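The effect of the new condition can be shown with a small standalone sketch
(the helper keep_running is hypothetical, not Airflow code; it only mirrors
the while condition above):

from datetime import datetime, timedelta

def keep_running(run_duration, execute_start_time):
    # Mirrors the loop condition in _execute_helper: a negative
    # run_duration (the new -1 default) disables the time limit.
    elapsed = (datetime.now() - execute_start_time).total_seconds()
    return elapsed < run_duration or run_duration < 0

start = datetime.now() - timedelta(hours=1)
print(keep_running(1800, start))  # False: 3600s elapsed exceeds the 1800s limit
print(keep_running(-1, start))    # True: a negative duration never expires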