[AIRFLOW-6663] Prepare backporting packages (apache#7391)
potiuk authored Feb 24, 2020
1 parent 0ec2774 commit ccb2899
Showing 24 changed files with 538 additions and 18 deletions.
11 changes: 9 additions & 2 deletions .pre-commit-config.yaml
@@ -202,6 +202,13 @@ repos:
files: ^BREEZE.rst$|^breeze$|^breeze-complete$
pass_filenames: false
require_serial: true
- id: update-setup-cfg-file
name: Update setup.cfg file with all licenses
entry: "./scripts/ci/pre_commit_setup_cfg_file.sh"
language: system
files: ^setup.cfg$
pass_filenames: false
require_serial: true
- id: pydevd
language: pygrep
name: Check for pydevd debug statements accidentally left
@@ -285,14 +292,14 @@ repos:
language: system
entry: "./scripts/ci/pre_commit_mypy.sh"
files: \.py$
exclude: ^airflow/_vendor/.*$|^dev
exclude: ^airflow/_vendor/.*$|^dev|^backport_packages
require_serial: true
- id: pylint
name: Run pylint for main sources
language: system
entry: "./scripts/ci/pre_commit_pylint_main.sh"
files: \.py$
exclude: ^tests/.*\.py$|^airflow/_vendor/.*|^scripts/.*\.py$|^dev
exclude: ^tests/.*\.py$|^airflow/_vendor/.*|^scripts/.*\.py$|^dev|^backport_packages
pass_filenames: true
require_serial: true # Pylint tests should be run in one chunk to detect all cycles
- id: pylint-tests
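The ``exclude`` values in the hooks above are Python regular expressions matched against repository-relative file paths; the change adds ``^backport_packages`` so the new tree is skipped by mypy and pylint. A quick sketch (assuming that search-based matching) of what the updated pylint pattern does:

```python
import re

# The updated pylint exclude pattern from the hook above.
exclude = re.compile(
    r"^tests/.*\.py$|^airflow/_vendor/.*|^scripts/.*\.py$|^dev|^backport_packages"
)

# Files under the excluded trees are skipped...
assert exclude.search("backport_packages/setup_backport_packages.py")
assert exclude.search("tests/test_core.py")
# ...while the main sources are still checked.
assert exclude.search("airflow/models/dag.py") is None
```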
4 changes: 4 additions & 0 deletions .travis.yml
@@ -35,6 +35,10 @@ services:
- docker
jobs:
include:
- name: "Prepare backport packages"
before_install: echo
stage: pre-test
script: ./scripts/ci/ci_prepare_backport_packages.sh
- name: "Static checks"
stage: pre-test
script: ./scripts/ci/ci_run_all_static_checks.sh
40 changes: 40 additions & 0 deletions CONTRIBUTING.rst
@@ -826,3 +826,43 @@ Resources & Links
- `Airflow’s official documentation <http://airflow.apache.org/>`__

- `More resources and links to Airflow related content on the Wiki <https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Links>`__

Preparing backport packages
===========================

As part of the preparation for Airflow 2.0 we decided to prepare backports of the providers packages that
can be installed in an Airflow 1.10.*, Python 3.6+ environment.
Some of those packages will soon (after testing) be officially released via PyPI, but you can easily build
and prepare such packages on your own.

* The ``setup_backport_packages.py`` script only works with Python 3.6+. This is also the minimum
  supported Python version for using the packages.

* Make sure you have ``setuptools`` and ``wheel`` installed in your Python environment. The easiest way
  to do so is to run ``pip install setuptools wheel``.

* Enter the ``backport_packages`` directory

* Usually you only build some of the provider packages. The ``providers`` directory is split into
  separate providers. You can see the list of all available providers by running
  ``python setup_backport_packages.py list-backport-packages``. You can build a single backport package
  by running ``python setup_backport_packages.py <PROVIDER_NAME> bdist_wheel``. Note that there
  are dependencies between some packages that can prevent a subset of the packages
  from being used without installing the packages they depend on. This will be solved soon by
  adding cross-dependencies between packages.

* You can build an 'all providers' package by running
  ``python setup_backport_packages.py providers bdist_wheel``. This package contains all providers, so
  it does not have cross-dependency issues.

* This creates a wheel package in your ``dist`` folder with a name similar to:
``apache_airflow_providers-0.0.1-py2.py3-none-any.whl``

* You can install this package with ``pip install <PACKAGE_FILE>``.


* You can also build an sdist (source distribution package) by running
  ``python setup_backport_packages.py <PROVIDER_NAME> sdist``, but this is only needed if you distribute the packages.
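The wheel filename produced above follows the PEP 427 naming convention,
``{distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl``. A small sketch of reading those
tags (the parsing helper is illustrative, not part of the repository, and ignores the optional build tag):

```python
def parse_wheel_filename(filename: str) -> dict:
    """Split a simple PEP 427 wheel filename into its components."""
    stem = filename[: -len(".whl")]
    distribution, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "distribution": distribution,
        "version": version,
        "python": python_tag,      # py2.py3 -> pure Python, runs on 2 and 3
        "abi": abi_tag,            # none   -> no compiled-extension ABI
        "platform": platform_tag,  # any    -> platform independent
    }

tags = parse_wheel_filename("apache_airflow_providers-0.0.1-py2.py3-none-any.whl")
```

The ``py2.py3-none-any`` tags explain why one wheel can serve every 1.10.* environment: the backport packages ship no compiled code.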

Note that these are still unofficial packages - they are not yet released on PyPI - but you can use them to
test the master versions of operators/hooks/sensors in an Airflow 1.10.* environment with Python 3.6+.
2 changes: 1 addition & 1 deletion MANIFEST.in
@@ -20,7 +20,7 @@ include NOTICE
include LICENSE
include CHANGELOG.txt
include README.md
graft licenses/
graft licenses
graft airflow/www
graft airflow/www/static
graft airflow/www/templates
25 changes: 25 additions & 0 deletions README.md
@@ -42,6 +42,7 @@ Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The
- [Beyond the Horizon](#beyond-the-horizon)
- [Principles](#principles)
- [User Interface](#user-interface)
- [Using hooks and Operators from "master" in Airflow 1.10](#using-hooks-and-operators-from-master-in-airflow-110)
- [Contributing](#contributing)
- [Who uses Apache Airflow?](#who-uses-apache-airflow)
- [Who Maintains Apache Airflow?](#who-maintains-apache-airflow)
@@ -107,6 +108,30 @@ unit of work and continuity.
![](/docs/img/code.png)


## Using hooks and Operators from "master" in Airflow 1.10

Currently, stable versions of Apache Airflow are released in the 1.10.* series. We are working on the
future, major version of Airflow - the 2.0.* series. It is going to be released in
2020, but the exact release date depends on many factors and is not yet known.
We already have a lot of changes in the hooks/operators/sensors for many external systems,
but they cannot be used yet because they are part of the master/2.0 release.

In Airflow 2.0 - following AIP-21 "change in import paths" - all the non-core operators/hooks/sensors
of Apache Airflow have been moved to the "airflow.providers" package. This opened the possibility of
using the operators from Airflow 2.0 in Airflow 1.10, with the constraint that those
packages can only be used in a Python 3.6+ environment.

Therefore we decided to prepare and release backport packages that can be installed
on older Airflow versions. Those backport packages are released more frequently, and users do not
have to upgrade their Airflow version to use them. There are a number of changes
between Airflow 2.0 and 1.10.* - documented in [UPDATING.md](UPDATING.md). With the backported
providers packages, users can migrate their DAGs to the new providers packages incrementally,
and once they have converted to the new operators/sensors/hooks they can seamlessly migrate their
environments to Airflow 2.0.

More information about the status and releases of the back-ported packages is available
at the [Backported providers package page](https://cwiki.apache.org/confluence/display/AIRFLOW/Backported+providers+packages+for+Airflow+1.10.*+series).
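One common pattern this incremental migration enables is importing from the new `airflow.providers` path when the backport package is installed, and falling back to the old 1.10 location otherwise. A minimal, generic sketch - the shim below is illustrative, not Airflow API, and the module paths in the usage comment are hypothetical:

```python
import importlib


def import_first(*module_paths):
    """Return the first importable module from a list of candidate paths."""
    for path in module_paths:
        try:
            return importlib.import_module(path)
        except ImportError:
            continue
    raise ImportError("none of %s could be imported" % (module_paths,))


# Hypothetical usage in a DAG file:
# gcs = import_first(
#     "airflow.providers.google.cloud.hooks.gcs",  # Airflow 2.0 / backport path
#     "airflow.contrib.hooks.gcs_hook",            # Airflow 1.10 path
# )
```

A shim like this keeps one DAG file working before and after the backport package is installed, so operators can be converted one at a time.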

## Contributing

Want to help build Apache Airflow? Check out our [contributing documentation](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst).
1 change: 1 addition & 0 deletions backport_packages/.gitignore
@@ -0,0 +1 @@
*.egg-info
1 change: 1 addition & 0 deletions backport_packages/CHANGELOG.txt
1 change: 1 addition & 0 deletions backport_packages/LICENSE
25 changes: 25 additions & 0 deletions backport_packages/MANIFEST.in
@@ -0,0 +1,25 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

include NOTICE
include LICENSE
include CHANGELOG.txt
include README.md
include ../airflow/git_version
graft licenses
global-exclude __pycache__ *.pyc
1 change: 1 addition & 0 deletions backport_packages/NOTICE
1 change: 1 addition & 0 deletions backport_packages/README.md
1 change: 1 addition & 0 deletions backport_packages/airflow/.gitignore
@@ -0,0 +1 @@
providers
16 changes: 16 additions & 0 deletions backport_packages/airflow/__init__.py
@@ -0,0 +1,16 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
1 change: 1 addition & 0 deletions backport_packages/airflow/version.py
1 change: 1 addition & 0 deletions backport_packages/dist
1 change: 1 addition & 0 deletions backport_packages/licenses
1 change: 1 addition & 0 deletions backport_packages/setup.cfg
