[FLINK-27058][python][build] Add support for Python 3.9
This closes apache#19895.
a49a authored and HuangXingBo committed Jun 10, 2022
1 parent 18c13fe commit 21ae10d
Showing 16 changed files with 26 additions and 24 deletions.
2 changes: 1 addition & 1 deletion docs/content.zh/docs/dev/python/datastream_tutorial.md
@@ -48,7 +48,7 @@ Apache Flink provides the DataStream API for building robust, stateful streaming
First, you need to prepare the following environment on your computer:

* Java 11
* Python 3.6, 3.7 or 3.8
* Python 3.6, 3.7, 3.8 or 3.9

Using the Python DataStream API requires installing PyFlink, which is published on [PyPI](https://pypi.org/project/apache-flink/) and can be installed quickly with `pip`.

2 changes: 1 addition & 1 deletion docs/content.zh/docs/dev/python/installation.md
@@ -33,7 +33,7 @@ under the License.

```bash
$ python --version
# the version printed here must be 3.6, 3.7 or 3.8
# the version printed here must be 3.6, 3.7, 3.8 or 3.9
```

## Environment Setup
2 changes: 1 addition & 1 deletion docs/content.zh/docs/dev/python/table_api_tutorial.md
@@ -50,7 +50,7 @@ Apache Flink provides the Table API, a relational API to unify stream and batch processing, i.e. queries
If you want to follow along, you will require a computer with:

* Java 11
* Python 3.6, 3.7 or 3.8
* Python 3.6, 3.7, 3.8 or 3.9

Using the Python Table API requires installing PyFlink, which has been published to [PyPi](https://pypi.org/project/apache-flink/); you can install PyFlink as follows:

2 changes: 1 addition & 1 deletion docs/content.zh/docs/flinkDev/building.md
@@ -79,7 +79,7 @@ mvn clean install -DskipTests -Dfast -Pskip-webui-build -T 1C

```shell
$ python --version
# the version printed here must be 3.6, 3.7 or 3.8
# the version printed here must be 3.6, 3.7, 3.8 or 3.9
```

3. Build PyFlink's Cython extension module (optional)
2 changes: 1 addition & 1 deletion docs/content/docs/dev/python/datastream_tutorial.md
@@ -47,7 +47,7 @@ In particular, Apache Flink's [user mailing list](https://flink.apache.org/commu
If you want to follow along, you will require a computer with:

* Java 11
* Python 3.6, 3.7 or 3.8
* Python 3.6, 3.7, 3.8 or 3.9

Using Python DataStream API requires installing PyFlink, which is available on [PyPI](https://pypi.org/project/apache-flink/) and can be easily installed using `pip`.

4 changes: 2 additions & 2 deletions docs/content/docs/dev/python/installation.md
@@ -29,12 +29,12 @@ under the License.
## Environment Requirements

{{< hint info >}}
Python version (3.6, 3.7 or 3.8) is required for PyFlink. Please run the following command to make sure that it meets the requirements:
Python version (3.6, 3.7, 3.8 or 3.9) is required for PyFlink. Please run the following command to make sure that it meets the requirements:
{{< /hint >}}

```bash
$ python --version
# the version printed here must be 3.6, 3.7 or 3.8
# the version printed here must be 3.6, 3.7, 3.8 or 3.9
```
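The docs above ask users to check the interpreter version by hand with `python --version`. The same constraint can be expressed programmatically; the following is a minimal sketch (the helper name and set are illustrative, not part of PyFlink):

```python
import sys

# Supported CPython (major, minor) pairs as of this commit; 3.9 is newly added.
SUPPORTED = {(3, 6), (3, 7), (3, 8), (3, 9)}

def is_supported(version_info=None):
    """Return True if the interpreter's (major, minor) pair is supported."""
    vi = sys.version_info if version_info is None else version_info
    return (vi[0], vi[1]) in SUPPORTED
```

Such a check fails fast with a clear answer instead of leaving the user to compare version strings by eye.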

## Environment Setup
2 changes: 1 addition & 1 deletion docs/content/docs/dev/python/table/udfs/python_udfs.md
@@ -28,7 +28,7 @@ under the License.

User-defined functions are important features, because they significantly extend the expressiveness of Python Table API programs.

**NOTE:** Python UDF execution requires Python version (3.6, 3.7 or 3.8) with PyFlink installed. It's required on both the client side and the cluster side.
**NOTE:** Python UDF execution requires Python version (3.6, 3.7, 3.8 or 3.9) with PyFlink installed. It's required on both the client side and the cluster side.

## Scalar Functions

@@ -33,7 +33,7 @@ These Python libraries are highly optimized and provide high-performance data st
[non-vectorized user-defined functions]({{< ref "docs/dev/python/table/udfs/python_udfs" >}}) on how to define vectorized user-defined functions.
Users only need to add an extra parameter `func_type="pandas"` in the decorator `udf` or `udaf` to mark it as a vectorized user-defined function.

**NOTE:** Python UDF execution requires Python version (3.6, 3.7 or 3.8) with PyFlink installed. It's required on both the client side and the cluster side.
**NOTE:** Python UDF execution requires Python version (3.6, 3.7, 3.8 or 3.9) with PyFlink installed. It's required on both the client side and the cluster side.

## Vectorized Scalar Functions

2 changes: 1 addition & 1 deletion docs/content/docs/dev/python/table_api_tutorial.md
@@ -51,7 +51,7 @@ In particular, Apache Flink's [user mailing list](https://flink.apache.org/commu
If you want to follow along, you will require a computer with:

* Java 11
* Python 3.6, 3.7 or 3.8
* Python 3.6, 3.7, 3.8 or 3.9

Using Python Table API requires installing PyFlink, which is available on [PyPI](https://pypi.org/project/apache-flink/) and can be easily installed using `pip`.

4 changes: 2 additions & 2 deletions docs/content/docs/flinkDev/building.md
@@ -74,11 +74,11 @@ The `fast` and `skip-webui-build` profiles have a significant impact on the buil

If you want to build a PyFlink package that can be used for pip installation, you need to build the Flink project first, as described in [Build Flink](#build-flink).

2. Python version(3.6, 3.7 or 3.8) is required
2. Python version(3.6, 3.7, 3.8 or 3.9) is required

```shell
$ python --version
# the version printed here must be 3.6, 3.7 or 3.8
# the version printed here must be 3.6, 3.7, 3.8 or 3.9
```

3. Build PyFlink with Cython extension support (optional)
3 changes: 2 additions & 1 deletion flink-python/apache-flink-libraries/setup.py
@@ -225,7 +225,8 @@ def find_file_path(pattern):
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8'],
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9'],
)
finally:
if in_flink_source:
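The classifier additions in both `setup.py` files follow a fixed pattern, one Trove classifier string per supported version. One way to keep such a list in sync with a single supported-version list is to generate it; this is a sketch of the idea, not how Flink's build actually does it:

```python
# Single source of truth for the supported interpreter versions.
SUPPORTED_VERSIONS = ["3.6", "3.7", "3.8", "3.9"]

def python_classifiers(versions):
    """Build the Trove classifier string for each supported version."""
    return ["Programming Language :: Python :: %s" % v for v in versions]
```

With this approach, adding 3.9 support would be a one-line change instead of edits scattered across several files.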
2 changes: 1 addition & 1 deletion flink-python/dev/build-wheels.sh
@@ -19,7 +19,7 @@ set -e -x
dev/lint-python.sh -s py_env

PY_ENV_DIR=`pwd`/dev/.conda/envs
py_env=("3.6" "3.7" "3.8")
py_env=("3.6" "3.7" "3.8" "3.9")
## 2. install dependency
for ((i=0;i<${#py_env[@]};i++)) do
${PY_ENV_DIR}/${py_env[i]}/bin/pip install -r dev/dev-requirements.txt
2 changes: 1 addition & 1 deletion flink-python/dev/dev-requirements.txt
@@ -28,7 +28,7 @@ pyarrow>=0.15.1,<7.0.0; python_version < '3.7'
pytz>=2018.3
numpy>=1.21.4,<1.22.0; python_version >= '3.7'
numpy>=1.14.3,<1.20; python_version < '3.7'
fastavro>=0.21.4,<0.24
fastavro>=1.1.0,<1.4.8
grpcio>=1.29.0,<2
grpcio-tools>=1.3.5,<=1.14.2
pemja==0.1.5; python_version >= '3.7' and platform_system != 'Windows'
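The `; python_version < '3.7'` suffixes in `dev-requirements.txt` are PEP 508 environment markers, which pip evaluates against the installing interpreter. They can also be evaluated directly with the `packaging` library; here they are checked against explicit environment dicts rather than the current interpreter:

```python
from packaging.markers import Marker

marker = Marker("python_version < '3.7'")

# evaluate() defaults to the current interpreter's environment; passing an
# explicit environment dict overrides it, which makes the logic easy to test.
on_36 = marker.evaluate({"python_version": "3.6"})  # True: 3.6 < 3.7
on_39 = marker.evaluate({"python_version": "3.9"})  # False: 3.9 >= 3.7
```

This is why the file can carry two `numpy` lines at once: only the line whose marker evaluates to true is installed.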
10 changes: 5 additions & 5 deletions flink-python/dev/lint-python.sh
@@ -226,9 +226,9 @@ function install_miniconda() {
# Install some kinds of py env.
function install_py_env() {
if [[ ${BUILD_REASON} = 'IndividualCI' ]]; then
py_env=("3.8")
py_env=("3.9")
else
py_env=("3.6" "3.7" "3.8")
py_env=("3.6" "3.7" "3.8" "3.9")
fi
for ((i=0;i<${#py_env[@]};i++)) do
if [ -d "$CURRENT_DIR/.conda/envs/${py_env[i]}" ]; then
@@ -403,7 +403,7 @@ function install_environment() {
fi

# step-3 install python environment which includes
# 3.6 3.7 3.8
# 3.6 3.7 3.8 3.9
if [ $STEP -lt 3 ] && [ `need_install_component "py_env"` = true ]; then
print_function "STEP" "installing python environment..."
install_py_env
@@ -584,7 +584,7 @@ function check_stage() {
#########################
# Tox check
function tox_check() {
LATEST_PYTHON="py38"
LATEST_PYTHON="py39"
print_function "STAGE" "tox checks"
# Set created py-env in $PATH for tox's creating virtual env
activate
@@ -778,7 +778,7 @@ usage: $0 [options]
-l list all checks supported.
Examples:
./lint-python -s basic => install environment with basic components.
./lint-python -s py_env => install environment with python env(3.6,3.7,3.8).
./lint-python -s py_env => install environment with python env(3.6,3.7,3.8,3.9).
./lint-python -s all => install environment with all components such as python env,tox,flake8,sphinx,mypy etc.
./lint-python -s tox,flake8 => install environment with tox,flake8.
./lint-python -s tox -f => reinstall environment with tox.
7 changes: 4 additions & 3 deletions flink-python/setup.py
@@ -299,7 +299,7 @@ def extracted_output_files(base_dir, file_path, output_directory):

install_requires = ['py4j==0.10.9.3', 'python-dateutil==2.8.0', 'apache-beam==2.38.0',
'cloudpickle==2.1.0', 'avro-python3>=1.8.1,!=1.9.2,<1.10.0',
'pytz>=2018.3', 'fastavro>=0.21.4,<0.24', 'requests>=2.26.0',
'pytz>=2018.3', 'fastavro>=1.1.0,<1.4.8', 'requests>=2.26.0',
'protobuf<3.18',
'pemja==0.1.5;'
'python_full_version >= "3.7" and platform_system != "Windows"',
Expand All @@ -311,7 +311,7 @@ def extracted_output_files(base_dir, file_path, output_directory):
install_requires.append('pandas>=1.0,<1.2.0')
install_requires.append('pyarrow>=0.15.1,<7.0.0')
else:
# python 3.7 3.8 upper limit and M1 chip lower limit,
# python 3.7, 3.8 and 3.9 upper limit and M1 chip lower limit,
install_requires.append('numpy>=1.21.4,<1.22.0')
install_requires.append('pandas>=1.3.0,<1.4.0')
install_requires.append('pyarrow>=5.0.0,<9.0.0')
@@ -341,7 +341,8 @@ def extracted_output_files(base_dir, file_path, output_directory):
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8'],
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9'],
ext_modules=extensions
)
finally:
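The `install_requires` branch in `setup.py` pins different numpy/pandas/pyarrow ranges depending on the interpreter version. A simplified sketch of that selection logic follows; the version pins are copied from the diff, while the helper name is invented for illustration:

```python
import sys

def extra_requires(version_info=sys.version_info):
    """Pick the numpy/pandas/pyarrow pins for the running interpreter line."""
    if version_info < (3, 7):
        # Python 3.6: the older toolchain is still supported.
        return ['numpy>=1.14.3,<1.20', 'pandas>=1.0,<1.2.0',
                'pyarrow>=0.15.1,<7.0.0']
    # Python 3.7-3.9: raised upper limits, with M1-compatible lower limits.
    return ['numpy>=1.21.4,<1.22.0', 'pandas>=1.3.0,<1.4.0',
            'pyarrow>=5.0.0,<9.0.0']
```

Branching in `setup.py` like this is the sdist-era alternative to per-version wheels: the pins are resolved at install time on the target machine.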
2 changes: 1 addition & 1 deletion flink-python/tox.ini
@@ -21,7 +21,7 @@
# in multiple virtualenvs. This configuration file will run the
# test suite on all supported python versions.
# new environments will be excluded by default unless explicitly added to envlist.
envlist = {py36, py37, py38}-cython
envlist = {py36, py37, py38, py39}-cython

[testenv]
whitelist_externals=
