[SPARK-44222][BUILD][PYTHON] Upgrade grpc to 1.56.0 with lower/upperbound

### What changes were proposed in this pull request?

This PR reverts the revert of apache#41767, this time setting gRPC lower bounds as well.

### Why are the changes needed?

See apache#41767

### Does this PR introduce _any_ user-facing change?

See apache#41767

### How was this patch tested?

Manually tested with Conda environment, with `pip install -r dev/requirements.txt` in Python 3.9, Python 3.10 and Python 3.11.

Closes apache#41997 from HyukjinKwon/SPARK-44222.

Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
HyukjinKwon committed Jul 14, 2023
1 parent a47bde2 commit de59caa
Showing 8 changed files with 18 additions and 18 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/build_and_test.yml

@@ -256,7 +256,7 @@ jobs:
       - name: Install Python packages (Python 3.8)
         if: (contains(matrix.modules, 'sql') && !contains(matrix.modules, 'sql-'))
         run: |
-          python3.8 -m pip install 'numpy>=1.20.0' pyarrow pandas scipy unittest-xml-reporting 'grpcio==1.48.1' 'protobuf==3.19.5'
+          python3.8 -m pip install 'numpy>=1.20.0' pyarrow pandas scipy unittest-xml-reporting 'grpcio==1.56.0' 'protobuf==3.19.5'
           python3.8 -m pip list
       # Run the tests.
       - name: Run tests
@@ -625,7 +625,7 @@ jobs:
           # Jinja2 3.0.0+ causes error when building with Sphinx.
           # See also https://issues.apache.org/jira/browse/SPARK-35375.
           python3.9 -m pip install 'flake8==3.9.0' pydata_sphinx_theme 'mypy==0.982' 'pytest==7.1.3' 'pytest-mypy-plugins==1.9.3' numpydoc 'jinja2<3.0.0' 'black==22.6.0'
-          python3.9 -m pip install 'pandas-stubs==1.2.0.53' ipython 'grpcio==1.48.1' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0'
+          python3.9 -m pip install 'pandas-stubs==1.2.0.53' ipython 'grpcio==1.56.0' 'grpc-stubs==1.24.11' 'googleapis-common-protos-stubs==2.2.0'
       - name: Python linter
         run: PYTHON_EXECUTABLE=python3.9 ./dev/lint-python
       - name: Install dependencies for Python code generation check
4 changes: 2 additions & 2 deletions connector/connect/common/src/main/buf.gen.yaml

@@ -22,14 +22,14 @@ plugins:
     out: gen/proto/csharp
   - plugin: buf.build/protocolbuffers/java:v21.7
     out: gen/proto/java
-  - remote: buf.build/grpc/plugins/ruby:v1.47.0-1
+  - plugin: buf.build/grpc/ruby:v1.56.0
     out: gen/proto/ruby
   - plugin: buf.build/protocolbuffers/ruby:v21.7
     out: gen/proto/ruby
   # Building the Python build and building the mypy interfaces.
   - plugin: buf.build/protocolbuffers/python:v21.7
     out: gen/proto/python
-  - remote: buf.build/grpc/plugins/python:v1.47.0-1
+  - plugin: buf.build/grpc/python:v1.56.0
     out: gen/proto/python
   - name: mypy
     out: gen/proto/python
2 changes: 1 addition & 1 deletion dev/create-release/spark-rm/Dockerfile

@@ -42,7 +42,7 @@ ARG APT_INSTALL="apt-get install --no-install-recommends -y"
 # We should use the latest Sphinx version once this is fixed.
 # TODO(SPARK-35375): Jinja2 3.0.0+ causes error when building with Sphinx.
 # See also https://issues.apache.org/jira/browse/SPARK-35375.
-ARG PIP_PKGS="sphinx==3.0.4 mkdocs==1.1.2 numpy==1.20.3 pydata_sphinx_theme==0.4.1 ipython==7.19.0 nbsphinx==0.8.0 numpydoc==1.1.0 jinja2==2.11.3 twine==3.4.1 sphinx-plotly-directive==0.1.3 pandas==1.5.3 pyarrow==3.0.0 plotly==5.4.0 markupsafe==2.0.1 docutils<0.17 grpcio==1.48.1 protobuf==4.21.6 grpcio-status==1.48.1 googleapis-common-protos==1.56.4"
+ARG PIP_PKGS="sphinx==3.0.4 mkdocs==1.1.2 numpy==1.20.3 pydata_sphinx_theme==0.4.1 ipython==7.19.0 nbsphinx==0.8.0 numpydoc==1.1.0 jinja2==2.11.3 twine==3.4.1 sphinx-plotly-directive==0.1.3 pandas==1.5.3 pyarrow==3.0.0 plotly==5.4.0 markupsafe==2.0.1 docutils<0.17 grpcio==1.56.0 protobuf==4.21.6 grpcio-status==1.56.0 googleapis-common-protos==1.56.4"
 ARG GEM_PKGS="bundler:2.3.8"

 # Install extra needed repos and refresh.
4 changes: 2 additions & 2 deletions dev/requirements.txt

@@ -50,8 +50,8 @@ black==22.6.0
 py

 # Spark Connect (required)
-grpcio==1.48.1
-grpcio-status==1.48.1
+grpcio>=1.48,<1.57
+grpcio-status>=1.48,<1.57
 protobuf==3.19.5
 googleapis-common-protos==1.56.4
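The switch from an exact pin to a range pin means any grpcio release in the [1.48, 1.57) window now satisfies dev/requirements.txt. As a minimal sketch (not part of this PR; plain Python, assuming no `packaging` dependency), the new range can be checked like this:

```python
def parse_version(v: str) -> tuple:
    # "1.56.0" -> (1, 56, 0); sufficient for the simple numeric bounds used here.
    return tuple(int(part) for part in v.split("."))

def satisfies_grpcio_pin(installed: str) -> bool:
    # Mirrors the range pin grpcio>=1.48,<1.57 from dev/requirements.txt.
    return (1, 48) <= parse_version(installed)[:2] < (1, 57)

print(satisfies_grpcio_pin("1.56.0"))  # True: the version this PR upgrades to
print(satisfies_grpcio_pin("1.48.1"))  # True: the previous exact pin still qualifies
print(satisfies_grpcio_pin("1.57.0"))  # False: at or above the upper bound
```

This is why the old CI pin `grpcio==1.48.1` keeps working under the new requirement while the repository itself moves to 1.56.0.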
2 changes: 1 addition & 1 deletion pom.xml

@@ -288,7 +288,7 @@
     <!-- Version used in Connect -->
     <connect.guava.version>32.0.1-jre</connect.guava.version>
     <guava.failureaccess.version>1.0.1</guava.failureaccess.version>
-    <io.grpc.version>1.47.0</io.grpc.version>
+    <io.grpc.version>1.56.0</io.grpc.version>
     <mima.version>1.1.2</mima.version>
     <tomcat.annotations.api.version>6.0.53</tomcat.annotations.api.version>
2 changes: 1 addition & 1 deletion project/SparkBuild.scala

@@ -91,7 +91,7 @@ object BuildCommons {
   // SPARK-41247: needs to be consistent with `protobuf.version` in `pom.xml`.
   val protoVersion = "3.23.2"
   // GRPC version used for Spark Connect.
-  val gprcVersion = "1.47.0"
+  val gprcVersion = "1.56.0"
 }

 object SparkBuild extends PomBuild {
16 changes: 8 additions & 8 deletions python/docs/source/getting_started/install.rst

@@ -153,15 +153,15 @@ To install PySpark from source, refer to |building_spark|_.
 Dependencies
 ------------
 ========================== ========================= ======================================================================================
-Package                    Minimum supported version Note
+Package                    Supported version         Note
 ========================== ========================= ======================================================================================
-`py4j`                     0.10.9.7                  Required
-`pandas`                   1.0.5                     Required for pandas API on Spark and Spark Connect; Optional for Spark SQL
-`pyarrow`                  4.0.0                     Required for pandas API on Spark and Spark Connect; Optional for Spark SQL
-`numpy`                    1.15                      Required for pandas API on Spark and MLLib DataFrame-based API; Optional for Spark SQL
-`grpc`                     1.48.1                    Required for Spark Connect
-`grpcio-status`            1.48.1                    Required for Spark Connect
-`googleapis-common-protos` 1.56.4                    Required for Spark Connect
+`py4j`                     >=0.10.9.7                Required
+`pandas`                   >=1.0.5                   Required for pandas API on Spark and Spark Connect; Optional for Spark SQL
+`pyarrow`                  >=4.0.0                   Required for pandas API on Spark and Spark Connect; Optional for Spark SQL
+`numpy`                    >=1.15                    Required for pandas API on Spark and MLLib DataFrame-based API; Optional for Spark SQL
+`grpcio`                   >=1.48,<1.57              Required for Spark Connect
+`grpcio-status`            >=1.48,<1.57              Required for Spark Connect
+`googleapis-common-protos` ==1.56.4                  Required for Spark Connect
 ========================== ========================= ======================================================================================

 Note that PySpark requires Java 8 or later with ``JAVA_HOME`` properly set.
2 changes: 1 addition & 1 deletion python/setup.py

@@ -132,7 +132,7 @@ def _supports_symlinks():
 # Also don't forget to update python/docs/source/getting_started/install.rst.
 _minimum_pandas_version = "1.0.5"
 _minimum_pyarrow_version = "4.0.0"
-_minimum_grpc_version = "1.48.1"
+_minimum_grpc_version = "1.56.0"
 _minimum_googleapis_common_protos_version = "1.56.4"
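For illustration only: a hedged sketch of how minimum-version constants like `_minimum_grpc_version` are typically interpolated into pip requirement specifiers inside a `setup.py`. The constant names and values come from the diff above, but the `connect_requires` list is an assumption for illustration, not Spark's actual packaging code:

```python
# Constants as updated by this commit (from python/setup.py).
_minimum_grpc_version = "1.56.0"
_minimum_googleapis_common_protos_version = "1.56.4"

# Hypothetical requirement list built from those constants; Spark's real
# setup.py may structure its install/extras requirements differently.
connect_requires = [
    "grpcio>=%s" % _minimum_grpc_version,
    "grpcio-status>=%s" % _minimum_grpc_version,
    "googleapis-common-protos>=%s" % _minimum_googleapis_common_protos_version,
]
print(connect_requires)  # ['grpcio>=1.56.0', 'grpcio-status>=1.56.0', 'googleapis-common-protos>=1.56.4']
```

Bumping the single constant therefore raises the floor for every specifier derived from it, which is why the docs table and this file must be updated together (as the comment in the diff notes).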
