[SPARK-40073][INFRA][BUILD][CORE][SQL][AVRO][PYTHON] Replace `external/{moduleName}` with `connector/{moduleName}`

### What changes were proposed in this pull request?
SPARK-38569 renamed the top-level `external` directory to `connector`, but uses of `external/{moduleName}` still remain throughout Spark. This PR replaces them all with `connector/{moduleName}`.

### Why are the changes needed?
Spark should use `connector/{moduleName}` instead of the stale `external/{moduleName}` paths.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Pass GitHub Actions
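The sweep itself is mechanical. As a hedged illustration only (this helper is hypothetical and not part of the PR, which edits each file by hand), a script along these lines could perform the same path rewrite across a checkout:

```python
import os
import re

# Match "external/" only when it is followed by a module-like name
# ("external/avro", "external/kafka-0-10-sql"), so ordinary prose
# containing the word "external" is left alone.
PATTERN = re.compile(r"\bexternal/(?=[\w-])")

def rewrite_tree(root: str) -> int:
    """Rewrite external/<module> references under `root`; return files changed."""
    changed = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8") as f:
                    text = f.read()
            except (UnicodeDecodeError, OSError):
                continue  # skip binary or unreadable files
            new_text = PATTERN.sub("connector/", text)
            if new_text != text:
                with open(path, "w", encoding="utf-8") as f:
                    f.write(new_text)
                changed += 1
    return changed
```

After such a sweep, a `git grep 'external/'` over the tree is a quick way to confirm no stale references remain.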

Closes apache#37512 from LuciferYang/external-2-connector.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
LuciferYang authored and cloud-fan committed Aug 15, 2022
1 parent 87094f8 commit c7c03f1
Showing 8 changed files with 18 additions and 18 deletions.
4 changes: 2 additions & 2 deletions .github/labeler.yml
@@ -103,7 +103,7 @@ SQL:
- "**/*schema.R"
- "**/*types.R"
AVRO:
-- "external/avro/**/*"
+- "connector/avro/**/*"
- "python/pyspark/sql/avro/**/*"
DSTREAM:
- "streaming/**/*"
@@ -123,7 +123,7 @@ MLLIB:
- "python/pyspark/mllib/**/*"
STRUCTURED STREAMING:
- "**/sql/**/streaming/**/*"
-- "external/kafka-0-10-sql/**/*"
+- "connector/kafka-0-10-sql/**/*"
- "python/pyspark/sql/streaming/**/*"
- "python/pyspark/sql/tests/test_streaming.py"
- "**/*streaming.R"
2 changes: 1 addition & 1 deletion LICENSE
@@ -216,7 +216,7 @@ core/src/main/resources/org/apache/spark/ui/static/bootstrap*
core/src/main/resources/org/apache/spark/ui/static/jsonFormatter*
core/src/main/resources/org/apache/spark/ui/static/vis*
docs/js/vendor/bootstrap.js
-external/spark-ganglia-lgpl/src/main/java/com/codahale/metrics/ganglia/GangliaReporter.java
+connector/spark-ganglia-lgpl/src/main/java/com/codahale/metrics/ganglia/GangliaReporter.java


Python Software Foundation License
@@ -1816,7 +1816,7 @@ abstract class AvroSuite
// It generates input files for the test below:
// "SPARK-31183, SPARK-37705: compatibility with Spark 2.4/3.2 in reading dates/timestamps"
ignore("SPARK-31855: generate test files for checking compatibility with Spark 2.4/3.2") {
-val resourceDir = "external/avro/src/test/resources"
+val resourceDir = "connector/avro/src/test/resources"
val version = SPARK_VERSION_SHORT.replaceAll("\\.", "_")
def save(
in: Seq[String],
@@ -36,8 +36,8 @@
# run the example
$ bin/spark-submit --jars \
-'external/kinesis-asl-assembly/target/spark-streaming-kinesis-asl-assembly_*.jar' \
-external/kinesis-asl/src/main/python/examples/streaming/kinesis_wordcount_asl.py \
+'connector/kinesis-asl-assembly/target/spark-streaming-kinesis-asl-assembly_*.jar' \
+connector/kinesis-asl/src/main/python/examples/streaming/kinesis_wordcount_asl.py \
myAppName mySparkStream https://kinesis.us-east-1.amazonaws.com us-east-1
There is a companion helper class called KinesisWordProducerASL which puts dummy data
2 changes: 1 addition & 1 deletion dev/checkstyle-suppressions.xml
@@ -31,7 +31,7 @@
<suppress checks=".*"
files="core/src/main/java/org/apache/spark/util/collection/TimSort.java"/>
<suppress checks=".*"
-files="external/spark-ganglia-lgpl/src/main/java/com/codahale/metrics/ganglia/GangliaReporter.java"/>
+files="connector/spark-ganglia-lgpl/src/main/java/com/codahale/metrics/ganglia/GangliaReporter.java"/>
<suppress checks=".*"
files="sql/core/src/main/java/org/apache/spark/sql/api.java/*"/>
<suppress checks="LineLength"
18 changes: 9 additions & 9 deletions dev/sparktestsupport/modules.py
@@ -253,7 +253,7 @@ def __hash__(self):
name="avro",
dependencies=[sql],
source_file_regexes=[
-"external/avro",
+"connector/avro",
],
sbt_test_goals=[
"avro/test",
@@ -264,7 +264,7 @@ def __hash__(self):
name="sql-kafka-0-10",
dependencies=[sql],
source_file_regexes=[
-"external/kafka-0-10-sql",
+"connector/kafka-0-10-sql",
],
sbt_test_goals=[
"sql-kafka-0-10/test",
@@ -309,8 +309,8 @@ def __hash__(self):
name="streaming-kinesis-asl",
dependencies=[tags, core],
source_file_regexes=[
-"external/kinesis-asl/",
-"external/kinesis-asl-assembly/",
+"connector/kinesis-asl/",
+"connector/kinesis-asl-assembly/",
],
build_profile_flags=[
"-Pkinesis-asl",
@@ -327,9 +327,9 @@ def __hash__(self):
dependencies=[streaming, core],
source_file_regexes=[
# The ending "/" is necessary otherwise it will include "sql-kafka" codes
-"external/kafka-0-10/",
-"external/kafka-0-10-assembly",
-"external/kafka-0-10-token-provider",
+"connector/kafka-0-10/",
+"connector/kafka-0-10-assembly",
+"connector/kafka-0-10-token-provider",
],
sbt_test_goals=["streaming-kafka-0-10/test", "token-provider-kafka-0-10/test"],
)
@@ -761,15 +761,15 @@ def __hash__(self):
dependencies=[],
build_profile_flags=["-Pspark-ganglia-lgpl"],
source_file_regexes=[
-"external/spark-ganglia-lgpl",
+"connector/spark-ganglia-lgpl",
],
)

docker_integration_tests = Module(
name="docker-integration-tests",
dependencies=[sql],
build_profile_flags=["-Pdocker-integration-tests"],
-source_file_regexes=["external/docker-integration-tests"],
+source_file_regexes=["connector/docker-integration-tests"],
sbt_test_goals=["docker-integration-tests/test"],
environ=None
if "GITHUB_ACTIONS" not in os.environ
2 changes: 1 addition & 1 deletion python/pyspark/sql/avro/functions.py
@@ -140,7 +140,7 @@ def _test() -> None:
import sys
from pyspark.testing.utils import search_jar

-avro_jar = search_jar("external/avro", "spark-avro", "spark-avro")
+avro_jar = search_jar("connector/avro", "spark-avro", "spark-avro")
if avro_jar is None:
print(
"Skipping all Avro Python tests as the optional Avro project was "
2 changes: 1 addition & 1 deletion python/pyspark/testing/streamingutils.py
@@ -35,7 +35,7 @@
)
else:
kinesis_asl_assembly_jar = search_jar(
-"external/kinesis-asl-assembly",
+"connector/kinesis-asl-assembly",
"spark-streaming-kinesis-asl-assembly-",
"spark-streaming-kinesis-asl-assembly_",
)
