Commit dd48d05

[FLINK-24018][build] Remove Scala dependencies from Java APIs

zentol authored Oct 25, 2021
1 parent 055c8c8 commit dd48d05

Showing 153 changed files with 426 additions and 439 deletions.
docs/README.md (6 changes: 3 additions & 3 deletions)

@@ -103,14 +103,14 @@ to its documentation markdown. The following are available for use:

#### Flink Artifact

-{{< artifact flink-streaming-java withScalaVersion >}}
+{{< artifact flink-streaming-scala withScalaVersion >}}

-This will be replaced by the maven artifact for flink-streaming-java that users should copy into their pom.xml file. It will render out to:
+This will be replaced by the maven artifact for flink-streaming-scala that users should copy into their pom.xml file. It will render out to:

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-streaming-java_2.12</artifactId>
+  <artifactId>flink-streaming-scala_2.12</artifactId>
  <version><!-- current flink version --></version>
</dependency>
```
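For comparison, a sketch that is not part of this diff: following the rendering documented above, a Java-API shortcode without the `withScalaVersion` flag, e.g. `{{< artifact flink-streaming-java >}}`, should render to a suffix-free artifact after this commit:

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
  <!-- no _2.12 suffix: the Java API artifact no longer depends on Scala -->
  <artifactId>flink-streaming-java</artifactId>
  <version><!-- current flink version --></version>
</dependency>
```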
docs/content.zh/docs/connectors/datastream/elasticsearch.md (6 changes: 3 additions & 3 deletions)

@@ -42,15 +42,15 @@ under the License.

  <tbody>
    <tr>
      <td>5.x</td>
-      <td>{{< artifact flink-connector-elasticsearch5 withScalaVersion >}}</td>
+      <td>{{< artifact flink-connector-elasticsearch5 >}}</td>
    </tr>
    <tr>
      <td>6.x</td>
-      <td>{{< artifact flink-connector-elasticsearch6 withScalaVersion >}}</td>
+      <td>{{< artifact flink-connector-elasticsearch6 >}}</td>
    </tr>
    <tr>
      <td>7 and later versions</td>
-      <td>{{< artifact flink-connector-elasticsearch7 withScalaVersion >}}</td>
+      <td>{{< artifact flink-connector-elasticsearch7 >}}</td>
    </tr>
  </tbody>
</table>
docs/content.zh/docs/connectors/datastream/jdbc.md (2 changes: 1 addition & 1 deletion)

@@ -30,7 +30,7 @@ under the License.

Add the following dependency to use this connector (along with your JDBC driver):

-{{< artifact flink-connector-jdbc withScalaVersion >}}
+{{< artifact flink-connector-jdbc >}}

Note that this connector is currently __NOT__ part of the binary distribution. See [here]({{< ref "docs/dev/datastream/project-configuration" >}}) for how to run it on a cluster.
docs/content.zh/docs/connectors/datastream/kafka.md (2 changes: 1 addition & 1 deletion)

@@ -36,7 +36,7 @@ Apache Flink ships with a universal Kafka connector, which tries to track the latest version of the Kafka client

The current Kafka client is backwards compatible with Kafka brokers running version 0.10.0 or later.
For more details on Kafka compatibility, please refer to the official [Kafka documentation](https://kafka.apache.org/protocol.html#protocol_compatibility).

-{{< artifact flink-connector-kafka withScalaVersion >}}
+{{< artifact flink-connector-kafka >}}

If you are using the Kafka source, ```flink-connector-base``` also needs to be included as a dependency:
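Not part of this diff, but for reference: a sketch of what that extra dependency could look like in a pom.xml, following the rendering shown in docs/README.md. flink-connector-base carries no Scala suffix, so it is untouched by this commit:

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
  <!-- required when using the Kafka source; no Scala suffix -->
  <artifactId>flink-connector-base</artifactId>
  <version><!-- current flink version --></version>
</dependency>
```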
docs/content.zh/docs/connectors/datastream/kinesis.md (6 changes: 3 additions & 3 deletions)

@@ -31,14 +31,14 @@ The Kinesis connector provides access to [Amazon AWS Kinesis Streams](http://aws

To use the connector, add the following Maven dependency to your project:

-{{< artifact flink-connector-kinesis withScalaVersion >}}
+{{< artifact flink-connector-kinesis >}}

{{< hint warning >}}
-**Attention** Prior to Flink version 1.10.0 the `flink-connector-kinesis{{< scala_version >}}` has a dependency on code licensed under the [Amazon Software License](https://aws.amazon.com/asl/).
+**Attention** Prior to Flink version 1.10.0 the `flink-connector-kinesis` has a dependency on code licensed under the [Amazon Software License](https://aws.amazon.com/asl/).
Linking to the prior versions of flink-connector-kinesis will include this code into your application.
{{< /hint >}}

-Due to the licensing issue, the `flink-connector-kinesis{{< scala_version >}}` artifact is not deployed to Maven central for the prior versions. Please see the version specific documentation for further information.
+Due to the licensing issue, the `flink-connector-kinesis` artifact is not deployed to Maven central for the prior versions. Please see the version specific documentation for further information.

## Using the Amazon Kinesis Streams Service
Follow the instructions from the [Amazon Kinesis Streams Developer Guide](https://docs.aws.amazon.com/streams/latest/dev/learning-kinesis-module-one-create-stream.html)
docs/content.zh/docs/connectors/datastream/nifi.md (2 changes: 1 addition & 1 deletion)

@@ -29,7 +29,7 @@ under the License.

The [Apache NiFi](https://nifi.apache.org/) connector provides a Source and a Sink for reading and writing data.
To use this connector, add the following dependency to your project:

-{{< artifact flink-connector-nifi withScalaVersion >}}
+{{< artifact flink-connector-nifi >}}

Note that these connectors are currently not included in the binary distribution. See [here]({{< ref "docs/dev/datastream/project-configuration" >}}) for information on adding the dependency, packaging, and running on a cluster.
docs/content.zh/docs/connectors/datastream/pubsub.md (2 changes: 1 addition & 1 deletion)

@@ -28,7 +28,7 @@ under the License.

This connector can read from and write to [Google Cloud PubSub](https://cloud.google.com/pubsub). To use this connector, add the following dependency:

-{{< artifact flink-connector-pubsub withScalaVersion >}}
+{{< artifact flink-connector-pubsub >}}

<p style="border-radius: 5px; padding: 5px" class="bg-danger">
<b>Note</b>: This connector has been added to Flink recently. It has not received widespread testing yet.
docs/content.zh/docs/connectors/datastream/pulsar.md (2 changes: 1 addition & 1 deletion)

@@ -33,7 +33,7 @@ Flink currently only provides an [Apache Pulsar](https://pulsar.apache.org) source, for

For more details on the Pulsar API compatibility design, see [PIP-72](https://github.com/apache/pulsar/wiki/PIP-72%3A-Introduce-Pulsar-Interface-Taxonomy%3A-Audience-and-Stability-Classification).

-{{< artifact flink-connector-pulsar withScalaVersion >}}
+{{< artifact flink-connector-pulsar >}}

Flink's streaming connectors are not published as part of the binary distribution. See [this document]({{< ref "docs/dev/datastream/project-configuration" >}}) for how to add connectors to a cluster.
docs/content.zh/docs/connectors/datastream/rabbitmq.md (2 changes: 1 addition & 1 deletion)

@@ -38,7 +38,7 @@ Flink itself neither reuses the "RabbitMQ AMQP Java Client" code nor

This connector provides access to data streams from [RabbitMQ](http://www.rabbitmq.com/). To use this connector, add the following dependency to your project:

-{{< artifact flink-connector-rabbitmq withScalaVersion >}}
+{{< artifact flink-connector-rabbitmq >}}

Note that the connector is currently not included in the binary distribution. See [here]({{< ref "docs/dev/datastream/project-configuration" >}}) for cluster execution.
docs/content.zh/docs/connectors/datastream/twitter.md (2 changes: 1 addition & 1 deletion)

@@ -30,7 +30,7 @@ under the License.

Flink Streaming comes with a built-in `TwitterSource` class for establishing a connection to the stream of tweets.
To use the Twitter connector, add the following dependency to your project:

-{{< artifact flink-connector-twitter withScalaVersion >}}
+{{< artifact flink-connector-twitter >}}

Note: the binary distribution does not currently include these connectors. See [here]({{< ref "docs/dev/datastream/project-configuration" >}}) for cluster execution.
docs/content.zh/docs/dev/dataset/cluster_execution.md (2 changes: 1 addition & 1 deletion)

@@ -53,7 +53,7 @@ Flink programs can run distributed on clusters of many machines. There are two ways to

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-clients{{< scala_version >}}</artifactId>
+  <artifactId>flink-clients</artifactId>
  <version>{{< version >}}</version>
</dependency>
```
docs/content.zh/docs/dev/dataset/local_execution.md (2 changes: 1 addition & 1 deletion)

@@ -46,7 +46,7 @@ If you are developing your program in a Maven project, you have to add the `flink-clients`

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-clients{{< scala_version >}}</artifactId>
+  <artifactId>flink-clients</artifactId>
  <version>{{< version >}}</version>
</dependency>
```
@@ -55,7 +55,7 @@ under the License.

To use queryable state on a Flink cluster, you need to do the following:

-1. copy `flink-queryable-state-runtime{{< scala_version >}}-{{< version >}}.jar`
+1. copy `flink-queryable-state-runtime-{{< version >}}.jar`
   from the `opt/` folder of the [Flink distribution]({{< downloads >}} "Apache Flink: Downloads") to the `lib/` folder;
2. set the property `queryable-state.enable` to `true`. See the [Configuration]({{< ref "docs/deployment/config" >}}#queryable-state) documentation for details and additional configuration options.
docs/content.zh/docs/dev/datastream/project-configuration.md (8 changes: 4 additions & 4 deletions)

@@ -81,7 +81,7 @@ When setting up a project manually, you need to add the following dependencies for

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-streaming-java{{< scala_version >}}</artifactId>
+  <artifactId>flink-streaming-java</artifactId>
  <version>{{< version >}}</version>
  <scope>provided</scope>
</dependency>
```

@@ -124,7 +124,7 @@ Below is an example adding the connector for Kafka as a dependency (Maven syntax)

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-connector-kafka{{< scala_version >}}</artifactId>
+  <artifactId>flink-connector-kafka</artifactId>
  <version>{{< version >}}</version>
</dependency>
```

@@ -373,13 +373,13 @@ dependencies {

    // Compile-time dependencies that should NOT be part of the
    // shadow jar and are provided in the lib folder of Flink
    // --------------------------------------------------------------
-    compile "org.apache.flink:flink-streaming-java_${scalaBinaryVersion}:${flinkVersion}"
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"

    // --------------------------------------------------------------
    // Dependencies that should be part of the shadow jar, e.g.
    // connectors. These must be in the flinkShadowJar configuration!
    // --------------------------------------------------------------
-    //flinkShadowJar "org.apache.flink:flink-connector-kafka_${scalaBinaryVersion}:${flinkVersion}"
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"

    compile "org.apache.logging.log4j:log4j-api:${log4jVersion}"
    compile "org.apache.logging.log4j:log4j-core:${log4jVersion}"
docs/content.zh/docs/dev/datastream/testing.md (6 changes: 3 additions & 3 deletions)

@@ -153,9 +153,9 @@ class IncrementFlatMapFunctionTest extends FlatSpec with MockFactory {

To use the test harnesses, an additional set of dependencies (test scoped) is needed.

-{{< artifact flink-test-utils withScalaVersion withTestScope >}}
+{{< artifact flink-test-utils withTestScope >}}
{{< artifact flink-runtime withTestScope >}}
-{{< artifact flink-streaming-java withScalaVersion withTestScope withTestClassifier >}}
+{{< artifact flink-streaming-java withTestScope withTestClassifier >}}

You can now use the test harnesses to push records and watermarks into your user-defined functions or custom operators, control processing time, and finally verify the operator's output (including side outputs).

@@ -401,7 +401,7 @@ Apache Flink provides a JUnit rule named `MiniClusterWithClientResource`

To use `MiniClusterWithClientResource`, an additional dependency (test scoped) is needed.

-{{< artifact flink-test-utils withScalaVersion withTestScope >}}
+{{< artifact flink-test-utils withTestScope >}}

Let us take the same simple `MapFunction` from the previous sections as an example.
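For reference, a sketch that is not part of this diff: with the Scala suffix gone, the updated `{{< artifact flink-test-utils withTestScope >}}` shortcode would presumably render to a test-scoped dependency along these lines, assuming `withTestScope` maps to Maven's `test` scope as its name suggests:

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
  <!-- no Scala suffix after this commit -->
  <artifactId>flink-test-utils</artifactId>
  <version><!-- current flink version --></version>
  <!-- withTestScope: available only to test code -->
  <scope>test</scope>
</dependency>
```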
@@ -79,7 +79,7 @@ Flink's implementation of the `MATCH_RECOGNIZE` clause is a subset of the full standard. Only the

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-cep{{< scala_version >}}</artifactId>
+  <artifactId>flink-cep</artifactId>
  <version>{{< version >}}</version>
</dependency>
```
docs/content.zh/docs/libs/cep.md (2 changes: 1 addition & 1 deletion)

@@ -43,7 +43,7 @@ FlinkCEP is the complex event processing library implemented on top of Flink.

{{< tabs "722d55a5-7f12-4bcc-b080-b28d5e8860ac" >}}
{{< tab "Java" >}}
-{{< artifact flink-cep withScalaVersion >}}
+{{< artifact flink-cep >}}
{{< /tab >}}
{{< tab "Scala" >}}
{{< artifact flink-cep-scala withScalaVersion >}}
docs/content.zh/docs/libs/gelly/overview.md (2 changes: 1 addition & 1 deletion)

@@ -45,7 +45,7 @@ Add the following dependency to your `pom.xml` to use Gelly.

{{< tabs "96de5128-3c66-4942-9498-e9a8ae439314" >}}
{{< tab "Java" >}}
-{{< artifact flink-gelly withScalaVersion >}}
+{{< artifact flink-gelly >}}
{{< /tab >}}
{{< tab "Scala" >}}
{{< artifact flink-gelly-scala withScalaVersion >}}
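The hunk above shows the pattern running through the whole commit: the Java artifact drops the `withScalaVersion` flag while the Scala artifact keeps it. Assuming the shortcode rendering documented in docs/README.md, the two tabs would presumably render along these lines (Scala 2.12 used as an example binary version):

```xml
<!-- Java tab: no Scala suffix after this change -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-gelly</artifactId>
  <version><!-- current flink version --></version>
</dependency>

<!-- Scala tab: still suffixed with the Scala binary version -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-gelly-scala_2.12</artifactId>
  <version><!-- current flink version --></version>
</dependency>
```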
docs/content.zh/docs/libs/state_processor_api.md (2 changes: 1 addition & 1 deletion)

@@ -37,7 +37,7 @@ For example, you can now arbitrarily modify the data types of states, adjust the

To get started with the state processor api, include the following library in your application.

-{{< artifact flink-state-processor-api withScalaVersion >}}
+{{< artifact flink-state-processor-api >}}

## Mapping Application State to DataSets
docs/content.zh/docs/ops/state/state_backends.md (2 changes: 1 addition & 1 deletion)

@@ -128,7 +128,7 @@ env.setStateBackend(new HashMapStateBackend())

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-statebackend-rocksdb{{< scala_version >}}</artifactId>
+  <artifactId>flink-statebackend-rocksdb</artifactId>
  <version>{{< version >}}</version>
  <scope>provided</scope>
</dependency>
```
docs/content/docs/connectors/datastream/elasticsearch.md (6 changes: 3 additions & 3 deletions)

@@ -44,15 +44,15 @@ of the Elasticsearch installation:

  <tbody>
    <tr>
      <td>5.x</td>
-      <td>{{< artifact flink-connector-elasticsearch5 withScalaVersion >}}</td>
+      <td>{{< artifact flink-connector-elasticsearch5 >}}</td>
    </tr>
    <tr>
      <td>6.x</td>
-      <td>{{< artifact flink-connector-elasticsearch6 withScalaVersion >}}</td>
+      <td>{{< artifact flink-connector-elasticsearch6 >}}</td>
    </tr>
    <tr>
      <td>7 and later versions</td>
-      <td>{{< artifact flink-connector-elasticsearch7 withScalaVersion >}}</td>
+      <td>{{< artifact flink-connector-elasticsearch7 >}}</td>
    </tr>
  </tbody>
</table>
docs/content/docs/connectors/datastream/jdbc.md (2 changes: 1 addition & 1 deletion)

@@ -30,7 +30,7 @@ This connector provides a sink that writes data to a JDBC database.

To use it, add the following dependency to your project (along with your JDBC driver):

-{{< artifact flink-connector-jdbc withScalaVersion >}}
+{{< artifact flink-connector-jdbc >}}

Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).
A driver dependency is also required to connect to a specified database. Please consult your database documentation on how to add the corresponding driver.
docs/content/docs/connectors/datastream/kafka.md (2 changes: 1 addition & 1 deletion)

@@ -36,7 +36,7 @@ The version of the client it uses may change between Flink releases.

Modern Kafka clients are backwards compatible with broker versions 0.10.0 or later.
For details on Kafka compatibility, please refer to the official [Kafka documentation](https://kafka.apache.org/protocol.html#protocol_compatibility).

-{{< artifact flink-connector-kafka withScalaVersion >}}
+{{< artifact flink-connector-kafka >}}

Flink's streaming connectors are not currently part of the binary distribution.
See how to link with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).
docs/content/docs/connectors/datastream/kinesis.md (6 changes: 3 additions & 3 deletions)

@@ -31,14 +31,14 @@ The Kinesis connector provides access to [Amazon AWS Kinesis Streams](http://aws

To use the connector, add the following Maven dependency to your project:

-{{< artifact flink-connector-kinesis withScalaVersion >}}
+{{< artifact flink-connector-kinesis >}}

{{< hint warning >}}
-**Attention** Prior to Flink version 1.10.0 the `flink-connector-kinesis{{< scala_version >}}` has a dependency on code licensed under the [Amazon Software License](https://aws.amazon.com/asl/).
+**Attention** Prior to Flink version 1.10.0 the `flink-connector-kinesis` has a dependency on code licensed under the [Amazon Software License](https://aws.amazon.com/asl/).
Linking to the prior versions of flink-connector-kinesis will include this code into your application.
{{< /hint >}}

-Due to the licensing issue, the `flink-connector-kinesis{{< scala_version >}}` artifact is not deployed to Maven central for the prior versions. Please see the version specific documentation for further information.
+Due to the licensing issue, the `flink-connector-kinesis` artifact is not deployed to Maven central for the prior versions. Please see the version specific documentation for further information.

## Using the Amazon Kinesis Streams Service
Follow the instructions from the [Amazon Kinesis Streams Developer Guide](https://docs.aws.amazon.com/streams/latest/dev/learning-kinesis-module-one-create-stream.html)
docs/content/docs/connectors/datastream/nifi.md (2 changes: 1 addition & 1 deletion)

@@ -30,7 +30,7 @@ This connector provides a Source and Sink that can read from and write to

[Apache NiFi](https://nifi.apache.org/). To use this connector, add the
following dependency to your project:

-{{< artifact flink-connector-nifi withScalaVersion >}}
+{{< artifact flink-connector-nifi >}}

Note that the streaming connectors are currently not part of the binary
distribution. See
docs/content/docs/connectors/datastream/pubsub.md (2 changes: 1 addition & 1 deletion)

@@ -30,7 +30,7 @@ This connector provides a Source and Sink that can read from and write to

[Google Cloud PubSub](https://cloud.google.com/pubsub). To use this connector, add the
following dependency to your project:

-{{< artifact flink-connector-gcp-pubsub withScalaVersion >}}
+{{< artifact flink-connector-gcp-pubsub >}}

{{< hint warning >}}
<b>Note</b>: This connector has been added to Flink recently. It has not received widespread testing yet.
docs/content/docs/connectors/datastream/pulsar.md (2 changes: 1 addition & 1 deletion)

@@ -33,7 +33,7 @@ Pulsar [transactions](https://pulsar.apache.org/docs/en/txn-what/),

it is recommended to use Pulsar 2.8.0 or higher releases.
For details on Pulsar compatibility, please refer to [PIP-72](https://github.com/apache/pulsar/wiki/PIP-72%3A-Introduce-Pulsar-Interface-Taxonomy%3A-Audience-and-Stability-Classification).

-{{< artifact flink-connector-pulsar withScalaVersion >}}
+{{< artifact flink-connector-pulsar >}}

Flink's streaming connectors are not currently part of the binary distribution.
See how to link with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).
docs/content/docs/connectors/datastream/rabbitmq.md (2 changes: 1 addition & 1 deletion)

@@ -42,7 +42,7 @@ must be aware that this may be subject to conditions declared in the Mozilla Public License

This connector provides access to data streams from [RabbitMQ](http://www.rabbitmq.com/). To use this connector, add the following dependency to your project:

-{{< artifact flink-connector-rabbitmq withScalaVersion >}}
+{{< artifact flink-connector-rabbitmq >}}

Note that the streaming connectors are currently not part of the binary distribution. See linking with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).
docs/content/docs/connectors/datastream/twitter.md (2 changes: 1 addition & 1 deletion)

@@ -30,7 +30,7 @@ The [Twitter Streaming API](https://dev.twitter.com/docs/streaming-apis) provides

Flink Streaming comes with a built-in `TwitterSource` class for establishing a connection to this stream.
To use this connector, add the following dependency to your project:

-{{< artifact flink-connector-twitter withScalaVersion >}}
+{{< artifact flink-connector-twitter >}}

Note that the streaming connectors are currently not part of the binary distribution.
See linking with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).
docs/content/docs/dev/dataset/cluster_execution.md (2 changes: 1 addition & 1 deletion)

@@ -51,7 +51,7 @@ If you are developing your program as a Maven project, you have to add the

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-clients{{< scala_version >}}</artifactId>
+  <artifactId>flink-clients</artifactId>
  <version>{{< version >}}</version>
</dependency>
```
docs/content/docs/dev/dataset/local_execution.md (2 changes: 1 addition & 1 deletion)

@@ -46,7 +46,7 @@ If you are developing your program in a Maven project, you have to add the `flink-clients`

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-clients{{< scala_version >}}</artifactId>
+  <artifactId>flink-clients</artifactId>
  <version>{{< version >}}</version>
</dependency>
```
@@ -71,7 +71,7 @@ response back to the client.

To enable queryable state on your Flink cluster, you need to do the following:

-1. copy the `flink-queryable-state-runtime{{< scala_version >}}-{{< version >}}.jar`
+1. copy the `flink-queryable-state-runtime-{{< version >}}.jar`
   from the `opt/` folder of your [Flink distribution]({{< downloads >}} "Apache Flink: Downloads"),
   to the `lib/` folder.
2. set the property `queryable-state.enable` to `true`. See the [Configuration]({{< ref "docs/deployment/config" >}}#queryable-state) documentation for details and additional parameters.