[hotfix][docs] Fix parts of broken links
wuchong committed Jun 11, 2020
1 parent 5b20de1 commit 8ee8f29
Showing 40 changed files with 51 additions and 829 deletions.
778 changes: 0 additions & 778 deletions docs/dev/api_concepts.zh.md

This file was deleted.

2 changes: 1 addition & 1 deletion docs/dev/batch/connectors.md
@@ -186,7 +186,7 @@ This [GitHub repository documents how to use MongoDB with Apache Flink (starting
## Hive Connector

Starting from 1.9.0, Apache Flink provides a Hive connector to access Apache Hive tables. [HiveCatalog]({{ site.baseurl }}/dev/table/catalogs.html#hivecatalog) is required in order to use the Hive connector.
-After HiveCatalog is set up, please refer to [Reading & Writing Hive Tables]({{ site.baseurl }}/dev/table/hive/read_write_hive.html) for the usage of the Hive connector and its limitations.
+After HiveCatalog is set up, please refer to [Reading & Writing Hive Tables]({{ site.baseurl }}/dev/table/hive/hive_read_write.html) for the usage of the Hive connector and its limitations.
As with HiveCatalog, the officially supported Apache Hive versions for the Hive connector are 2.3.4 and 1.2.1.
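
For orientation, here is a minimal sketch of registering a `HiveCatalog` and querying a Hive table. It assumes the Flink 1.11 Table API; the catalog name, Hive configuration directory, Hive version, and table name are placeholders:

{% highlight java %}
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class HiveConnectorSketch {
    public static void main(String[] args) {
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();
        TableEnvironment tableEnv = TableEnvironment.create(settings);

        // Catalog name, default database, Hive conf dir, and version are placeholders.
        HiveCatalog hive = new HiveCatalog(
                "myhive", "default", "/opt/hive-conf", "2.3.4");
        tableEnv.registerCatalog("myhive", hive);
        tableEnv.useCatalog("myhive");

        // Hive tables are now visible to Flink SQL.
        tableEnv.executeSql("SELECT * FROM some_hive_table").print();
    }
}
{% endhighlight %}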

{% top %}
2 changes: 1 addition & 1 deletion docs/dev/batch/connectors.zh.md
@@ -186,7 +186,7 @@ This [GitHub repository documents how to use MongoDB with Apache Flink (starting
## Hive Connector

Starting from 1.9.0, Apache Flink provides a Hive connector to access Apache Hive tables. [HiveCatalog]({{ site.baseurl }}/zh/dev/table/catalogs.html#hivecatalog) is required in order to use the Hive connector.
-After HiveCatalog is set up, please refer to [Reading & Writing Hive Tables]({{ site.baseurl }}/zh/dev/table/hive/read_write_hive.html) for the usage of the Hive connector and its limitations.
+After HiveCatalog is set up, please refer to [Reading & Writing Hive Tables]({{ site.baseurl }}/zh/dev/table/hive/hive_read_write.html) for the usage of the Hive connector and its limitations.
As with HiveCatalog, the officially supported Apache Hive versions for the Hive connector are 2.3.4 and 1.2.1.

{% top %}
2 changes: 1 addition & 1 deletion docs/dev/batch/index.md
@@ -50,7 +50,7 @@ Example Program

The following program is a complete, working example of WordCount. You can copy & paste the code
to run it locally. You only have to include the correct Flink library in your project
-(see Section [Linking with Flink]({{ site.baseurl }}/dev/projectsetup/dependencies.html)) and specify the imports. Then you are ready
+(see Section [Linking with Flink]({{ site.baseurl }}/dev/project-configuration.html)) and specify the imports. Then you are ready
to go!
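
The complete example follows in the code tabs below. As a rough, hedged sketch of the same idea with the DataSet API (the sample input is invented):

{% highlight java %}
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

public class WordCountSketch {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Invented sample input; any DataSet<String> source works here.
        DataSet<String> text = env.fromElements(
                "Who's there?",
                "I think I hear them. Stand, ho! Who's there?");

        DataSet<Tuple2<String, Integer>> counts = text
                .flatMap((String line, Collector<Tuple2<String, Integer>> out) -> {
                    for (String word : line.toLowerCase().split("\\W+")) {
                        if (!word.isEmpty()) {
                            out.collect(Tuple2.of(word, 1));
                        }
                    }
                })
                // Lambdas erase generic types, so declare the result type explicitly.
                .returns(Types.TUPLE(Types.STRING, Types.INT))
                .groupBy(0)
                .sum(1);

        counts.print();
    }
}
{% endhighlight %}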

<div class="codetabs" markdown="1">
2 changes: 1 addition & 1 deletion docs/dev/batch/index.zh.md
@@ -50,7 +50,7 @@ Example Program

The following program is a complete, working example of WordCount. You can copy & paste the code
to run it locally. You only have to include the correct Flink library in your project
-(see Section [Linking with Flink]({{ site.baseurl }}/dev/projectsetup/dependencies.html)) and specify the imports. Then you are ready
+(see Section [Linking with Flink]({{ site.baseurl }}/dev/project-configuration.html)) and specify the imports. Then you are ready
to go!

<div class="codetabs" markdown="1">
2 changes: 1 addition & 1 deletion docs/dev/connectors/cassandra.md
@@ -43,7 +43,7 @@ To use this connector, add the following dependency to your project:
</dependency>
{% endhighlight %}

-Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/projectsetup/dependencies.html).
+Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/project-configuration.html).
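
Once the dependency is linked, wiring up a sink takes only a few lines. A hedged sketch, in which the keyspace `example` and table `wordcount` are hypothetical and must already exist:

{% highlight java %}
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.cassandra.CassandraSink;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// Invented data; in practice this stream comes from your pipeline.
DataStream<Tuple2<String, Long>> resultStream = env.fromElements(
        Tuple2.of("flink", 1L), Tuple2.of("cassandra", 2L));

// Keyspace `example` and table `wordcount` must already exist in Cassandra.
CassandraSink.addSink(resultStream)
        .setQuery("INSERT INTO example.wordcount (word, count) VALUES (?, ?);")
        .setHost("127.0.0.1")
        .build();

env.execute("Cassandra sink sketch");
{% endhighlight %}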

## Installing Apache Cassandra
There are multiple ways to bring up a Cassandra instance on local machine:
2 changes: 1 addition & 1 deletion docs/dev/connectors/cassandra.zh.md
@@ -43,7 +43,7 @@ To use this connector, add the following dependency to your project:
</dependency>
{% endhighlight %}

-Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/projectsetup/dependencies.html).
+Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/project-configuration.html).

## Installing Apache Cassandra
There are multiple ways to bring up a Cassandra instance on local machine:
4 changes: 2 additions & 2 deletions docs/dev/connectors/elasticsearch.md
@@ -59,7 +59,7 @@ of the Elasticsearch installation:
</table>

Note that the streaming connectors are currently not part of the binary
-distribution. See [here]({{site.baseurl}}/dev/projectsetup/dependencies.html) for information
+distribution. See [here]({{site.baseurl}}/dev/project-configuration.html) for information
about how to package the program with the libraries for cluster execution.

## Installing Elasticsearch
@@ -464,7 +464,7 @@ More information about Elasticsearch can be found [here](https://elastic.co).

For the execution of your Flink program, it is recommended to build a
so-called uber-jar (executable jar) containing all your dependencies
-(see [here]({{site.baseurl}}/dev/projectsetup/dependencies.html) for further information).
+(see [here]({{site.baseurl}}/dev/project-configuration.html) for further information).

Alternatively, you can put the connector's jar file into Flink's `lib/` folder to make it available
system-wide, i.e. for all jobs being run.
4 changes: 2 additions & 2 deletions docs/dev/connectors/elasticsearch.zh.md
@@ -59,7 +59,7 @@ of the Elasticsearch installation:
</table>

Note that the streaming connectors are currently not part of the binary
-distribution. See [here]({{site.baseurl}}/dev/projectsetup/dependencies.html) for information
+distribution. See [here]({{site.baseurl}}/dev/project-configuration.html) for information
about how to package the program with the libraries for cluster execution.

## Installing Elasticsearch
@@ -464,7 +464,7 @@ More information about Elasticsearch can be found [here](https://elastic.co).

For the execution of your Flink program, it is recommended to build a
so-called uber-jar (executable jar) containing all your dependencies
-(see [here]({{site.baseurl}}/dev/projectsetup/dependencies.html) for further information).
+(see [here]({{site.baseurl}}/dev/project-configuration.html) for further information).

Alternatively, you can put the connector's jar file into Flink's `lib/` folder to make it available
system-wide, i.e. for all jobs being run.
2 changes: 1 addition & 1 deletion docs/dev/connectors/filesystem_sink.md
@@ -42,7 +42,7 @@ following dependency to your project:

Note that the streaming connectors are currently not part of the binary
distribution. See
-[here]({{site.baseurl}}/dev/projectsetup/dependencies.html)
+[here]({{site.baseurl}}/dev/project-configuration.html)
for information about how to package the program with the libraries for
cluster execution.

2 changes: 1 addition & 1 deletion docs/dev/connectors/filesystem_sink.zh.md
@@ -39,7 +39,7 @@ under the License.
</dependency>
{% endhighlight %}

-Note that the connectors are currently not part of the binary distribution; see [here]({{site.baseurl}}/zh/getting-started/project-setup/dependencies.html) for information on adding dependencies, packaging, and cluster execution.
+Note that the connectors are currently not part of the binary distribution; see [here]({{site.baseurl}}/zh/dev/project-configuration.html) for information on adding dependencies, packaging, and cluster execution.

#### Bucketing File Sink

2 changes: 1 addition & 1 deletion docs/dev/connectors/jdbc.md
@@ -38,7 +38,7 @@ To use it, add the following dependency to your project (along with your JDBC-dr
</dependency>
{% endhighlight %}

-Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/projectsetup/dependencies.html).
+Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/project-configuration.html).

The created JDBC sink provides an at-least-once guarantee.
Effectively exactly-once can be achieved using upsert statements or idempotent updates.
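
A hedged sketch of such an idempotent upsert sink built on the `JdbcSink` API; the `orders` stream and `Order` POJO are hypothetical, and the upsert syntax shown is MySQL's:

{% highlight java %}
import org.apache.flink.connector.jdbc.JdbcConnectionOptions;
import org.apache.flink.connector.jdbc.JdbcSink;

// `orders` is a DataStream<Order> built elsewhere; Order is a hypothetical POJO
// with public `id` and `amount` fields.
orders.addSink(JdbcSink.sink(
        "INSERT INTO orders (id, amount) VALUES (?, ?) "
                + "ON DUPLICATE KEY UPDATE amount = VALUES(amount)",
        (statement, order) -> {
            statement.setLong(1, order.id);
            statement.setBigDecimal(2, order.amount);
        },
        new JdbcConnectionOptions.JdbcConnectionOptionsBuilder()
                .withUrl("jdbc:mysql://localhost:3306/shop") // placeholder URL
                .withDriverName("com.mysql.cj.jdbc.Driver")
                .build()));
{% endhighlight %}

Because delivery is at-least-once, a replayed batch simply rewrites the same rows, which is what makes the upsert effectively exactly-once.
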
2 changes: 1 addition & 1 deletion docs/dev/connectors/jdbc.zh.md
@@ -38,7 +38,7 @@ To use it, add the following dependency to your project (along with your JDBC-dr
</dependency>
{% endhighlight %}

-Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/projectsetup/dependencies.html).
+Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/project-configuration.html).

The created JDBC sink provides an at-least-once guarantee.
Effectively exactly-once can be achieved using upsert statements or idempotent updates.
2 changes: 1 addition & 1 deletion docs/dev/connectors/kafka.md
@@ -70,7 +70,7 @@ For details on Kafka compatibility, please refer to the official [Kafka document
</div>

Flink's streaming connectors are not currently part of the binary distribution.
-See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/projectsetup/dependencies.html).
+See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/project-configuration.html).

## Kafka Consumer
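
As a hedged preview of the consumer covered in this section, assuming the universal Kafka connector; the broker address, group id, and topic name are placeholders:

{% highlight java %}
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

Properties props = new Properties();
props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder broker
props.setProperty("group.id", "test-group");              // placeholder group id

// Each record is deserialized to a plain String.
DataStream<String> stream = env.addSource(
        new FlinkKafkaConsumer<>("my-topic", new SimpleStringSchema(), props));
{% endhighlight %}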

2 changes: 1 addition & 1 deletion docs/dev/connectors/kafka.zh.md
@@ -84,7 +84,7 @@ Flink provides a dedicated Kafka connector for reading data from and writing data to Kafka topics
{% endhighlight %}

Note: the streaming connectors are currently not part of the binary distribution.
-See [here]({{ site.baseurl }}/zh/getting-started/project-setup/dependencies.html) for how to link them for execution on a cluster.
+See [here]({{ site.baseurl }}/zh/dev/project-configuration.html) for how to link them for execution on a cluster.

## Installing Apache Kafka

2 changes: 1 addition & 1 deletion docs/dev/connectors/nifi.md
@@ -37,7 +37,7 @@ following dependency to your project:

Note that the streaming connectors are currently not part of the binary
distribution. See
-[here]({{site.baseurl}}/dev/projectsetup/dependencies.html)
+[here]({{site.baseurl}}/dev/project-configuration.html)
for information about how to package the program with the libraries for
cluster execution.

2 changes: 1 addition & 1 deletion docs/dev/connectors/nifi.zh.md
@@ -34,7 +34,7 @@ under the License.
</dependency>
{% endhighlight %}

-Note that these connectors are currently not included in the binary distribution; see [here]({{site.baseurl}}/zh/getting-started/project-setup/dependencies.html) for information on adding dependencies, packaging, and running on a cluster.
+Note that these connectors are currently not included in the binary distribution; see [here]({{site.baseurl}}/zh/dev/project-configuration.html) for information on adding dependencies, packaging, and running on a cluster.

#### Installing Apache NiFi

2 changes: 1 addition & 1 deletion docs/dev/connectors/pubsub.md
@@ -41,7 +41,7 @@ following dependency to your project:

Note that the streaming connectors are currently not part of the binary
distribution. See
-[here]({{ site.baseurl }}/dev/projectsetup/dependencies.html)
+[here]({{ site.baseurl }}/dev/project-configuration.html)
for information about how to package the program with the libraries for
cluster execution.

2 changes: 1 addition & 1 deletion docs/dev/connectors/pubsub.zh.md
@@ -37,7 +37,7 @@ under the License.
<b>Note</b>: This connector was added to Flink only recently and has not yet received extensive testing.
</p>

-Note that the connector is currently not part of the binary distribution; see [here]({{ site.baseurl }}/zh/getting-started/project-setup/dependencies.html) for information on adding dependencies, packaging, and cluster execution.
+Note that the connector is currently not part of the binary distribution; see [here]({{ site.baseurl }}/zh/dev/project-configuration.html) for information on adding dependencies, packaging, and cluster execution.

## Consuming or Producing PubSubMessages

2 changes: 1 addition & 1 deletion docs/dev/connectors/rabbitmq.md
@@ -47,7 +47,7 @@ This connector provides access to data streams from [RabbitMQ](http://www.rabbit
</dependency>
{% endhighlight %}

-Note that the streaming connectors are currently not part of the binary distribution. See how to link with them for cluster execution [here]({{site.baseurl}}/dev/projectsetup/dependencies.html).
+Note that the streaming connectors are currently not part of the binary distribution. See how to link with them for cluster execution [here]({{site.baseurl}}/dev/project-configuration.html).

### Installing RabbitMQ
Follow the instructions from the [RabbitMQ download page](http://www.rabbitmq.com/download.html). After the installation the server automatically starts, and the application connecting to RabbitMQ can be launched.
2 changes: 1 addition & 1 deletion docs/dev/connectors/rabbitmq.zh.md
@@ -43,7 +43,7 @@ Flink itself neither reuses the code of the "RabbitMQ AMQP Java Client" nor does it
</dependency>
{% endhighlight %}

-Note that the connector is currently not included in the binary distribution. See [here]({{site.baseurl}}/zh/getting-started/project-setup/dependencies.html) for information on cluster execution.
+Note that the connector is currently not included in the binary distribution. See [here]({{site.baseurl}}/zh/dev/project-configuration.html) for information on cluster execution.

### Installing RabbitMQ
To install RabbitMQ, follow the instructions on the [RabbitMQ download page](http://www.rabbitmq.com/download.html). After installation the server starts automatically, and applications can connect to RabbitMQ.
2 changes: 1 addition & 1 deletion docs/dev/connectors/twitter.md
@@ -36,7 +36,7 @@ To use this connector, add the following dependency to your project:
{% endhighlight %}

Note that the streaming connectors are currently not part of the binary distribution.
-See how to link with them for cluster execution [here]({{site.baseurl}}/dev/projectsetup/dependencies.html).
+See how to link with them for cluster execution [here]({{site.baseurl}}/dev/project-configuration.html).

#### Authentication
To connect to the Twitter stream, the user has to register their application and acquire the information necessary for authentication. The process is described below.
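
Once those credentials are acquired, wiring up the source is short. A hedged sketch with placeholder credentials:

{% highlight java %}
import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.twitter.TwitterSource;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

Properties props = new Properties();
// Placeholders: fill in the values obtained during registration.
props.setProperty(TwitterSource.CONSUMER_KEY, "<consumer key>");
props.setProperty(TwitterSource.CONSUMER_SECRET, "<consumer secret>");
props.setProperty(TwitterSource.TOKEN, "<token>");
props.setProperty(TwitterSource.TOKEN_SECRET, "<token secret>");

// Each element is the raw JSON of one tweet.
DataStream<String> tweets = env.addSource(new TwitterSource(props));
{% endhighlight %}
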
2 changes: 1 addition & 1 deletion docs/dev/connectors/twitter.zh.md
@@ -35,7 +35,7 @@ Flink Streaming uses a built-in `TwitterSource` class to create a connection to the stream of tweets.
</dependency>
{% endhighlight %}

-Note: the current binary distribution does not include these connectors. See [here]({{site.baseurl}}/zh/getting-started/project-setup/dependencies.html) for cluster execution.
+Note: the current binary distribution does not include these connectors. See [here]({{site.baseurl}}/zh/dev/project-configuration.html) for cluster execution.

#### Authentication
To use the Twitter stream, users must first register their application and obtain the information required for authentication. The process is as follows:
4 changes: 2 additions & 2 deletions docs/dev/libs/cep.md
@@ -38,7 +38,7 @@ library makes when [dealing with lateness](#handling-lateness-in-event-time) in

## Getting Started

-If you want to jump right in, [set up a Flink program]({{ site.baseurl }}/dev/projectsetup/dependencies.html) and
+If you want to jump right in, [set up a Flink program]({{ site.baseurl }}/dev/project-configuration.html) and
add the FlinkCEP dependency to the `pom.xml` of your project.

<div class="codetabs" markdown="1">
@@ -63,7 +63,7 @@
</div>
</div>

-{% info %} FlinkCEP is not part of the binary distribution. See how to link with it for cluster execution [here]({{site.baseurl}}/dev/projectsetup/dependencies.html).
+{% info %} FlinkCEP is not part of the binary distribution. See how to link with it for cluster execution [here]({{site.baseurl}}/dev/project-configuration.html).

Now you can start writing your first CEP program using the Pattern API.
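
As a first taste, here is a hedged sketch of a two-step pattern; the `Event` type (with `getId()` and `getTemperature()` accessors) and the `input` stream are hypothetical:

{% highlight java %}
import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.datastream.DataStream;

// `input` is a DataStream<Event> built elsewhere; Event is a hypothetical POJO.
Pattern<Event, ?> pattern = Pattern.<Event>begin("start")
        .where(new SimpleCondition<Event>() {
            @Override
            public boolean filter(Event event) {
                return event.getId() == 42;
            }
        })
        .next("end")
        .where(new SimpleCondition<Event>() {
            @Override
            public boolean filter(Event event) {
                return event.getTemperature() >= 10.0;
            }
        });

PatternStream<Event> patternStream = CEP.pattern(input, pattern);
{% endhighlight %}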

4 changes: 2 additions & 2 deletions docs/dev/libs/cep.zh.md
@@ -35,7 +35,7 @@ FlinkCEP is a complex event processing library implemented on top of Flink.

## Getting Started

-If you want to jump right in, [set up a Flink program]({{ site.baseurl }}/zh/getting-started/project-setup/dependencies.html)
+If you want to jump right in, [set up a Flink program]({{ site.baseurl }}/zh/dev/project-configuration.html)
and add the FlinkCEP dependency to your project's `pom.xml`.

<div class="codetabs" markdown="1">
@@ -60,7 +60,7 @@
</div>
</div>

-{% info Note %} FlinkCEP is not part of the binary distribution. See [here]({{site.baseurl}}/zh/getting-started/project-setup/dependencies.html) for how to link it for cluster execution.
+{% info Note %} FlinkCEP is not part of the binary distribution. See [here]({{site.baseurl}}/zh/dev/project-configuration.html) for how to link it for cluster execution.

Now you can start writing your first CEP program using the Pattern API.

2 changes: 1 addition & 1 deletion docs/dev/libs/gelly/index.md
@@ -63,7 +63,7 @@ Add the following dependency to your `pom.xml` to use Gelly.
</div>
</div>

-Note that Gelly is not part of the binary distribution. See [linking]({{ site.baseurl }}/dev/projectsetup/dependencies.html) for
+Note that Gelly is not part of the binary distribution. See [linking]({{ site.baseurl }}/dev/project-configuration.html) for
instructions on packaging Gelly libraries into Flink user programs.

The remaining sections provide a description of available methods and present several examples of how to use Gelly and how to mix it with the Flink DataSet API.
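
For a quick impression of that mix, a hedged sketch that builds a graph from an invented edge set and computes vertex out-degrees:

{% highlight java %}
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.graph.Edge;
import org.apache.flink.graph.Graph;
import org.apache.flink.types.LongValue;
import org.apache.flink.types.NullValue;

ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

// Invented edge set: a 3-cycle over vertices 1, 2, 3 with no edge values.
DataSet<Edge<Long, NullValue>> edges = env.fromElements(
        new Edge<>(1L, 2L, NullValue.getInstance()),
        new Edge<>(2L, 3L, NullValue.getInstance()),
        new Edge<>(3L, 1L, NullValue.getInstance()));

Graph<Long, NullValue, NullValue> graph = Graph.fromDataSet(edges, env);

// Out-degree per vertex, computed as a regular DataSet program.
DataSet<Tuple2<Long, LongValue>> outDegrees = graph.outDegrees();
outDegrees.print();
{% endhighlight %}
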
2 changes: 1 addition & 1 deletion docs/dev/libs/gelly/index.zh.md
@@ -63,7 +63,7 @@ Add the following dependency to your `pom.xml` to use Gelly.
</div>
</div>

-Note that Gelly is not part of the binary distribution. See [linking]({{ site.baseurl }}/dev/projectsetup/dependencies.html) for
+Note that Gelly is not part of the binary distribution. See [linking]({{ site.baseurl }}/dev/project-configuration.html) for
instructions on packaging Gelly libraries into Flink user programs.

The remaining sections provide a description of available methods and present several examples of how to use Gelly and how to mix it with the Flink DataSet API.
4 changes: 2 additions & 2 deletions docs/dev/project-configuration.md
@@ -120,8 +120,8 @@ We recommend packaging the application code and all its required dependencies in
we refer to as the *application jar*. The application jar can be submitted to an already running Flink cluster,
or added to a Flink application container image.

-Projects created from the [Java Project Template]({{ site.baseurl }}/dev/projectsetup/java_api_quickstart.html) or
-[Scala Project Template]({{ site.baseurl }}/dev/projectsetup/scala_api_quickstart.html) are configured to automatically include
+Projects created from the [Java Project Template]({{ site.baseurl }}/dev/project-configuration.html) or
+[Scala Project Template]({{ site.baseurl }}/dev/project-configuration) are configured to automatically include
the application dependencies into the application jar when running `mvn clean package`. For projects that are
not set up from those templates, we recommend adding the Maven Shade Plugin (as listed in the Appendix below)
to build the application jar with all required dependencies.
4 changes: 2 additions & 2 deletions docs/dev/project-configuration.zh.md
@@ -120,8 +120,8 @@ We recommend packaging the application code and all its required dependencies in
we refer to as the *application jar*. The application jar can be submitted to an already running Flink cluster,
or added to a Flink application container image.

-Projects created from the [Java Project Template]({{ site.baseurl }}/dev/projectsetup/java_api_quickstart.html) or
-[Scala Project Template]({{ site.baseurl }}/dev/projectsetup/scala_api_quickstart.html) are configured to automatically include
+Projects created from the [Java Project Template]({{ site.baseurl }}/dev/project-configuration.html) or
+[Scala Project Template]({{ site.baseurl }}/dev/project-configuration) are configured to automatically include
the application dependencies into the application jar when running `mvn clean package`. For projects that are
not set up from those templates, we recommend adding the Maven Shade Plugin (as listed in the Appendix below)
to build the application jar with all required dependencies.
2 changes: 1 addition & 1 deletion docs/dev/stream/state/queryable_state.md
@@ -180,7 +180,7 @@ jar which must be explicitly included as a dependency in the `pom.xml` of your p
{% endhighlight %}
</div>

-For more on this, you can check how to [set up a Flink program]({{ site.baseurl }}/dev/projectsetup/dependencies.html).
+For more on this, you can check how to [set up a Flink program]({{ site.baseurl }}/dev/project-configuration.html).

The `QueryableStateClient` will submit your query to the internal proxy, which will then process your query and return
the final result. The only requirement to initialize the client is to provide a valid `TaskManager` hostname (remember
that a queryable state proxy runs on every task manager) and the port on which the proxy listens.
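
Putting this together, a hedged sketch of creating a client and querying a `ValueState`; the hostname, job id, state name, and key are placeholders, and 9069 is the default proxy port:

{% highlight java %}
import java.util.concurrent.CompletableFuture;

import org.apache.flink.api.common.JobID;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.queryablestate.client.QueryableStateClient;

// The hostname is a placeholder; 9069 is the default proxy port.
QueryableStateClient client = new QueryableStateClient("taskmanager-host", 9069);

// Must match the descriptor under which the state was made queryable.
ValueStateDescriptor<Long> descriptor =
        new ValueStateDescriptor<>("average", Long.class);

CompletableFuture<ValueState<Long>> result = client.getKvState(
        JobID.fromHexString("<job id>"),   // placeholder job id
        "query-name",                      // queryable state name
        "some-key",                        // key to look up
        BasicTypeInfo.STRING_TYPE_INFO,    // type of the key
        descriptor);

result.thenAccept(state -> {
    try {
        System.out.println(state.value());
    } catch (Exception e) {
        e.printStackTrace();
    }
});
{% endhighlight %}
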
2 changes: 1 addition & 1 deletion docs/dev/stream/state/queryable_state.zh.md
@@ -149,7 +149,7 @@ descriptor.setQueryable("query-name"); // queryable state name
{% endhighlight %}
</div>

-For more on dependencies, see how to [set up a Flink project]({{ site.baseurl }}/zh/getting-started/project-setup/dependencies.html).
+For more on dependencies, see how to [set up a Flink project]({{ site.baseurl }}/zh/dev/project-configuration.html).

The `QueryableStateClient` will submit your request to the internal proxy, which processes it and returns the result. Initializing the client only requires a valid `TaskManager` hostname
(a queryable state proxy runs on every task manager) and the port the proxy listens on. How to configure the proxy and its port is covered in the [Configuration Section](#configuration).