[FLINK-2559] Clean up JavaDocs
- Remove broken HTML tags like <br/>, <p/>, ...
- Close unclosed HTML tags
- Replace special characters by HTML escaping, e.g., '<' by &lt;
- Wrap code examples in {@code}
- Fix incorrect @see and @link references
- Fix incorrect @throws declarations
- Fix typos

This closes apache#1298
hczerpak authored and fhueske committed Oct 29, 2015
1 parent ec7bf50 commit 680b5a9
Showing 151 changed files with 513 additions and 570 deletions.
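
For illustration only, a hypothetical class (not part of this commit) whose JavaDoc follows the conventions listed in the commit message could look like this: generic types wrapped in {@code}, special characters HTML-escaped, self-closing tags avoided, and one @throws tag per exception.

import java.io.File;
import java.io.FileNotFoundException;

/**
 * Hypothetical example of the JavaDoc style applied by this commit. Generic types are
 * wrapped as {@code Tuple2<String, Integer>}, special characters are escaped as in
 * <code>WordCount &lt;text path&gt; &lt;result path&gt;</code>, and self-closing tags such as
 * {@literal <p/>} and {@literal <br/>} are replaced by {@literal <p>} and {@literal <br>}.
 */
public class JavadocStyleExample {

    /**
     * Opens the given input file.
     *
     * @param inputPath path to a plain text file with lines separated by newline characters
     * @throws FileNotFoundException if the input file does not exist
     */
    public void open(String inputPath) throws FileNotFoundException {
        if (inputPath == null || !new File(inputPath).exists()) {
            throw new FileNotFoundException("Input file not found: " + inputPath);
        }
    }
}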
@@ -696,7 +696,8 @@ protected int executeProgramBlocking(PackagedProgram program, Client client, int
* Creates a Packaged program from the given command line options.
*
* @return A PackagedProgram (upon success)
* @throws java.io.FileNotFoundException, org.apache.flink.client.program.ProgramInvocationException, java.lang.Throwable
* @throws java.io.FileNotFoundException
* @throws org.apache.flink.client.program.ProgramInvocationException
*/
protected PackagedProgram buildProgram(ProgramOptions options)
throws FileNotFoundException, ProgramInvocationException
@@ -28,19 +28,16 @@
* Implements the "Exclamation" program that attaches five exclamation mark to every line of a text files in a streaming
* fashion. The program is constructed as a regular {@link backtype.storm.generated.StormTopology} and submitted to
* Flink for execution in the same way as to a Storm {@link backtype.storm.LocalCluster}.
* <p/>
* <p>
* This example shows how to run program directly within Java, thus it cannot be used to submit a
* {@link backtype.storm.generated.StormTopology} via Flink command line clients (ie, bin/flink).
* <p/>
* <p/>
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* Usage: <code>ExclamationLocal &lt;text path&gt; &lt;result path&gt;</code><br/>
* <p>
* Usage: <code>ExclamationLocal &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from
* {@link org.apache.flink.examples.java.wordcount.util.WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>run a regular Storm program locally on Flink</li>
@@ -29,17 +29,14 @@

/**
* Implements the "Exclamation" program that attaches two exclamation marks to every line of a text files in a streaming
* fashion. The program is constructed as a regular {@link StormTopology}.
* <p/>
* <p/>
* fashion. The program is constructed as a regular {@link backtype.storm.generated.StormTopology}.
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage: <code>Exclamation[Local|RemoteByClient|RemoteBySubmitter] &lt;text path&gt;
* &lt;result path&gt;</code><br/>
* &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>construct a regular Storm topology as Flink program</li>
@@ -31,17 +31,14 @@

/**
* Implements the "Exclamation" program that attaches 3+x exclamation marks to every line of a text files in a streaming
* fashion. The program is constructed as a regular {@link StormTopology}.
* <p/>
* <p/>
* fashion. The program is constructed as a regular {@link backtype.storm.generated.StormTopology}.
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage:
* <code>ExclamationWithmBolt &lt;text path&gt; &lt;result path&gt; &lt;number of exclamation marks&gt;</code><br/>
* <code>ExclamationWithmBolt &lt;text path&gt; &lt;result path&gt; &lt;number of exclamation marks&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData} with x=2.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>use a Bolt within a Flink Streaming program</li>
@@ -32,16 +32,13 @@

/**
* Implements the "Exclamation" program that attaches six exclamation marks to every line of a text files in a streaming
* fashion. The program is constructed as a regular {@link StormTopology}.
* <p/>
* <p/>
* fashion. The program is constructed as a regular {@link backtype.storm.generated.StormTopology}.
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* Usage: <code>ExclamationWithSpout &lt;text path&gt; &lt;result path&gt;</code><br/>
* <p>
* Usage: <code>ExclamationWithSpout &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>use a Storm spout within a Flink Streaming program</li>
@@ -33,14 +33,14 @@

/**
* Implements a simple example with two declared output streams for the embedded spout.
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>handle multiple output stream of a spout</li>
* <li>accessing each stream by .split(...) and .select(...)</li>
* <li>strip wrapper data type SplitStreamType for further processing in Flink</li>
* </ul>
* <p/>
* <p>
* This example would work the same way for multiple bolt output streams.
*/
public class SpoutSplitExample {
@@ -30,15 +30,12 @@
/**
* Implements the "WordCount" program that computes a simple word occurrence histogram over text files in a streaming
* fashion. The tokenizer step is performed by a {@link IRichBolt Bolt}.
* <p/>
* <p/>
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage: <code>WordCount &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>use a Bolt within a Flink Streaming program.</li>
@@ -36,15 +36,12 @@
* Implements the "WordCount" program that computes a simple word occurrence histogram over text files in a streaming
* fashion. The tokenizer step is performed by a {@link IRichBolt Bolt}. In contrast to {@link BoltTokenizerWordCount}
* the tokenizer's input is a POJO type and the single field is accessed by name.
* <p/>
* <p/>
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage: <code>WordCount &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>how to access attributes by name within a Bolt for POJO type input streams
@@ -38,15 +38,12 @@
* Implements the "WordCount" program that computes a simple word occurrence histogram over text files in a streaming
* fashion. The tokenizer step is performed by a {@link IRichBolt Bolt}. In contrast to {@link BoltTokenizerWordCount}
* the tokenizer's input is a {@link Tuple} type and the single field is accessed by name.
* <p/>
* <p/>
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage: <code>WordCount &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>how to access attributes by name within a Bolt for {@link Tuple} type input streams
@@ -34,15 +34,12 @@
/**
* Implements the "WordCount" program that computes a simple word occurrence histogram over text files in a streaming
* fashion. The used data source is a {@link IRichSpout Spout}.
* <p/>
* <p/>
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage: <code>WordCount &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>use a Spout within a Flink Streaming program.</li>
@@ -89,7 +86,7 @@ public static void main(final String[] args) throws Exception {

/**
* Implements the string tokenizer that splits sentences into words as a user-defined FlatMapFunction. The function
* takes a line (String) and splits it into multiple pairs in the form of "(word,1)" (Tuple2<String, Integer>).
* takes a line (String) and splits it into multiple pairs in the form of "(word,1)" ({@code Tuple2<String, Integer>}).
*/
public static final class Tokenizer implements FlatMapFunction<String, Tuple2<String, Integer>> {
private static final long serialVersionUID = 1L;
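
The diff excerpt above only shows the class header of this Tokenizer; a minimal sketch of such a tokenizer (illustrative, not the file's actual body) looks like the following:

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

// Minimal sketch: split each input line into lower-case words and emit a (word, 1) pair per word.
public final class TokenizerSketch implements FlatMapFunction<String, Tuple2<String, Integer>> {
    private static final long serialVersionUID = 1L;

    @Override
    public void flatMap(String value, Collector<Tuple2<String, Integer>> out) {
        // normalize the line and split it into words
        for (String token : value.toLowerCase().split("\\W+")) {
            if (token.length() > 0) {
                // emit a (word, 1) pair for every non-empty token
                out.collect(new Tuple2<>(token, 1));
            }
        }
    }
}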
@@ -29,18 +29,15 @@
* Implements the "WordCount" program that computes a simple word occurrence histogram over text files in a streaming
* fashion. The program is constructed as a regular {@link StormTopology} and submitted to Flink for execution in the
* same way as to a Storm {@link LocalCluster}.
* <p/>
* <p>
* This example shows how to run program directly within Java, thus it cannot be used to submit a {@link StormTopology}
* via Flink command line clients (ie, bin/flink).
* <p/>
* <p/>
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage: <code>WordCountLocal &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>run a regular Storm program locally on Flink</li>
@@ -30,18 +30,15 @@
* fashion. The program is constructed as a regular {@link StormTopology} and submitted to Flink for execution in the
* same way as to a Storm {@link LocalCluster}. In contrast to {@link WordCountLocal} all bolts access the field of
* input tuples by name instead of index.
* <p/>
* <p>
* This example shows how to run program directly within Java, thus it cannot be used to submit a {@link StormTopology}
* via Flink command line clients (ie, bin/flink).
* <p/>
* <p/>
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage: <code>WordCountLocalByName &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>run a regular Storm program locally on Flink
@@ -33,18 +33,15 @@
* Implements the "WordCount" program that computes a simple word occurrence histogram over text files in a streaming
* fashion. The program is constructed as a regular {@link StormTopology} and submitted to Flink for execution in the
* same way as to a Storm cluster similar to {@link NimbusClient}. The Flink cluster can be local or remote.
* <p/>
* <p>
* This example shows how to submit the program via Java, thus it cannot be used to submit a {@link StormTopology} via
* Flink command line clients (ie, bin/flink).
* <p/>
* <p/>
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage: <code>WordCountRemoteByClient &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>submit a regular Storm program to a local or remote Flink cluster.</li>
@@ -30,17 +30,14 @@
* Implements the "WordCount" program that computes a simple word occurrence histogram over text files in a streaming
* fashion. The program is constructed as a regular {@link StormTopology} and submitted to Flink for execution in the
* same way as to a Storm cluster similar to {@link StormSubmitter}. The Flink cluster can be local or remote.
* <p/>
* <p>
* This example shows how to submit the program via Java as well as Flink's command line client (ie, bin/flink).
* <p/>
* <p/>
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage: <code>WordCountRemoteBySubmitter &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>submit a regular Storm program to a local or remote Flink cluster.</li>
@@ -36,16 +36,13 @@
/**
* Implements the "WordCount" program that computes a simple word occurrence histogram over text files in a streaming
* fashion. The program is constructed as a regular {@link StormTopology}.
* <p/>
* <p/>
* <p>
* The input is a plain text file with lines separated by newline characters.
* <p/>
* <p/>
* <p>
* Usage:
* <code>WordCount[Local|LocalByName|RemoteByClient|RemoteBySubmitter] &lt;text path&gt; &lt;result path&gt;</code><br>
* If no parameters are provided, the program is run with default data from {@link WordCountData}.
* <p/>
* <p/>
* <p>
* This example shows how to:
* <ul>
* <li>how to construct a regular Storm topology as Flink program</li>
@@ -138,7 +138,7 @@ public static FlinkClient getConfiguredClient(final Map conf) {

/**
* Return a reference to itself.
* <p/>
* <p>
* {@link FlinkClient} mimics both, {@link NimbusClient} and {@link Nimbus}{@code .Client}, at once.
*
* @return A reference to itself.
@@ -55,8 +55,8 @@
/**
* {@link FlinkTopologyBuilder} mimics a {@link TopologyBuilder}, but builds a Flink program instead of a Storm
* topology. Most methods (except {@link #createTopology()} are copied from the original {@link TopologyBuilder}
* implementation to ensure equal behavior.<br />
* <br />
* implementation to ensure equal behavior.<br>
* <br>
* <strong>CAUTION: {@link IRichStateSpout StateSpout}s are currently not supported.</strong>
*/
public class FlinkTopologyBuilder {
@@ -20,10 +20,11 @@
import org.apache.flink.streaming.api.datastream.DataStream;

/**
* Used by {@link org.apache.flink.storm.wrappers.AbstractStormCollector AbstractStormCollector} to wrap
* Used by org.apache.flink.storm.wrappers.AbstractStormCollector to wrap
* output tuples if multiple output streams are declared. For this case, the Flink output data stream must be split via
* {@link DataStream#split(org.apache.flink.streaming.api.collector.selector.OutputSelector) .split(...)} using
* {@link StormStreamSelector}.
*
*/
public class SplitStreamType<T> {
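
As a rough sketch of the split/select pattern this class supports (the stream names, data, and class name below are made up; it relies on the DataStream#split(OutputSelector) API referenced above):

import java.util.Collections;

import org.apache.flink.streaming.api.collector.selector.OutputSelector;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Rough sketch: route records to named output streams, then select one of them by name.
public class SplitSelectSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<Integer> numbers = env.fromElements(1, 2, 3, 4, 5);

        // declare an "even" and an "odd" output stream and pick the "even" one
        DataStream<Integer> evens = numbers
                .split(new OutputSelector<Integer>() {
                    @Override
                    public Iterable<String> select(Integer value) {
                        return Collections.singletonList(value % 2 == 0 ? "even" : "odd");
                    }
                })
                .select("even");

        evens.print();
        env.execute("split/select sketch");
    }
}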

@@ -42,8 +42,8 @@
* A {@link BoltWrapper} wraps an {@link IRichBolt} in order to execute the Storm bolt within a Flink Streaming
* program. It takes the Flink input tuples of type {@code IN} and transforms them into {@link StormTuple}s that the
* bolt can process. Furthermore, it takes the bolt's output tuples and transforms them into Flink tuples of type
* {@code OUT} (see {@link AbstractStormCollector} for supported types).<br />
* <br />
* {@code OUT} (see {@link AbstractStormCollector} for supported types).<br>
* <br>
* <strong>CAUTION: currently, only simple bolts are supported! (ie, bolts that do not use the Storm configuration
* <code>Map</code> or <code>TopologyContext</code> that is provided by the bolt's <code>open(..)</code> method.
* Furthermore, acking and failing of tuples as well as accessing tuple attributes by field names is not supported so
@@ -38,15 +38,15 @@
/**
* A {@link SpoutWrapper} wraps an {@link IRichSpout} in order to execute it within a Flink Streaming program. It
* takes the spout's output tuples and transforms them into Flink tuples of type {@code OUT} (see
* {@link SpoutCollector} for supported types).<br />
* <br />
* {@link SpoutCollector} for supported types).<br>
* <br>
* Per default, {@link SpoutWrapper} calls the wrapped spout's {@link IRichSpout#nextTuple() nextTuple()} method in
* an infinite loop.<br />
* an infinite loop.<br>
* Alternatively, {@link SpoutWrapper} can call {@link IRichSpout#nextTuple() nextTuple()} for a finite number of
* times and terminate automatically afterwards (for finite input streams). The number of {@code nextTuple()} calls can
* be specified as a certain number of invocations or can be undefined. In the undefined case, {@link SpoutWrapper}
* terminates if no record was emitted to the output collector for the first time during a call to
* {@link IRichSpout#nextTuple() nextTuple()}.<br />
* {@link IRichSpout#nextTuple() nextTuple()}.<br>
* If the given spout implements {@link FiniteSpout} interface and {@link #numberOfInvocations} is not provided or
* is {@code null}, {@link SpoutWrapper} calls {@link IRichSpout#nextTuple() nextTuple()} method until
* {@link FiniteSpout#reachedEnd()} returns true.
@@ -258,7 +258,7 @@ public final void run(final SourceContext<OUT> ctx) throws Exception {

/**
* {@inheritDoc}
* <p/>
* <p>
* Sets the {@link #isRunning} flag to {@code false}.
*/
@Override
@@ -22,7 +22,7 @@

/**
* Entities which have been parsed out of the text of the
* {@link package org.apache.flink.contrib.tweetinputformat.model.tweet.Tweet}.
* {@link org.apache.flink.contrib.tweetinputformat.model.tweet.Tweet}.
*/
public class Entities {
