[FLINK-8354] Update Kafka doc for new KafkaDeserializationSchema
aljoscha committed Feb 22, 2019
1 parent 9e64951 commit c6d9b37
Showing 1 changed file with 3 additions and 3 deletions.
docs/dev/connectors/kafka.md: 3 additions & 3 deletions
@@ -159,7 +159,7 @@
or just `FlinkKafkaConsumer` for Kafka >= 1.0.0 versions). It provides access to one or more Kafka topics.
The constructor accepts the following arguments:

1. The topic name / list of topic names
-2. A DeserializationSchema / KeyedDeserializationSchema for deserializing the data from Kafka
+2. A DeserializationSchema / KafkaDeserializationSchema for deserializing the data from Kafka
3. Properties for the Kafka consumer.
The following properties are required:
- "bootstrap.servers" (comma separated list of Kafka brokers)
@@ -204,8 +204,8 @@
It is usually helpful to start from the `AbstractDeserializationSchema`, which takes care of describing the
produced Java/Scala type to Flink's type system. Users that implement a vanilla `DeserializationSchema` need
to implement the `getProducedType(...)` method themselves.
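
As an illustration, a minimal sketch of a schema built on `AbstractDeserializationSchema`; the class name and the UTF-8 decoding are assumptions for the example, and the produced type is derived from the `String` type parameter, so `getProducedType(...)` is not needed:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import org.apache.flink.api.common.serialization.AbstractDeserializationSchema;

// The String type parameter lets AbstractDeserializationSchema describe
// the produced type to Flink's type system on our behalf.
public class Utf8StringSchema extends AbstractDeserializationSchema<String> {
    @Override
    public String deserialize(byte[] message) throws IOException {
        return new String(message, StandardCharsets.UTF_8);
    }
}
```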

-For accessing both the key and value of the Kafka message, the `KeyedDeserializationSchema` has
-the following deserialize method `T deserialize(byte[] messageKey, byte[] message, String topic, int partition, long offset)`.
+For accessing the key, value and metadata of the Kafka message, the `KafkaDeserializationSchema` has
+the following deserialize method `T deserialize(ConsumerRecord<byte[], byte[]> record)`.
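
A minimal sketch of implementing this interface; the class name is an assumption for the example, and it simply decodes both key and value as UTF-8 strings while never ending the stream:

```java
import java.nio.charset.StandardCharsets;

import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.connectors.kafka.KafkaDeserializationSchema;
import org.apache.kafka.clients.consumer.ConsumerRecord;

public class KeyValueDeserializationSchema
        implements KafkaDeserializationSchema<Tuple2<String, String>> {

    @Override
    public boolean isEndOfStream(Tuple2<String, String> nextElement) {
        // Keep consuming indefinitely; no element marks the end of the stream.
        return false;
    }

    @Override
    public Tuple2<String, String> deserialize(ConsumerRecord<byte[], byte[]> record) {
        // The ConsumerRecord also exposes metadata such as record.topic(),
        // record.partition(), and record.offset().
        String key = record.key() == null
                ? null : new String(record.key(), StandardCharsets.UTF_8);
        String value = record.value() == null
                ? null : new String(record.value(), StandardCharsets.UTF_8);
        return new Tuple2<>(key, value);
    }

    @Override
    public TypeInformation<Tuple2<String, String>> getProducedType() {
        return TypeInformation.of(new TypeHint<Tuple2<String, String>>() {});
    }
}
```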

For convenience, Flink provides the following schemas:

