[FLINK-25565][Formats][Parquet] timestamp int64 option tidy up (apach…
tweise authored Dec 11, 2023
1 parent 3aa70df commit 548e4b5
Showing 2 changed files with 6 additions and 5 deletions.
5 changes: 3 additions & 2 deletions docs/content/docs/connectors/table/formats/parquet.md
@@ -107,9 +107,10 @@ For example, you can configure `parquet.compression=GZIP` to enable gzip compres
 Data Type Mapping
 ----------------
 
-Currently, Parquet format type mapping is compatible with Apache Hive, but different with Apache Spark:
+Currently, Parquet format type mapping is compatible with Apache Hive, but by default not with Apache Spark:
 
 - Timestamp: mapping timestamp type to int96 whatever the precision is.
+- Spark compatibility requires int64 via config option `timestamp.time.unit` (see above).
 - Decimal: mapping decimal type to fixed length byte array according to the precision.
 
 The following table lists the type mapping from Flink type to Parquet type.
@@ -185,7 +186,7 @@ The following table lists the type mapping from Flink type to Parquet type.
 </tr>
 <tr>
 <td>TIMESTAMP</td>
-<td>INT96</td>
+<td>INT96 (or INT64)</td>
 <td></td>
 </tr>
 <tr>
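For context on how the new option is meant to be used, here is a hedged Table API sketch of a filesystem sink that requests INT64 timestamps from the Parquet format. The table name, schema, path, and option value are illustrative assumptions; only the key `timestamp.time.unit` is named by the doc change above, and the authoritative option names and defaults live in the options table that "(see above)" points to.

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.FormatDescriptor;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.TableDescriptor;
import org.apache.flink.table.api.TableEnvironment;

/** Hedged sketch: a filesystem sink asking the Parquet format for INT64 timestamps. */
public class ParquetInt64TimestampSketch {
    public static void main(String[] args) {
        TableEnvironment env = TableEnvironment.create(EnvironmentSettings.inBatchMode());

        env.createTemporaryTable(
                "spark_compatible_sink", // hypothetical table name
                TableDescriptor.forConnector("filesystem")
                        .schema(
                                Schema.newBuilder()
                                        .column("id", DataTypes.BIGINT())
                                        .column("event_time", DataTypes.TIMESTAMP(6))
                                        .build())
                        .option("path", "file:///tmp/parquet-out") // hypothetical path
                        .format(
                                FormatDescriptor.forFormat("parquet")
                                        // Assumed key and value; verify against the
                                        // format's options table referenced above.
                                        .option("timestamp.time.unit", "micros")
                                        .build())
                        .build());
    }
}
```

If the option is exposed this way, the FormatDescriptor builder prefixes it with the format name, so it ends up alongside pass-through settings such as the `parquet.compression` example from the hunk context above.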
6 changes: 3 additions & 3 deletions
@@ -159,8 +159,8 @@ public static TimestampData int96ToTimestamp(
 
     public static TimestampData int64ToTimestamp(
             boolean utcTimestamp, long value, LogicalTypeAnnotation.TimeUnit timeUnit) {
-        long nanosOfMillisecond = 0L;
-        long milliseconds = 0L;
+        final long nanosOfMillisecond;
+        final long milliseconds;
 
         switch (timeUnit) {
             case MILLIS:
@@ -176,7 +176,7 @@ public static TimestampData int64ToTimestamp(
                 nanosOfMillisecond = value % NANOS_PER_SECOND;
                 break;
             default:
-                break;
+                throw new IllegalArgumentException("Invalid time unit: " + timeUnit);
         }
 
         if (utcTimestamp) {
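The patched conversion maps an INT64 value plus its Parquet time unit onto milliseconds and nanos-of-millisecond, and now fails fast on an unknown unit instead of silently falling through with zeroed defaults. Below is a minimal, dependency-free sketch of the same idea, using java.time.Instant in place of Flink's TimestampData and a stand-in enum for LogicalTypeAnnotation.TimeUnit; the real method additionally branches on a utcTimestamp flag, which the sketch omits. It is an illustration under those assumptions, not the Flink implementation.

```java
import java.time.Instant;

/** Illustrative sketch: decode a Parquet INT64 timestamp for the three supported time units. */
public class Int64TimestampSketch {

    /** Stand-in for org.apache.parquet.schema.LogicalTypeAnnotation.TimeUnit. */
    enum TimeUnit { MILLIS, MICROS, NANOS }

    static Instant int64ToInstant(long value, TimeUnit timeUnit) {
        switch (timeUnit) {
            case MILLIS:
                return Instant.ofEpochMilli(value);
            case MICROS:
                // Whole seconds plus the remaining microseconds expressed as nanos.
                return Instant.ofEpochSecond(
                        Math.floorDiv(value, 1_000_000L),
                        Math.floorMod(value, 1_000_000L) * 1_000L);
            case NANOS:
                return Instant.ofEpochSecond(
                        Math.floorDiv(value, 1_000_000_000L),
                        Math.floorMod(value, 1_000_000_000L));
            default:
                // Mirrors the patched behaviour: fail loudly rather than return a zeroed value.
                throw new IllegalArgumentException("Invalid time unit: " + timeUnit);
        }
    }

    public static void main(String[] args) {
        // 2023-12-11T00:00:00Z expressed as microseconds since the epoch.
        System.out.println(int64ToInstant(1_702_252_800_000_000L, TimeUnit.MICROS));
    }
}
```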
