forked from apache/spark
[SPARK-37750][SQL] ANSI mode: optionally return null result if element not exists in array/map

### What changes were proposed in this pull request?
Add a new configuration, `spark.sql.ansi.failOnElementNotExists`, which controls whether to throw an exception or return a null result when an element does not exist in the `[]` operator on array/map types. The default value of the new configuration is true.

### Why are the changes needed?
* Provide an alternative for Spark SQL users who rely on null results when an element does not exist in an array/map, e.g. `select .. where array[index] is not null` or `select .. where map[key] is not null`
* Map is not part of the ANSI SQL type system. There can be arguments that map[key] should return null if the key does not exist.

### Does this PR introduce _any_ user-facing change?
Yes, it provides a new option, `spark.sql.ansi.failOnElementNotExists`, which can optionally return a null result if an element does not exist in an array/map. However, the default behavior is not changed.

### How was this patch tested?
Unit tests

Closes apache#35031 from gengliangwang/safeAccess.

Authored-by: Gengliang Wang <[email protected]>
Signed-off-by: Gengliang Wang <[email protected]>
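The null-versus-error semantics that the configuration toggles can be sketched in plain Python. This is an illustrative analogy only, not Spark code; the function names, the `fail_on_error` parameter, and the error messages are hypothetical:

```python
def array_index(arr, index, fail_on_error=True):
    """Mimic Spark's ANSI [] operator on arrays (illustrative analogy only).

    With fail-on-error semantics (the default), an out-of-range index raises;
    with the opt-out enabled, it returns None (the analogue of SQL NULL).
    Negative indices are treated as out of range, matching the SQL operator
    rather than Python list indexing.
    """
    if 0 <= index < len(arr):
        return arr[index]
    if fail_on_error:
        raise IndexError(f"Invalid index: {index}, numElements: {len(arr)}")
    return None


def map_value(mapping, key, fail_on_error=True):
    """Mimic Spark's ANSI [] operator on maps (illustrative analogy only)."""
    if key in mapping:
        return mapping[key]
    if fail_on_error:
        raise KeyError(f"Key {key} does not exist")
    return None


# Default behavior raises on a missing element; the opt-out returns None.
print(array_index([1, 2, 3], 5, fail_on_error=False))       # None
print(map_value({1: "a", 2: "b"}, 5, fail_on_error=False))  # None
```

This mirrors the commit's intent: one switch choosing between strict ANSI errors and the legacy null-returning behavior, without affecting functions like `element_at`.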
1 parent 3a2da70 · commit 08fd501

Showing 9 changed files with 182 additions and 21 deletions.
sql/core/src/test/resources/sql-tests/inputs/ansi/array.sql (16 additions & 0 deletions)
```sql
@@ -1 +1,17 @@
--IMPORT array.sql

-- index out of range for array elements
-- return null results if array index in [] operator is out of bound
set spark.sql.ansi.strictIndexOperator=false;
select array(1, 2, 3)[5];
select array(1, 2, 3)[-1];

-- the configuration spark.sql.ansi.strictIndexOperator doesn't affect the function element_at
select element_at(array(1, 2, 3), 5);
select element_at(array(1, 2, 3), -5);
select element_at(array(1, 2, 3), 0);

-- the configuration spark.sql.ansi.strictIndexOperator doesn't affect the function elt
select elt(4, '123', '456');
select elt(0, '123', '456');
select elt(-1, '123', '456');
```
sql/core/src/test/resources/sql-tests/inputs/ansi/map.sql

```sql
@@ -1 +1,8 @@
--IMPORT map.sql

-- key does not exist
-- return null results if the map key in [] operator doesn't exist
set spark.sql.ansi.strictIndexOperator=false;
select map(1, 'a', 2, 'b')[5];
-- the configuration spark.sql.ansi.strictIndexOperator doesn't affect the function element_at
select element_at(map(1, 'a', 2, 'b'), 5);
```