[SPARK-28708][SQL] IsolatedClientLoader will not load hive classes from application jars on JDK9+

## What changes were proposed in this pull request?

We have 8 test cases in `HiveSparkSubmitSuite` that still fail with `java.lang.ClassNotFoundException` when running on JDK9+:

```
[info] - SPARK-18989: DESC TABLE should not fail with format class not found *** FAILED *** (9 seconds, 927 milliseconds)
[info]   spark-submit returned with exit code 1.
[info]   Command line: './bin/spark-submit' '--class' 'org.apache.spark.sql.hive.SPARK_18989_CREATE_TABLE' '--name' 'SPARK-18947' '--master' 'local-cluster[2,1,1024]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--jars' '/root/.m2/repository/org/apache/hive/hive-contrib/2.3.6-SNAPSHOT/hive-contrib-2.3.6-SNAPSHOT.jar' 'file:/root/opensource/spark/target/tmp/spark-36d27542-7b82-4962-a362-bb51ef3e457d/testJar-1565682620744.jar'
[info]
[info]   2019-08-13 00:50:22.073 - stderr> WARNING: An illegal reflective access operation has occurred
[info]   2019-08-13 00:50:22.073 - stderr> WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/root/opensource/spark/common/unsafe/target/scala-2.12/classes/) to constructor java.nio.DirectByteBuffer(long,int)
[info]   2019-08-13 00:50:22.073 - stderr> WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
[info]   2019-08-13 00:50:22.073 - stderr> WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
[info]   2019-08-13 00:50:22.073 - stderr> WARNING: All illegal access operations will be denied in a future release
[info]   2019-08-13 00:50:28.31 - stderr> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/metadata/HiveException
[info]   2019-08-13 00:50:28.31 - stderr> 	at java.base/java.lang.Class.getDeclaredConstructors0(Native Method)
[info]   2019-08-13 00:50:28.31 - stderr> 	at java.base/java.lang.Class.privateGetDeclaredConstructors(Class.java:3138)
[info]   2019-08-13 00:50:28.31 - stderr> 	at java.base/java.lang.Class.getConstructors(Class.java:1944)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:294)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:410)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:305)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:68)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:67)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:221)
[info]   2019-08-13 00:50:28.31 - stderr> 	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:99)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:221)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:139)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:129)
[info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:42)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$1(HiveSessionStateBuilder.scala:57)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:91)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:91)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.databaseExists(SessionCatalog.scala:244)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.requireDbExists(SessionCatalog.scala:178)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:317)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.command.CreateTableCommand.run(tables.scala:132)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:213)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3431)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$4(SQLExecution.scala:100)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:87)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3427)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:213)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:95)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:653)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.hive.SPARK_18989_CREATE_TABLE$.main(HiveSparkSubmitSuite.scala:829)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.hive.SPARK_18989_CREATE_TABLE.main(HiveSparkSubmitSuite.scala)
[info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:920)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:179)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:202)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:89)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:999)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1008)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
[info]   2019-08-13 00:50:28.311 - stderr> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.metadata.HiveException
[info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
[info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:250)
[info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:239)
[info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
[info]   2019-08-13 00:50:28.311 - stderr> 	... 48 more
```

Note that this PR fixes the `java.lang.ClassNotFoundException`, but the test will then fail for a different reason: a Hive-side `java.lang.ClassCastException`, which will be resolved in the official Hive 2.3.6 release.

```
[info] - SPARK-18989: DESC TABLE should not fail with format class not found *** FAILED *** (7 seconds, 649 milliseconds)
[info]   spark-submit returned with exit code 1.
[info]   Command line: './bin/spark-submit' '--class' 'org.apache.spark.sql.hive.SPARK_18989_CREATE_TABLE' '--name' 'SPARK-18947' '--master' 'local-cluster[2,1,1024]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--jars' '/Users/dongjoon/.ivy2/cache/org.apache.hive/hive-contrib/jars/hive-contrib-2.3.5.jar' 'file:/Users/dongjoon/PRS/PR-25429/target/tmp/spark-48b7c936-0ec2-4311-9fb5-0de4bf86a0eb/testJar-1565710418275.jar'
[info]
[info]   2019-08-13 08:33:39.221 - stderr> WARNING: An illegal reflective access operation has occurred
[info]   2019-08-13 08:33:39.221 - stderr> WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/Users/dongjoon/PRS/PR-25429/common/unsafe/target/scala-2.12/classes/) to constructor java.nio.DirectByteBuffer(long,int)
[info]   2019-08-13 08:33:39.221 - stderr> WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
[info]   2019-08-13 08:33:39.221 - stderr> WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
[info]   2019-08-13 08:33:39.221 - stderr> WARNING: All illegal access operations will be denied in a future release
[info]   2019-08-13 08:33:43.59 - stderr> Exception in thread "main" org.apache.spark.sql.AnalysisException: java.lang.ClassCastException: class jdk.internal.loader.ClassLoaders$AppClassLoader cannot be cast to class java.net.URLClassLoader (jdk.internal.loader.ClassLoaders$AppClassLoader and java.net.URLClassLoader are in module java.base of loader 'bootstrap');
[info]   2019-08-13 08:33:43.59 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:109)
```

## How was this patch tested?

Manual tests:

1. Install [Hive 2.3.6-SNAPSHOT](https://github.com/wangyum/hive/tree/HIVE-21584-branch-2.3) to the local Maven repository:
   ```
   mvn clean install -DskipTests=true
   ```
2. Upgrade our built-in Hive to 2.3.6-SNAPSHOT; you can check out [this branch](https://github.com/wangyum/spark/tree/SPARK-28708-Hive-2.3.6) to test.
3. Test with hadoop-3.2:
   ```
   build/sbt "hive/test-only *.HiveSparkSubmitSuite" -Phive -Phadoop-3.2 -Phive-thriftserver
   ...
   [info] Run completed in 3 minutes, 8 seconds.
   [info] Total number of tests run: 11
   [info] Suites: completed 1, aborted 0
   [info] Tests: succeeded 11, failed 0, canceled 3, ignored 0, pending 0
   [info] All tests passed.
   ```

Closes apache#25429 from wangyum/SPARK-28708.

Authored-by: Yuming Wang <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
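As background for the `ClassCastException` in the second log: on JDK 8 the application class loader extends `java.net.URLClassLoader`, but since JDK 9 it is the internal `jdk.internal.loader.ClassLoaders$AppClassLoader`, which does not, so any code that unconditionally casts the system class loader fails at runtime. The probe below is a minimal, hypothetical illustration of the version-safe check (`ClassLoaderProbe` is not an actual Spark or Hive class):

```java
import java.net.URLClassLoader;

public class ClassLoaderProbe {
    // Returns true only when the given loader can safely be cast to
    // URLClassLoader. This holds for the application class loader on
    // JDK 8 but not on JDK 9+.
    public static boolean isUrlClassLoader(ClassLoader cl) {
        return cl instanceof URLClassLoader;
    }

    public static void main(String[] args) {
        ClassLoader cl = ClassLoader.getSystemClassLoader();
        if (isUrlClassLoader(cl)) {
            // JDK 8 style: the cast is safe and the classpath URLs are visible.
            System.out.println("URLClassLoader with "
                + ((URLClassLoader) cl).getURLs().length + " classpath entries");
        } else {
            // JDK 9+: an unconditional cast here would throw the
            // ClassCastException shown in the second log above.
            System.out.println("Not a URLClassLoader: " + cl.getClass().getName());
        }
    }
}
```

The Hive 2.3.6 line addresses the second failure by avoiding such unconditional casts (see the HIVE-21584 branch linked in the testing steps), which is why upgrading the built-in Hive resolves it.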