[SPARK-37072][CORE] Pass all UTs in repl with Java 17
### What changes were proposed in this pull request?
Nine UTs in the `repl` module fail when run with Java 17, and all of the failed tests have a similar error stack:

```
java.lang.IllegalAccessException: Can not set final $iw field $Lambda$2879/0x000000080188b928.arg$1 to $iw
    at java.base/jdk.internal.reflect.UnsafeFieldAccessorImpl.throwFinalFieldIllegalAccessException(UnsafeFieldAccessorImpl.java:76)
    at java.base/jdk.internal.reflect.UnsafeFieldAccessorImpl.throwFinalFieldIllegalAccessException(UnsafeFieldAccessorImpl.java:80)
    at java.base/jdk.internal.reflect.UnsafeQualifiedObjectFieldAccessorImpl.set(UnsafeQualifiedObjectFieldAccessorImpl.java:79)
    at java.base/java.lang.reflect.Field.set(Field.java:799)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:398)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2490)
    at org.apache.spark.rdd.RDD.$anonfun$map$1(RDD.scala:414)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:406)
    at org.apache.spark.rdd.RDD.map(RDD.scala:413)
    ... 95 elided

```

The error stack shows that on Java 17 we cannot reset a read-only field through the reflection API; here the read-only field is a `private final` field with `trustedFinal=true`.

So this PR removes the `final` modifier from `outerField` before the `outerField.set(func, clonedOuterThis)` call in the `ClosureCleaner#clean` method, and restores the modifier after the call, when running on Java 17.

At the same time, in order to use the internal API of the `java.lang.reflect` package, `--add-opens=java.base/java.lang.reflect=ALL-UNNAMED` is added to `pom.xml`, `SparkBuild.scala`, and `JavaModuleOptions.java`.
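For context, the sketch below is a minimal standalone illustration of reflective writes to final fields (the `Capture` class and the `arg$1` field name are hypothetical, chosen to mirror the synthetic capture field of a lambda). On an ordinary class this write succeeds even on Java 17; the lambda classes that `ClosureCleaner` touches mark the captured field as trusted final, so the same call throws the `IllegalAccessException` shown above, which is why the modifier-clearing workaround is needed there.

```java
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;

public class FinalFieldDemo {
    // A plain class standing in for the synthetic lambda class; `arg$1`
    // mimics the name compilers give the first captured value.
    static class Capture {
        private final String arg$1;
        Capture(String v) { this.arg$1 = v; }
    }

    public static void main(String[] args) throws Exception {
        Capture c = new Capture("outer");
        Field f = Capture.class.getDeclaredField("arg$1");
        // The field really is final at the VM level.
        System.out.println(Modifier.isFinal(f.getModifiers()));
        f.setAccessible(true);
        // On a non-hidden, non-record class this write is permitted on
        // Java 17; for a trusted-final lambda capture field this same
        // call would throw IllegalAccessException instead.
        f.set(c, "clonedOuter");
        System.out.println(f.get(c));
    }
}
```

Reading the new value back through reflection (rather than a direct field access) avoids any constant inlining done by `javac`.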

### Why are the changes needed?
Pass UTs with Java 17.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?

- Pass the Jenkins or GitHub Actions pipeline
- Manual test using a Java 17 build + Java 17 run (both openjdk version "17" 2021-09-14 LTS)

Use Java 17 to run the following two commands:

```
mvn clean install -pl repl -am -DskipTests
mvn test -pl repl
```

**Before**

```
Run completed in 30 seconds, 826 milliseconds.
Total number of tests run: 42
Suites: completed 6, aborted 0
Tests: succeeded 33, failed 9, canceled 0, ignored 0, pending 0
*** 9 TESTS FAILED ***
```

**After**

```
Run completed in 45 seconds, 86 milliseconds.
Total number of tests run: 44
Suites: completed 7, aborted 0
Tests: succeeded 44, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
```

- Manual test using a Java 8 build + Java 17 run

Use Java 8 (openjdk version "1.8.0_292") to run the build command:
```
mvn clean install -pl repl -am -DskipTests
```

And Java 17 (openjdk version "17" 2021-09-14 LTS) to run the test command:
```
mvn test -pl repl
```

**Before**
```
Run completed in 41 seconds, 855 milliseconds.
Total number of tests run: 44
Suites: completed 7, aborted 0
Tests: succeeded 35, failed 9, canceled 0, ignored 0, pending 0
*** 9 TESTS FAILED ***

```

**After**

```
Run completed in 44 seconds, 596 milliseconds.
Total number of tests run: 44
Suites: completed 7, aborted 0
Tests: succeeded 44, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
```

Closes apache#34368 from LuciferYang/SPARK-37072.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
LuciferYang authored and dongjoon-hyun committed Oct 25, 2021
1 parent 2362a6f commit 86f080f
Showing 4 changed files with 32 additions and 1 deletion.
30 changes: 29 additions & 1 deletion core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala
@@ -19,11 +19,12 @@ package org.apache.spark.util

import java.io.{ByteArrayInputStream, ByteArrayOutputStream}
import java.lang.invoke.{MethodHandleInfo, SerializedLambda}
import java.lang.reflect.{Field, Modifier}

import scala.collection.JavaConverters._
import scala.collection.mutable.{Map, Set, Stack}

import org.apache.commons.lang3.ClassUtils
import org.apache.commons.lang3.{ClassUtils, JavaVersion, SystemUtils}
import org.apache.xbean.asm9.{ClassReader, ClassVisitor, Handle, MethodVisitor, Type}
import org.apache.xbean.asm9.Opcodes._
import org.apache.xbean.asm9.tree.{ClassNode, MethodNode}
@@ -394,8 +395,17 @@ private[spark] object ClosureCleaner extends Logging {
parent = null, outerThis, capturingClass, accessedFields)

val outerField = func.getClass.getDeclaredField("arg$1")
// SPARK-37072: When Java 17 is used and `outerField` is read-only,
// the content of `outerField` cannot be set by reflect api directly.
// But we can remove the `final` modifier of `outerField` before set value
// and reset the modifier after set value.
val modifiersField = getFinalModifiersFieldForJava17(outerField)
modifiersField
.foreach(m => m.setInt(outerField, outerField.getModifiers & ~Modifier.FINAL))
outerField.setAccessible(true)
outerField.set(func, clonedOuterThis)
modifiersField
.foreach(m => m.setInt(outerField, outerField.getModifiers | Modifier.FINAL))
}
}

@@ -407,6 +417,24 @@
}
}

/**
* This method is used to get the final modifier field when on Java 17.
*/
private def getFinalModifiersFieldForJava17(field: Field): Option[Field] = {
if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_17) &&
Modifier.isFinal(field.getModifiers)) {
val methodGetDeclaredFields0 = classOf[Class[_]]
.getDeclaredMethod("getDeclaredFields0", classOf[Boolean])
methodGetDeclaredFields0.setAccessible(true)
val fields = methodGetDeclaredFields0.invoke(classOf[Field], false.asInstanceOf[Object])
.asInstanceOf[Array[Field]]
val modifiersFieldOption = fields.find(field => "modifiers".equals(field.getName))
require(modifiersFieldOption.isDefined)
modifiersFieldOption.foreach(_.setAccessible(true))
modifiersFieldOption
} else None
}

private def ensureSerializable(func: AnyRef): Unit = {
try {
if (SparkEnv.get != null) {
@@ -27,6 +27,7 @@ public class JavaModuleOptions {
"-XX:+IgnoreUnrecognizedVMOptions",
"--add-opens=java.base/java.lang=ALL-UNNAMED",
"--add-opens=java.base/java.lang.invoke=ALL-UNNAMED",
"--add-opens=java.base/java.lang.reflect=ALL-UNNAMED",
"--add-opens=java.base/java.io=ALL-UNNAMED",
"--add-opens=java.base/java.net=ALL-UNNAMED",
"--add-opens=java.base/java.nio=ALL-UNNAMED",
1 change: 1 addition & 0 deletions pom.xml
@@ -294,6 +294,7 @@
-XX:+IgnoreUnrecognizedVMOptions
--add-opens=java.base/java.lang=ALL-UNNAMED
--add-opens=java.base/java.lang.invoke=ALL-UNNAMED
--add-opens=java.base/java.lang.reflect=ALL-UNNAMED
--add-opens=java.base/java.io=ALL-UNNAMED
--add-opens=java.base/java.net=ALL-UNNAMED
--add-opens=java.base/java.nio=ALL-UNNAMED
1 change: 1 addition & 0 deletions project/SparkBuild.scala
@@ -1131,6 +1131,7 @@ object TestSettings {
val extraTestJavaArgs = Array("-XX:+IgnoreUnrecognizedVMOptions",
"--add-opens=java.base/java.lang=ALL-UNNAMED",
"--add-opens=java.base/java.lang.invoke=ALL-UNNAMED",
"--add-opens=java.base/java.lang.reflect=ALL-UNNAMED",
"--add-opens=java.base/java.io=ALL-UNNAMED",
"--add-opens=java.base/java.net=ALL-UNNAMED",
"--add-opens=java.base/java.nio=ALL-UNNAMED",
