SPARK-2400: fix spark.yarn.max.executor.failures explanation
According to
```scala
  private val maxNumExecutorFailures = sparkConf.getInt("spark.yarn.max.executor.failures",
    sparkConf.getInt("spark.yarn.max.worker.failures", math.max(args.numExecutors * 2, 3)))
```
the default value should be numExecutors * 2, with a minimum of 3, and the same default applies to the older config
`spark.yarn.max.worker.failures`.
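
For illustration, a minimal sketch of how that nested fallback resolves when neither key is set; the concrete values for `sparkConf` and `numExecutors` below are hypothetical and only stand in for `args.numExecutors`:
```scala
import org.apache.spark.SparkConf

// Hypothetical setup: neither config key is set, so both lookups fall through.
val sparkConf = new SparkConf()
val numExecutors = 2 // stands in for args.numExecutors

val maxNumExecutorFailures = sparkConf.getInt("spark.yarn.max.executor.failures",
  sparkConf.getInt("spark.yarn.max.worker.failures", math.max(numExecutors * 2, 3)))

// numExecutors = 2 => math.max(4, 3) = 4
// numExecutors = 1 => math.max(2, 3) = 3
println(maxNumExecutorFailures) // prints 4 with the values above
```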

Author: CrazyJvm <[email protected]>

Closes apache#1282 from CrazyJvm/yarn-doc and squashes the following commits:

1a5f25b [CrazyJvm] remove deprecated config
c438aec [CrazyJvm] fix style
86effa6 [CrazyJvm] change expression
211f130 [CrazyJvm] fix html tag
2900d23 [CrazyJvm] fix style
a4b2e27 [CrazyJvm] fix configuration spark.yarn.max.executor.failures
CrazyJvm authored and tgravescs committed Jul 8, 2014
1 parent c8a2313 commit b520b64
Showing 1 changed file with 1 addition and 1 deletion.
docs/running-on-yarn.md: 1 addition & 1 deletion

```diff
@@ -55,7 +55,7 @@ Most of the configs are the same for Spark on YARN as for other deployment modes
 </tr>
 <tr>
   <td><code>spark.yarn.max.executor.failures</code></td>
-  <td>2*numExecutors</td>
+  <td>numExecutors * 2, with minimum of 3</td>
   <td>
     The maximum number of executor failures before failing the application.
   </td>
```
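
As a usage note (not part of this change), the documented setting can also be set explicitly rather than relying on the default; a minimal sketch, where the value 10 and the app name are only examples:
```scala
import org.apache.spark.SparkConf

// Override the documented default (numExecutors * 2, with a minimum of 3).
val conf = new SparkConf()
  .setAppName("yarn-failure-tolerance-example") // hypothetical app name
  .set("spark.yarn.max.executor.failures", "10") // example value
```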
