[SPARK-38139][PYTHON][ML][TESTS] Adjust tolerance in ml.recommendation.ALS doctest

### What changes were proposed in this pull request?

This PR reduces precision of the result in `pyspark.ml.recommendation.ALS` doctest to four decimal digits.

### Why are the changes needed?

In certain configurations, ALS consistently converges to a value slightly below `0.69291`, causing repeated failures of this doctest.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Existing tests, executed with configuration where this test normally fails.

Closes apache#35503 from zero323/SPARK-38139.

Authored-by: zero323 <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
zero323 authored and HyukjinKwon committed Feb 13, 2022
1 parent 25dd425 commit 1c0793a
Showing 1 changed file with 1 addition and 1 deletion.
```diff
--- a/python/pyspark/ml/recommendation.py
+++ b/python/pyspark/ml/recommendation.py
@@ -320,7 +320,7 @@ class ALS(JavaEstimator, _ALSParams, JavaMLWritable, JavaMLReadable):
 >>> test = spark.createDataFrame([(0, 2), (1, 0), (2, 0)], ["user", "item"])
 >>> predictions = sorted(model.transform(test).collect(), key=lambda r: r[0])
 >>> predictions[0]
-Row(user=0, item=2, newPrediction=0.69291...)
+Row(user=0, item=2, newPrediction=0.6929...)
 >>> predictions[1]
 Row(user=1, item=0, newPrediction=3.47356...)
 >>> predictions[2]
```
