[LIVY-398][DOC] Update REST API to reflect the changes of shared context session

LIVY-194 changed the semantics of sessions in Livy and added support for shared contexts in Livy sessions, so this change updates the REST API docs to reflect the changes of LIVY-194.

Author: jerryshao <[email protected]>

Closes apache#47 from jerryshao/LIVY-398.
jerryshao committed Sep 22, 2017
1 parent 219fdac commit 1bd92b9
Showing 1 changed file with 30 additions and 15 deletions.
45 changes: 30 additions & 15 deletions docs/rest-api.md
@@ -78,7 +78,7 @@ Creates a new interactive Scala, Python, or R shell in the cluster.
<tr><th>Name</th><th>Description</th><th>Type</th></tr>
<tr>
<td>kind</td>
<td>The session kind (required)</td>
<td>The session kind<sup><a href="#footnote1">[1]</a></sup></td>
<td><a href="#session-kind">session kind</a></td>
</tr>
<tr>
@@ -153,6 +153,10 @@ Creates a new interactive Scala, Python, or R shell in the cluster.
</tr>
</table>

<a id="footnote1">1</a>: Starting with version 0.5.0-incubating, this field is not required. To be
compatible with previous versions, users can still set this to spark, pyspark or sparkr,
implying that the submitted code snippets are of the corresponding kind.
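To make the change concrete, here is a minimal sketch of the two session-creation request bodies one could POST to the `/sessions` endpoint (the field values are illustrative assumptions, not prescribed by this doc):

```python
import json

# Sketch of request bodies for POST /sessions.
# Since 0.5.0-incubating, "kind" may be omitted to get a shared-context session:
shared_session = {}
# For compatibility with earlier versions, "kind" may still be specified:
legacy_session = {"kind": "pyspark"}

print(json.dumps(shared_session))
print(json.dumps(legacy_session))
```

With the first body, the code kind is chosen per statement instead of per session.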

#### Response Body

The created <a href="#session">Session</a>.
@@ -266,8 +270,17 @@ Runs a statement in a session.
<td>The code to execute</td>
<td>string</td>
</tr>
<tr>
<td>kind</td>
<td>The kind of code to execute<sup><a href="#footnote2">[2]</a></sup></td>
<td><a href="#session-kind">code kind</a></td>
</tr>
</table>

<a id="footnote2">2</a>: If the session kind is not specified, or the submitted code is not of the
kind specified at session creation, this field should be filled with the correct kind.
Otherwise Livy will use the kind specified at session creation as the default code kind.
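The two cases in the footnote can be sketched as statement request bodies for the `/sessions/{sessionId}/statements` endpoint (the code snippets here are illustrative assumptions):

```python
import json

# Sketch of request bodies for POST /sessions/{sessionId}/statements.
# In a session created without a kind, each statement names its own kind:
stmt_with_kind = {"code": "1 + 1", "kind": "pyspark"}
# In a session created with a kind, "kind" may be omitted and the
# session's default kind is used:
stmt_default = {"code": "1 + 1"}

print(json.dumps(stmt_with_kind))
print(json.dumps(stmt_default))
```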

#### Response Body

The <a href="#statement">statement</a> object.
@@ -610,34 +623,36 @@ A session represents an interactive shell.
</tr>
<tr>
<td><a href="#pyspark">pyspark</a></td>
<td>Interactive Python 2 Spark session</td>
</tr>
<tr>
<td><a href="#pyspark3">pyspark3</a></td>
<td>Interactive Python 3 Spark session</td>
<td>Interactive Python Spark session</td>
</tr>
<tr>
<td>sparkr</td>
<td>Interactive R Spark session</td>
</tr>
</table>

Starting with version 0.5.0-incubating, each session can support all three interpreters: Scala,
Python and R. The ``kind`` field in session creation is no longer required; instead, users should
specify the code kind (spark, pyspark or sparkr) during statement submission.

To be compatible with previous versions, users can still specify ``kind`` in session creation
and omit ``kind`` in statement submission. Livy will then use the session
``kind`` as the default kind for all submitted statements.

If users want to submit code of a kind other than the default ``kind`` specified in session
creation, they need to specify the code kind (spark, pyspark or sparkr) during statement submission.
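The shared-context behavior described above can be sketched as a sequence of statements of different kinds submitted to one session (the code snippets themselves are illustrative assumptions):

```python
import json

# Sketch: three statements of different kinds submitted to a single
# shared-context session via POST /sessions/{sessionId}/statements.
statements = [
    {"code": "spark.range(10).count()", "kind": "spark"},
    {"code": "spark.range(10).count()", "kind": "pyspark"},
    {"code": "df <- as.DataFrame(faithful); count(df)", "kind": "sparkr"},
]
for stmt in statements:
    print(json.dumps(stmt))
```

All three run against the same Spark context, which is the point of the shared-session change.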

#### pyspark

To change the Python executable the session uses, Livy reads the path from the environment
variable ``PYSPARK_PYTHON`` (same as pyspark).

Like pyspark, if Livy is running in ``local`` mode, just set the environment variable.
If the session is running in ``yarn-cluster`` mode, please set
``spark.yarn.appMasterEnv.PYSPARK_PYTHON`` in SparkConf so the environment variable is passed to
the driver.
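One way to pass that configuration is in the session-creation body itself; a minimal sketch (the interpreter path is an assumption — point it at your own Python 3):

```python
import json

# Sketch of a POST /sessions body that sets PYSPARK_PYTHON for the driver
# in yarn-cluster mode via SparkConf.
body = {
    "kind": "pyspark",
    "conf": {"spark.yarn.appMasterEnv.PYSPARK_PYTHON": "/usr/bin/python3"},
}
print(json.dumps(body))
```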

#### pyspark3
To change the Python executable the session uses, Livy reads the path from environment variable
``PYSPARK3_PYTHON``.
Starting with version 0.5.0-incubating, the session kind "pyspark3" is removed; instead, users
need to set ``PYSPARK_PYTHON`` to a Python 3 executable.

Like pyspark, if Livy is running in ``local`` mode, just set the environment variable.
If the session is running in ``yarn-cluster`` mode, please set
``spark.yarn.appMasterEnv.PYSPARK3_PYTHON`` in SparkConf so the environment variable is passed to
``spark.yarn.appMasterEnv.PYSPARK_PYTHON`` in SparkConf so the environment variable is passed to
the driver.

### Statement
