Merge pull request #7 from IraGR/sklearn-docs-edited
Sklearn docs edited
mignev authored Jul 23, 2021
2 parents c0d00a1 + ec2bf48 commit cb2a468
Showing 3 changed files with 19 additions and 18 deletions.
12 changes: 5 additions & 7 deletions deployment/ludwig.md
@@ -83,17 +83,15 @@ Designed with simplicity in mind, TeachableHub provides you with free-of-charge
<a id="how-to-deploy-advanced-deployment-guide"></a>
## Advanced Deployment Guide

The Deployment process in TeachableHub is quite automated and seamless, but it also offers many additional options that help you and your team keep a neat workflow and speak the same language.

### Schema & Features Validation (optional)

TeachableHub has built-in mechanisms for **`Schema and Features Validation` that are automatically derived from your Ludwig model's `input_features` and `output_features`**. Still, you can further fine-tune those in accordance with your specific case.

For Data Scientists, working with ndarrays and numbers is business as usual. For Back-end or Front-end engineers, on the other hand, working with JSON data and objects is the everyday job. Not getting lost in translation is essential, and the schema configuration is the connection between these two worlds. You'll surely become your Team's SuperHero right away! 😉

The Deployment Schema enables the TeachableHub Serving API to accept human-readable features. Furthermore, it ensures those features will be validated before they are sent to the model. The cool thing is that it eliminates involuntary mistakes and errors while you work with your Teachable Predict API, and it generates better documentation with no effort, free of charge.
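
To make the difference between the two worlds concrete, here is a small illustrative sketch; the feature names and values are made up and are not part of any real Teachable:

```python
import numpy as np

# What a Data Scientist typically feeds the model directly:
# a positional ndarray row where nothing explains what each column means.
raw_row = np.array([[34.0, 1.0, 72000.0]])

# What the Serving API can accept once a Deployment Schema is in place:
# named, human-readable features (hypothetical names) that are validated
# before they ever reach the model.
readable_features = {
    "age": 34,
    "is_subscriber": True,
    "yearly_income": 72000.0,
}
```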

#### Structure

@@ -213,7 +211,7 @@ Congrats, you're almost done! 😊 This is the final stop of the required deploy
> `description` - The description can be used as a changelog when needed. It should contain valuable clues about what's new in this model deployment. This can be very helpful for the engineers who integrate the Serving APIs or maintain the platforms using this Teachable. You can change the `description` from the TeachableHub UI as well.
> `activate` - When set to `true`, this option automatically sets the newly deployed model as the latest version in the environment to which it's deployed. Keep in mind that it might be dangerous to do this directly in the production environment. However, it's entirely okay for experimentation or staging environments.
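
As a rough sketch, assuming `deployment` is the already configured deployment object used elsewhere in this guide and that these options are passed as keyword arguments, the two could be combined like this:

```python
# Sketch only: `deployment` is assumed to be the configured TeachableHub
# deployment object from the earlier steps of this guide.
deployment.deploy(
    description="Retrained on the July dataset; tuned the learning rate.",
    # activate=True promotes this version immediately in the target
    # environment; fine for staging, risky straight in production.
    activate=True,
)
```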

8 changes: 4 additions & 4 deletions deployment/sklearn.md
@@ -18,7 +18,7 @@ pip install teachablehub

### 2. Setup Deployment Keys

To deploy any model to your Teachable, you'll need a Deployment Key. Deployment Keys can be restricted to a particular environment and to a specific period of time. Such constraints ensure better security for your project. It's quite a useful feature when only a few members of your team are responsible for deployments in production and need to have that permission, or when you have a colleague working on the project for only a month.

You can create a new Deployment Key from Settings -> Deploy Keys -> Add Key, where you'll need to supply a Key Name and the Environment for which the key will be valid. Best practice suggests giving each key a descriptive name; good examples include: `production`, `staging`, `backend-team`, `ds-team`, `john-dev`, `jane-staging`, etc.
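
A minimal sketch of plugging a Deployment Key into the Python SDK is shown below; the import path and constructor argument names are assumptions for illustration, so double-check them against the SDK reference:

```python
import os

# Assumed import path and argument names; verify against the SDK reference.
from teachablehub.deployments.sklearn import TeachableDeployment

deployment = TeachableDeployment(
    teachable="your-user/your-teachable",    # handle of your Teachable
    environment="staging",                   # environment the key was issued for
    deploy_key=os.environ["TH_DEPLOY_KEY"],  # keep keys out of source control
)
```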

@@ -184,7 +184,7 @@ deployment.schema({

### `help`

**TeachableHub uses this rule to automatically generate a Docs section that explains what the feature is about.** This is very helpful when your backend team or a 3rd-party integrator consumes the API, because they will know exactly what data they should provide to the Teachable Predict API.

```python
deployment.schema({
@@ -260,7 +260,7 @@ deployment.deploy(
<a id="how-to-deploy-ci-cd-automation-helpers"></a>
## CI/CD Automations Helpers

Below you may find a couple of useful functions that can assist in automating your model deployment in your CI/CD systems.

### `.successful()`

@@ -273,7 +273,7 @@ if deployment.successful():

### `.verified(reload=False)`

After every deployment, based on your `.samples({...})` explained above, TeachableHub verifies whether or not the deployment is configured correctly and working as expected.

The `reload` option retrieves the latest deployment state from the TeachableHub platform.
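
A minimal CI gate built on these two helpers could look like the sketch below, assuming `deployment` is the object you called `.deploy(...)` on earlier in the pipeline:

```python
import sys

# Fail the CI job if the deployment itself did not go through.
if not deployment.successful():
    print("Deployment was not successful.")
    sys.exit(1)

# Re-fetch the state from TeachableHub and fail if the sample-based
# verification did not pass.
if not deployment.verified(reload=True):
    print("Deployment did not pass verification.")
    sys.exit(1)

print("Deployment is verified and ready to be used.")
```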

17 changes: 10 additions & 7 deletions serving/docs.md
@@ -1,15 +1,17 @@
# How to make Predictions

<a id="how-to-predict-getting-started"></a>

## Getting Started
Teachables are powerful machine learning models deployed as an API, entirely documented, and available to be consumed by any server-side or client-side application. Teachables are easily integrated with any platform via the TeachableHub REST API or the Python SDK.

### 1. Setup Serving Keys

Serving Keys are the authorization you use to control who can make predictions and against which environment of your Teachable. Serving Keys can be issued for a particular environment and for a specific period of time. Such constraints ensure better security for your project. It's quite a useful feature when, for example, you have a colleague working on the project for only a month.

You can create a new Serving Key from Settings -> Serving Keys -> Add Key, where you'll need to supply a Key Name and the Environment for which the key will be valid. Best practice suggests giving each key a descriptive name; good examples include: `production`, `staging`, `rails-backoffice`, `ios-app-production`, etc.

{{button: { to: "/{{handler}}/{{teachable}}/settings/serving-keys/new", type: "primary", size: "medium", title: "Create a new Serving Key" } }}
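
Once a Serving Key exists, it is passed to the SDK client shown in the examples below; the import path and exact argument names in this sketch are assumptions and should be checked against the SDK reference:

```python
import os

# Assumed import path and argument names; see the SDK reference.
from teachablehub.clients import TeachableHubPredictAPI

teachable = TeachableHubPredictAPI(
    teachable="your-user/your-teachable",
    environment="production",
    serving_key=os.environ["TH_SERVING_KEY"],  # issued from Settings -> Serving Keys
)
```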

@@ -73,7 +75,7 @@ The Serving process is quite straightforward and seamless. However, TeachableHub

## Specific Version Predictions

This is useful when you want to test with an older version of the model, or when you want to improve security and have more control over what will be used in the software into which this Teachable is integrated. It gives you the option to get predictions from a specific deployment version.

```python
teachable = TeachableHubPredictAPI(
@@ -87,7 +89,7 @@ teachable = TeachableHubPredictAPI(

## Latest Version Predictions

This is the way to go when you always want to use the latest and greatest version. Don't specify any version and your Teachable will always return predictions from the latest model activated in the given environment. Such an approach is recommended and very helpful for automating your deployment pipeline.

```python
teachable = TeachableHubPredictAPI(
@@ -167,6 +169,7 @@ The SDK raises the following exceptions:
| `UnsuccessfulRequestError` | General Serving API errors 4xx and 5xx |
| `UnauthorizedError` | Wrong Serving Key or configuration |
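
For illustration, a prediction call could be wrapped like the sketch below; the import path and the `predict` method name are assumptions, while the exception names come from the table above:

```python
# `teachable` is assumed to be a configured TeachableHubPredictAPI client;
# the import path and the `predict` call are illustrative assumptions.
from teachablehub.clients import UnauthorizedError, UnsuccessfulRequestError

try:
    predictions = teachable.predict([{"age": 34, "yearly_income": 72000.0}])
except UnauthorizedError:
    print("Check the Serving Key and the environment it was issued for.")
except UnsuccessfulRequestError as error:
    print(f"The Serving API rejected the request: {error}")
```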


<a id="how-to-predict-rest-api"></a>

# REST API
@@ -193,7 +196,7 @@ For each prediction request towards the Rest API, you can pass the following Que
| :--- | :--- | :--- | :--- | :--- |
| environment | `string` | `your-teachable-env` | `production` | Specifies towards which deployment environment the prediction will be made. |
| version | `int` | **min**: `0` **max** `2000` | `0` | Specifies towards which deployment version the prediction will be made. |
| order | `string` | `desc` or `asc` | `desc` | Sorts the results when working with multiple classes. Works only when you've defined the classes in the Teachable Deployment. |
| limit | `int` | **min**: `0` **max** `2000` | `-1` | Limits the number of classes that will be returned. Works only when you've defined the classes in the Teachable Deployment. |
| threshold | `float` | **min**: `0.0` **max** `1.0` | `0.0` | Returns all classes that the model classifies with confidence above the given threshold. |
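
As an illustration of how these parameters might be combined in a raw HTTP call, here is a hedged sketch using Python's `requests`; the endpoint URL, the auth header name, and the payload shape are placeholders, so copy the real ones from your Teachable's REST API documentation:

```python
import requests

response = requests.post(
    # Placeholder URL; use the endpoint shown for your own Teachable.
    "https://serve.teachablehub.com/v1/your-user/your-teachable/predict",
    params={
        "environment": "production",  # which deployment environment to hit
        "version": 3,                 # pin a specific deployment version
        "order": "desc",              # sort returned classes by confidence
        "limit": 5,                   # return at most 5 classes
        "threshold": 0.2,             # drop classes below 20% confidence
    },
    headers={"X-Serving-Key": "your-serving-key"},  # placeholder header name
    json={"features": [{"age": 34, "yearly_income": 72000.0}]},  # placeholder payload
)
print(response.json())
```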

