Update docs/docstrings following mlstacks repo name change (zenml-io#1754)

* update docs following mlstacks repo name change

* Update docs/book/stacks-and-components/stack-deployment/contribute-flavors-or-components.md

Co-authored-by: Andrei Vishniakov <[email protected]>

---------

Co-authored-by: Andrei Vishniakov <[email protected]>
strickvl and avishniakov authored Aug 22, 2023
1 parent e936996 commit c397985
Showing 12 changed files with 20 additions and 20 deletions.
@@ -13,7 +13,7 @@ Before we begin, it will help to understand the [architecture](zenml-self-hosted
If you don't have an existing Kubernetes cluster, you have the following two options to set it up:

* Creating it manually using the documentation for your cloud provider. For convenience, here are links for [AWS](https://docs.aws.amazon.com/eks/latest/userguide/create-cluster.html), [Azure](https://learn.microsoft.com/en-us/azure/aks/learn/quick-kubernetes-deploy-portal?tabs=azure-cli), and [GCP](https://cloud.google.com/kubernetes-engine/docs/how-to/creating-a-zonal-cluster#before\_you\_begin).
-* Using a [stack recipe](../../stacks-and-components/stack-deployment/deploy-a-stack-using-stack-recipes.md) that sets up a cluster along with other tools that you might need in your cloud stack like artifact stores and secret managers. Take a look at all [available stack recipes](https://github.com/zenml-io/mlops-stacks#-list-of-recipes) to see if there's something that works for you.
+* Using a [stack recipe](../../stacks-and-components/stack-deployment/deploy-a-stack-using-stack-recipes.md) that sets up a cluster along with other tools that you might need in your cloud stack like artifact stores and secret managers. Take a look at all [available stack recipes](https://github.com/zenml-io/mlstacks#-list-of-recipes) to see if there's something that works for you.

{% hint style="warning" %}
Once you have created your cluster, make sure that you configure your [kubectl](https://kubernetes.io/docs/tasks/tools/#kubectl) client to talk to it.
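A minimal sketch of what that configuration can look like, assuming a managed cluster and placeholder names (swap in your own cluster, region, and resource group):

```bash
# AWS EKS: merge the cluster credentials into your local kubeconfig
aws eks update-kubeconfig --region eu-west-1 --name my-zenml-cluster

# GCP GKE: fetch credentials for a zonal cluster
gcloud container clusters get-credentials my-zenml-cluster --zone europe-west3-a

# Azure AKS: fetch credentials for the cluster
az aks get-credentials --resource-group my-rg --name my-zenml-cluster

# Confirm kubectl is pointing at the right cluster
kubectl config current-context
kubectl get nodes
```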
@@ -83,7 +83,7 @@ section.
{% hint style="info" %}
Configuring Seldon Core in a Kubernetes cluster can be a complex and error-prone process, so we have provided a set of
Terraform-based recipes to quickly provision popular combinations of MLOps tools. More information about these recipes
-can be found in the [Open Source MLOps Stack Recipes](https://github.com/zenml-io/mlops-stacks).
+can be found in the [Open Source MLOps Stack Recipes](https://github.com/zenml-io/mlstacks).
{% endhint %}

### Infrastructure Deployment
@@ -30,7 +30,7 @@ setup is necessary.

There are many options to use a deployed Airflow server:

-* Use one of [ZenML's Airflow stack recipes](https://github.com/zenml-io/mlops-stacks). This is the simplest solution to
+* Use one of [ZenML's Airflow stack recipes](https://github.com/zenml-io/mlstacks). This is the simplest solution to
get ZenML working with Airflow, as the recipe also takes care of additional steps such as installing required Python
dependencies in your Airflow server environment.
* Use a managed deployment of Airflow such as [Google Cloud Composer](https://cloud.google.com/composer)
@@ -154,7 +154,7 @@ When using the Kubeflow orchestrator locally, you'll additionally need:

To run the pipeline on a local Kubeflow Pipelines deployment, you can use the ZenML Stack recipes to spin up a local
Kubernetes cluster and install Kubeflow Pipelines on it. The stack recipe is called `k3d-modular` and is available in
-the ZenML [stack recipe repository](https://github.com/zenml-io/mlops-stacks/tree/main/k3d-modular). The recipe is
+the ZenML [stack recipe repository](https://github.com/zenml-io/mlstacks/tree/main/k3d-modular). The recipe is
modular, meaning that you can configure it to use different orchestrators, Model Deployers, and other tools.

To deploy the stack, run the following commands:
@@ -175,7 +175,7 @@ zenml stack set <STACK_NAME>
kubectl get ingress -n kubeflow -o jsonpath='{.items[0].spec.rules[0].host}'
```

-You can read more about the recipes in the [ZenML Stack Recipe Repository](https://github.com/zenml-io/mlops-stacks/tree/main/k3d-modular).
+You can read more about the recipes in the [ZenML Stack Recipe Repository](https://github.com/zenml-io/mlstacks/tree/main/k3d-modular).

{% hint style="warning" %}
The local Kubeflow Pipelines deployment requires more than 4 GB of RAM, and 30 GB of disk space, so if you are using
@@ -43,7 +43,7 @@ The only other thing necessary to use the ZenML Sagemaker orchestrator is enabli
particular role.

In order to quickly enable APIs and create other resources necessary to use this integration, we will soon provide
-a Sagemaker stack recipe via [our `mlops-stacks` recipe repository](https://github.com/zenml-io/mlops-stacks), which
+a Sagemaker stack recipe via [our `mlstacks` recipe repository](https://github.com/zenml-io/mlstacks), which
will help you set up the infrastructure with one click.
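In the meantime, here is a hedged sketch of granting that access by hand (the role name is a placeholder and the broad managed policy is only for illustration; scope permissions down for production use):

```bash
# Attach the AWS-managed SageMaker policy to the execution role used by ZenML
aws iam attach-role-policy \
  --role-name zenml-sagemaker-execution-role \
  --policy-arn arn:aws:iam::aws:policy/AmazonSageMakerFullAccess
```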

### Infrastructure Deployment
@@ -38,7 +38,7 @@ The only other thing necessary to use the ZenML Vertex orchestrator is enabling
project.

In order to quickly enable APIs, and create other resources necessary for using this integration, you can also consider
-using the [Vertex AI stack recipe](https://github.com/zenml-io/mlops-stacks/tree/main/vertex-ai), which helps you set up
+using the [Vertex AI stack recipe](https://github.com/zenml-io/mlstacks/tree/main/vertex-ai), which helps you set up
the infrastructure with one click.
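If you prefer to enable the APIs manually, this is roughly the `gcloud` invocation (the project ID is a placeholder, and the exact set of services you need may vary):

```bash
# Enable the Vertex AI and Cloud Storage APIs in the target GCP project
gcloud services enable aiplatform.googleapis.com storage.googleapis.com \
  --project my-gcp-project
```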

## How to use it
@@ -6,11 +6,11 @@ description: Creating your custom stack component solutions.

Before you can start contributing, it is important to know that the stack component deploy CLI also makes use of the stack recipes (specifically, the modular stack recipes) in the background. Thus, contributing a new deployment option for both of the deployment methods that we have seen involves making a contribution to the base recipe.

-You can refer to the [`CONTRIBUTING.md`](https://github.com/zenml-io/mlops-stacks/blob/main/CONTRIBUTING.md) on the `mlops-stack` repo to get an overview of how all recipes are designed in general but here are instructions for contributing to the modular recipes specifically.
+You can refer to the [`CONTRIBUTING.md`](https://github.com/zenml-io/mlstacks/blob/main/CONTRIBUTING.md) on the `mlstacks` repo to get an overview of how all recipes are designed in general but here are instructions for contributing to the modular recipes specifically.

## Adding new MLOps tools

-* Clone the [`mlops-stacks`](https://github.com/zenml-io/mlops-stacks) repo and create a branch off `develop`.
+* Clone the [`mlstacks`](https://github.com/zenml-io/mlstacks) repo and create a branch off `develop`.
* Every file inside the modular recipes represents a tool and all code pertaining to the deployment of it resides there. Create a new file with a name that reflects what would be deployed.
* Populate this file with all the terraform code that is relevant to the component. Make sure any dependencies are also included in the same file.
* Add any local values that you might need here to the `locals.tf` file.
@@ -47,7 +47,7 @@ The command currently uses your local credentials for GCP and AWS to provision r

<summary>Want to know what happens in the background?</summary>

-The stack component deploy CLI is powered by ZenML's [Stack Recipes](https://github.com/zenml-io/mlops-stacks) in the background, more specifically the [new modular recipes](https://github.com/zenml-io/mlops-stacks/releases/tag/0.6.0). These allow you to configure and deploy select stack components as opposed to deploying the full stack, as with the legacy stack recipes.
+The stack component deploy CLI is powered by ZenML's [Stack Recipes](https://github.com/zenml-io/mlstacks) in the background, more specifically the [new modular recipes](https://github.com/zenml-io/mlstacks/releases/tag/0.6.0). These allow you to configure and deploy select stack components as opposed to deploying the full stack, as with the legacy stack recipes.

Using the values you pass for the cloud, the CLI picks up the right modular recipe to use (one of AWS, GCP, or k3d) and then deploys that recipe with the specific stack component enabled.

@@ -71,7 +71,7 @@ Whenever you pass in a flavor to any stack-component deploy function, the combin
enable_<STACK_COMPONENT>_<FLAVOR>
```

-This variable is then passed as input to the underlying modular recipe. If you check the [`variables.tf`](https://github.com/zenml-io/mlops-stacks/blob/main/gcp-modular/variables.tf) file for a given recipe, you can find all the supported flavor-stack component combinations there.
+This variable is then passed as input to the underlying modular recipe. If you check the [`variables.tf`](https://github.com/zenml-io/mlstacks/blob/main/gcp-modular/variables.tf) file for a given recipe, you can find all the supported flavor-stack component combinations there.

</details>
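As a rough sketch of how that naming convention plays out (the command shape and flags below are assumptions based on the description above, not an exact transcript; check `zenml orchestrator deploy --help` on your ZenML version):

```bash
# Deploying a Kubeflow orchestrator on GCP would, per the convention above,
# translate into enable_orchestrator_kubeflow=true on the gcp-modular recipe
zenml orchestrator deploy my_kubeflow --flavor=kubeflow --cloud=gcp
```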

@@ -87,7 +87,7 @@ With simplicity, we didn't want to compromise on the flexibility that this deplo

The flags that you pass to the deploy CLI are passed on as-is to the backing modular recipes as input variables. This means that all the flags need to be defined as variables in the respective recipe.

-For example, if you take a look at the [`variables.tf`](https://github.com/zenml-io/mlops-stacks/blob/main/gcp-modular/variables.tf) file for a modular recipe, like the `gcp-modular` recipe, you can find variables like `mlflow_bucket` that correspond to the `--mlflow-bucket` flag that can be passed to the experiment tracker's deploy CLI.
+For example, if you take a look at the [`variables.tf`](https://github.com/zenml-io/mlstacks/blob/main/gcp-modular/variables.tf) file for a modular recipe, like the `gcp-modular` recipe, you can find variables like `mlflow_bucket` that correspond to the `--mlflow-bucket` flag that can be passed to the experiment tracker's deploy CLI.

Validation for these flags does not exist yet at the CLI level, so you must be careful in naming them while calling `deploy`.
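A hedged example of that pass-through (the command shape and bucket name are illustrative assumptions; only flags with a matching entry in the recipe's `variables.tf` will have any effect):

```bash
# --mlflow-bucket is forwarded as-is to the recipe's mlflow_bucket variable
zenml experiment-tracker deploy my_tracker \
  --flavor=mlflow \
  --cloud=gcp \
  --mlflow-bucket=gs://my-mlflow-artifacts
```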

@@ -6,7 +6,7 @@ description: Deploying an entire stack with the ZenML Stack Recipes.

A Stack Recipe is a collection of carefully crafted Terraform modules and resources which, when executed, creates a range of stack components that can be used to run your pipelines. Each recipe is designed to offer a great deal of flexibility in configuring the resources while preserving the ease of application through the use of sensible defaults.

-Check out the full list of available recipes at the [mlops-stacks repository](https://github.com/zenml-io/mlops-stacks#-list-of-recipes).
+Check out the full list of available recipes at the [mlstacks repository](https://github.com/zenml-io/mlstacks#-list-of-recipes).

## When should I use the stack recipes?

@@ -31,7 +31,7 @@ We recommend the use of _modular_ recipes going forward if you're deploying on G

### Deploying a recipe 🚀

-Detailed steps are available [in the README](https://github.com/zenml-io/mlops-stacks#-list-of-recipes) of the respective stack recipes but here's what a simple flow could look like:
+Detailed steps are available [in the README](https://github.com/zenml-io/mlstacks#-list-of-recipes) of the respective stack recipes but here's what a simple flow could look like:

1. 📃 List all available recipes in the repository.
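A guess at what that first step looks like on the CLI, assuming the `zenml stack recipe list` subcommand shipped with this ZenML release:

```bash
zenml stack recipe list
```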

@@ -74,7 +74,7 @@ zenml stack import <STACK_NAME> -f <PATH_TO_THE_CREATED_STACK_CONFIG_YAML>
<summary>Want more details on how this works internally?</summary>
-The stack recipe CLI interacts with the [mlops-stacks](https://github.com/zenml-io/mlops-stacks) repository to fetch the recipes and stores them locally in the **Global Config** directory. From here, they are pulled to your local directory or whatever directory you specify in the `--path` flag for the CLI.
+The stack recipe CLI interacts with the [mlstacks](https://github.com/zenml-io/mlstacks) repository to fetch the recipes and stores them locally in the **Global Config** directory. From here, they are pulled to your local directory or whatever directory you specify in the `--path` flag for the CLI.
This is what you see and where you can make any changes you want to the recipe files. You can also use native terraform commands like `terraform apply` to deploy components but this would require you to pass the variables manually using the `-var-file` flag to the terraform CLI.
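For the native Terraform route mentioned above, the invocation would look roughly like this (the recipe directory and variables file are placeholders):

```bash
# Run from inside the pulled recipe directory, e.g. ./zenml_stack_recipes/gcp-modular
terraform init
terraform apply -var-file="values.tfvars"
```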
2 changes: 1 addition & 1 deletion src/zenml/cli/__init__.py
@@ -1397,7 +1397,7 @@ def my_pipeline(...):
a few commands. Each recipe uses Terraform modules under the hood and once
executed can set up a ZenML stack, ready to run your pipelines!
-A number of stack recipes are already available at [the `mlops-stacks` repository](https://github.com/zenml-io/mlops-stacks/). List them
+A number of stack recipes are already available at [the `mlstacks` repository](https://github.com/zenml-io/mlstacks/). List them
using the following command:
```bash
4 changes: 2 additions & 2 deletions src/zenml/cli/stack_recipes.py
@@ -194,12 +194,12 @@ def describe(
cli_utils.declare(metadata["Description"])


-@stack_recipe.command(help="The active version of the mlops-stacks repository")
+@stack_recipe.command(help="The active version of the mlstacks repository")
@pass_git_stack_recipes_handler
def version(
git_stack_recipes_handler: GitStackRecipesHandler,
) -> None:
"""The active version of the mlops-stacks repository.
"""The active version of the mlstacks repository.
Args:
git_stack_recipes_handler: The GitStackRecipesHandler instance.
4 changes: 2 additions & 2 deletions src/zenml/recipes/stack_recipe_service.py
@@ -45,7 +45,7 @@
ZENML_VERSION_VARIABLE = "zenml-version"

EXCLUDED_RECIPE_DIRS = [""]
-STACK_RECIPES_GITHUB_REPO = "https://github.com/zenml-io/mlops-stacks.git"
+STACK_RECIPES_GITHUB_REPO = "https://github.com/zenml-io/mlstacks.git"
STACK_RECIPES_REPO_DIR = "zenml_stack_recipes"


@@ -439,7 +439,7 @@ def clean_current_stack_recipes() -> None:
shutil.rmtree(stack_recipes_directory)

def get_active_version(self) -> Optional[str]:
"""Returns the active version of the mlops-stacks repository.
"""Returns the active version of the mlstacks repository.
Returns:
The active version of the repository.
