Example deploy scripts to run Arc jobs on ephemeral cloud compute.

Deploy

Kubernetes

These examples show how to run Arc against a Kubernetes cluster:

  • spark-on-k8s-operator for running jobs on Kubernetes using the spark-on-k8s-operator (a submission sketch follows this list).
  • kubernetes-argo for running on Kubernetes and deploying jobs with Argo workflows. Argo allows dependencies to be defined between jobs, which is useful for complex workflows or for jobs that require different service-account permissions to execute.
  • jupyterhub-for-kubernetes for running JupyterHub on Kubernetes so that end users can start their own JupyterLab instances to build Arc jobs.
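
As a rough sketch, a job packaged as a SparkApplication manifest can be submitted to the spark-on-k8s-operator with kubectl, and a multi-step workflow with the argo CLI. The file names below are placeholders, not files from this repository:

# submit a job defined as a SparkApplication custom resource (placeholder file name)
kubectl apply -f arc-job.yaml

# check the status reported by the operator
kubectl get sparkapplications

# submit an Argo workflow and watch it run (placeholder file name)
argo submit arc-workflow.yaml --watch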

AWS

These are example Terraform scripts that demonstrate how to execute Arc against a remote cluster.
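
A typical run of one of these examples follows the standard Terraform workflow; the directory name below is a placeholder for the example being deployed:

cd <example-directory>   # placeholder directory name
terraform init           # download the required providers
terraform plan           # review the EC2 resources that will be created
terraform apply          # create the ephemeral cluster
terraform destroy        # tear everything down once the job has finished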

Both examples assume the default security group allows SSH access to your EC2 instances. There are sample user-data-* scripts in the ./templates directory which contain commands for mounting the local SSD of certain instance types to /data. If these are used, the following flags should be added to the docker run commands:

-v /data/local:/local \
-e "SPARK_LOCAL_DIRS=/local" \
-e "SPARK_WORKER_DIR=/local" \
