- Databricks notebooks to perform MLflow export and import operations.
- Use these notebooks when you want to copy MLflow objects from one Databricks workspace (tracking server) to another.
- To copy MLflow objects between workspaces, you will need to set up a shared cloud bucket mounted on each workspace's DBFS (a minimal mount sketch follows this list).
- The notebooks use Git integration with Databricks Repos, though they can also be run from a plain (non-Repo) workspace folder.
- See _README.py for more details.
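The shared-bucket prerequisite above can be satisfied by mounting the same bucket on each workspace's DBFS. The sketch below is only illustrative: the bucket name and mount point are placeholders, and real setups usually need credentials passed via `extra_configs` (instance profile, service principal, or access keys), which are omitted here.

```python
# Minimal sketch: mount a shared S3 bucket on this workspace's DBFS.
# "my-mlflow-export-bucket" and the mount point are placeholders; most real
# setups also need credentials via the extra_configs argument (omitted here).
mount_point = "/mnt/mlflow-export-import"

# dbutils is available inside a Databricks notebook; skip if already mounted.
if not any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
    dbutils.fs.mount(
        source="s3a://my-mlflow-export-bucket",
        mount_point=mount_point,
    )

# The export/import notebooks can then use /dbfs/mnt/mlflow-export-import
# as a common output_dir / input_dir on both workspaces.
```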
There are two types of notebooks:
- Standard widget-based notebooks that call the MLflow Export Import API.
- Console script notebooks that use the shell to call the standard Python scripts described here. Slightly experimental.
Export and import one MLflow object (a minimal API sketch follows the table).

| Export | Import |
|---|---|
| Export_Run | Import_Run |
| Export_Experiment | Import_Experiment |
| Export_Model | Import_Model |
| Export_Model_Version | Import_Model_Version |
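For example, a widget-based notebook such as Export_Experiment is essentially a thin wrapper around the MLflow Export Import Python API. The sketch below is an assumption about that API (module paths and parameter names vary between package versions), so verify them against the installed package before relying on it:

```python
# Sketch only: module paths and parameter names are assumptions about the
# mlflow-export-import API and may differ in your installed version.
from mlflow_export_import.experiment.export_experiment import export_experiment
from mlflow_export_import.experiment.import_experiment import import_experiment

shared_dir = "/dbfs/mnt/mlflow-export-import/My_Experiment"  # shared bucket path

# Source workspace: export one experiment to the shared mounted bucket.
export_experiment(
    experiment_id_or_name="My_Experiment",
    output_dir=shared_dir,
)

# Destination workspace: import it from the same shared location.
import_experiment(
    experiment_name="My_Experiment_imported",
    input_dir=shared_dir,
)
```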
Copy an MLflow object.

| Notebook |
|---|
| Copy_Model_Version |
| Copy_Run |
Export and import multiple MLflow objects (see the sketch after the table).

| Export | Import |
|---|---|
| Export_Experiments | Import_Experiments |
| Export_Models | Import_Models |
| Export_All | Use Import_Models |
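The bulk notebooks follow the same pattern for many objects at once. As with the sketch above, the module paths and parameter names below are assumptions about the package's bulk API and should be checked against your installed version:

```python
# Sketch only: assumed bulk API of mlflow-export-import; verify the module
# paths and parameter names against your installed version.
from mlflow_export_import.bulk.export_all import export_all
from mlflow_export_import.bulk.import_models import import_models

shared_dir = "/dbfs/mnt/mlflow-export-import/all"  # shared bucket path

# Source workspace: export all experiments, runs, and registered models.
export_all(output_dir=shared_dir)

# Destination workspace: import the Export_All output with the models
# importer, as noted in the table above.
import_models(input_dir=shared_dir)
```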
Experimental. Using Databricks %sh cell mode, you can execute MLflow Export Import scripts from the Linux shell.
See the _README.py and Console_Scripts notebooks.
From a notebook you can then call a script such as:
```
export-model --help
```
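If you'd rather stay in Python than use a %sh cell, the same console scripts can also be invoked through the standard library's subprocess module; this is just an alternative calling convention, not something the provided notebooks do. Only `export-model --help` is taken from the text above; check each script's `--help` output for its actual options.

```python
# Alternative to a %sh cell: invoke the installed console script from Python.
import subprocess

result = subprocess.run(
    ["export-model", "--help"],  # script name taken from the example above
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)
```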
You can load these notebooks into Databricks either as a workspace folder or a Git Repo.
See the Workspace CLI.
```
git clone https://github.com/mlflow/mlflow-export-import

databricks workspace import_dir \
  databricks_notebooks \
  /Users/[email protected]/mlflow-export-import
```
You can load a Git Repo either through the Databricks UI or via the command line.
See Clone a Git Repo & other common Git operations.
Note that it's best to use curl, since the Databricks CLI doesn't appear to support the sparse checkout option.
```
curl \
  https://my.company.com/api/2.0/repos \
  -H "Authorization: Bearer MY_TOKEN" \
  -X POST \
  -d '{
    "url": "https://github.com/mlflow/mlflow-export-import",
    "provider": "gitHub",
    "path": "/Repos/[email protected]/mlflow-export-import",
    "sparse_checkout": {
      "patterns": [ "databricks_notebooks" ]
    }
  }'
```