You can use the Project Migrate application, which orchestrates the whole process of migrating a KBC project from one KBC stack to another.

Application for restoring a KBC project from a snapshot generated by the keboola.project-backup app (https://github.com/keboola/app-project-backup).

The destination project must be empty! (It must not contain any buckets in Storage or any component configurations.)
- Validates that the destination project is empty
- Creates empty buckets and restores their attributes and metadata
  - Skips linked and system buckets
  - Does not restore bucket sharing settings
  - Uses the storage backend according to the `useDefaultBackend` setting
- Creates component configurations
  - Including configuration rows
  - Including configuration and row state
  - Removes OAuth authorizations from configurations
  - Orchestrations, GoodData Writers and Snowflake database writers are not restored automatically
- Creates empty tables and restores their attributes and metadata
- Imports data into tables
- Creates table aliases
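The first step above, validating that the destination project is empty, boils down to rejecting any project that has Storage buckets or component configurations. A minimal Python sketch of that rule (hypothetical helper name and data shape, not the app's actual code):

```python
def is_project_empty(buckets, components):
    """Return True only when the project has no Storage buckets and no
    component configurations (the precondition for a restore)."""
    # Any component entry carrying a non-empty "configurations" list
    # means the project is not empty.
    has_configurations = any(component.get("configurations") for component in components)
    return not buckets and not has_configurations

print(is_project_empty([], []))                     # empty project: restore allowed
print(is_project_empty([{"id": "in.c-data"}], []))  # has a bucket: restore rejected
```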
Use the parameters generated by the `generate-read-credentials` action of the keboola.project-backup app:

- `backupUri`
- `accessKeyId`
- `#secretAccessKey`
- `#sessionToken`

Optional parameters:

- `useDefaultBackend` (boolean, default `false`) - use the default storage backend; otherwise buckets will be created in the same backend as in the source project.
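The `useDefaultBackend` behavior described above can be summarized as a one-line decision; a hedged Python sketch (illustrative names, not the app's actual code, with `None` standing in for "let the stack pick its default backend"):

```python
def target_backend(source_bucket_backend, use_default_backend=False):
    """Pick the backend for a restored bucket: the project default when
    use_default_backend is True, otherwise the source bucket's backend."""
    return None if use_default_backend else source_bucket_backend

print(target_backend("snowflake"))        # bucket keeps its source backend
print(target_backend("snowflake", True))  # stack default backend is used
```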
Example request for restoring a project in Keboola Connection EU from S3 storage:

```
curl -X POST \
  https://docker-runner.eu-central-1.keboola.com/docker/keboola.project-restore/run \
  -H 'Cache-Control: no-cache' \
  -H 'X-StorageApi-Token: **EU_STORAGE_API_TOKEN**' \
  -d '{
    "configData": {
      "parameters": {
        "s3": {
          "backupUri": "**BACKUP_URI**",
          "accessKeyId": "**AWS_ACCESS_KEY**",
          "#secretAccessKey": "**AWS_SECRET_ACCESS_KEY**",
          "#sessionToken": "**AWS_SESSION_TOKEN**"
        },
        "useDefaultBackend": true
      }
    }
  }'
```
Example request for restoring a project in Keboola Connection EU from Azure Blob Storage:

```
curl -X POST \
  https://docker-runner.eu-central-1.keboola.com/docker/keboola.project-restore/run \
  -H 'Cache-Control: no-cache' \
  -H 'X-StorageApi-Token: **EU_STORAGE_API_TOKEN**' \
  -d '{
    "configData": {
      "parameters": {
        "abs": {
          "container": "**container**",
          "#connectionString": "**connectionString**"
        },
        "useDefaultBackend": true
      }
    }
  }'
```
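The request bodies in both examples share the same shape; only the storage block (`s3` vs `abs`) differs. A hedged Python sketch that assembles such a payload (hypothetical helper name; credential values are placeholders exactly as in the examples above):

```python
import json

def build_restore_payload(storage_key, credentials, use_default_backend=True):
    """Assemble the configData body for the keboola.project-restore run call.

    storage_key is "s3" or "abs"; credentials holds the parameters produced
    by the backup app's generate-read-credentials action.
    """
    return {
        "configData": {
            "parameters": {
                storage_key: credentials,
                "useDefaultBackend": use_default_backend,
            }
        }
    }

s3_payload = build_restore_payload("s3", {
    "backupUri": "**BACKUP_URI**",
    "accessKeyId": "**AWS_ACCESS_KEY**",
    "#secretAccessKey": "**AWS_SECRET_ACCESS_KEY**",
    "#sessionToken": "**AWS_SESSION_TOKEN**",
})
print(json.dumps(s3_payload, indent=2))
```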
You can use prepared applications to migrate Orchestrations, GoodData Writers and Snowflake database writers between projects:

- Project Migration - Orchestrator: keboola.app-orchestrator-migrate app (https://github.com/keboola/app-orchestrator-migrate)
- Project Migration - Snowflake Writer: keboola.app-snowflake-writer-migrate app (https://github.com/keboola/app-snowflake-writer-migrate)
- Clone this repository:

```
git clone https://github.com/keboola/app-project-restore.git
cd app-project-restore
```

- Create AWS services from the CloudFormation template `aws-tests-cf-template.json`. It will create a new S3 bucket and an IAM user in AWS.

- Create a `.env` file and fill in the variables:
  - `TEST_AWS_*` - output of your CloudFormation stack
  - `TEST_STORAGE_API_URL` - KBC Storage API endpoint
  - `TEST_STORAGE_API_TOKEN` - KBC Storage API token
  - `TEST_COMPONENT_ID` - restore app component ID in KBC (`keboola.project-restore`)
  - `TEST_AZURE_ACCOUNT_*` - Storage Account in your Azure subscription
  - `TEST_AZURE_CONTAINER_NAME` - container name where the data is stored
```
TEST_AWS_ACCESS_KEY_ID=
TEST_AWS_SECRET_ACCESS_KEY=
TEST_AWS_REGION=
TEST_AWS_S3_BUCKET=
TEST_COMPONENT_ID=keboola.project-restore
TEST_STORAGE_API_URL=
TEST_STORAGE_API_TOKEN=
TEST_AZURE_ACCOUNT_NAME=
TEST_AZURE_ACCOUNT_KEY=
TEST_AZURE_CONTAINER_NAME=
```
- Build the Docker image:

```
docker-compose build
```

- Run the test suite using this command. Tests will delete all current component configurations and data from the KBC project!

```
docker-compose run --rm dev composer ci
```
For information about deployment and integration with KBC, please refer to the deployment section of the developers documentation.
MIT licensed, see LICENSE file.