
Commit

Incorporate links and fix code blocks.
robinsh committed Dec 14, 2016
2 parents b0e14ec + 7b70736 commit 78abb83
Showing 17 changed files with 45 additions and 41 deletions.
@@ -19,9 +19,9 @@ ms.author: renash
---

# Backing Up Drive Manifests
- Drive manifests can be automatically backed up to blobs by setting the `BackupDriveManifest` property to `true` in the [Put Job](../importexport/Put-Job.md) or [Update Job Properties](../importexport/Update-Job-Properties.md) operations. By default the drive manifests are not backed up. The drive manifest backups are stored as block blobs in a container within the storage account associated with the job. By default, the container name is `waimportexport`, but you can specify a different name in the `ImportExportStatesPath` property when calling the `Put Job` or `Update Job Properties` operations. The backup manifest blob are named in the following format: `waies/jobname_driveid_timestamp_manifest.xml`.
+ Drive manifests can be automatically backed up to blobs by setting the `BackupDriveManifest` property to `true` in the [Put Job](/rest/api/storageservices/importexport/Put-Job) or [Update Job Properties](/rest/api/storageservices/importexport/Update-Job-Properties) operations. By default, the drive manifests are not backed up. The drive manifest backups are stored as block blobs in a container within the storage account associated with the job. By default, the container name is `waimportexport`, but you can specify a different name in the `ImportExportStatesPath` property when calling the `Put Job` or `Update Job Properties` operations. The backup manifest blobs are named in the following format: `waies/jobname_driveid_timestamp_manifest.xml`.
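The naming convention above can be sketched as a small helper. The function itself is hypothetical; only the `waies/jobname_driveid_timestamp_manifest.xml` format comes from the text:

```python
def backup_manifest_blob_name(job_name, drive_id, timestamp):
    # Naming convention from the docs:
    # waies/jobname_driveid_timestamp_manifest.xml
    return f"waies/{job_name}_{drive_id}_{timestamp}_manifest.xml"

print(backup_manifest_blob_name("exportjob1", "WD0123", "20161214T120000Z"))
# waies/exportjob1_WD0123_20161214T120000Z_manifest.xml
```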

- You can retrieve the URI of the backup drive manifests for a job by calling the [Get Job](../importexport/Get-Job3.md) operation. The blob URI is returned in the `ManifestUri` property for each drive.
+ You can retrieve the URI of the backup drive manifests for a job by calling the [Get Job](/rest/api/storageservices/importexport/Get-Job3) operation. The blob URI is returned in the `ManifestUri` property for each drive.

## See Also
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
@@ -21,11 +21,11 @@ ms.author: renash
---

# Cancelling and Deleting Jobs
- You can request that a job be cancelled before it is in the `Packaging` state by calling the [Update Job Properties](../importexport/Update-Job-Properties.md) operation and setting the `CancelRequested` element to `true`. The job will be cancelled on a best-effort basis. If drives are in the process of transferring data, data may continue to be transferred even after cancellation has been requested.
+ You can request that a job be cancelled before it is in the `Packaging` state by calling the [Update Job Properties](/rest/api/storageservices/importexport/Update-Job-Properties) operation and setting the `CancelRequested` element to `true`. The job will be cancelled on a best-effort basis. If drives are in the process of transferring data, data may continue to be transferred even after cancellation has been requested.

A cancelled job will move to the `Completed` state and be kept for 90 days, at which point it will be deleted.
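A minimal sketch of a cancellation request body, assuming a bare `Properties` wrapper. Only the `CancelRequested` element name comes from the text; consult the `Update Job Properties` reference for the real schema:

```python
import xml.etree.ElementTree as ET

# Illustrative Update Job Properties body requesting cancellation.
# The <Properties> envelope is an assumption, not the documented schema.
props = ET.Element("Properties")
ET.SubElement(props, "CancelRequested").text = "true"
body = ET.tostring(props, encoding="unicode")
print(body)
# <Properties><CancelRequested>true</CancelRequested></Properties>
```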

- To delete a job, call the [Delete Job](../importexport/Delete-Job1.md) operation before the job has shipped (*i.e.*, while the job is in the `Creating` state). You can also delete a job when it is in the `Completed` state. After a job has been deleted, its information and status are no longer accessible via the REST API or the Azure Management Portal.
+ To delete a job, call the [Delete Job](/rest/api/storageservices/importexport/Delete-Job1) operation before the job has shipped (*i.e.*, while the job is in the `Creating` state). You can also delete a job when it is in the `Completed` state. After a job has been deleted, its information and status are no longer accessible via the REST API or the Azure Management Portal.

## See Also
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
12 changes: 6 additions & 6 deletions articles/storage/storage-import-export-creating-an-export-job.md
@@ -48,14 +48,14 @@ Creating an export job for the Microsoft Azure Import/Export service using the R

- You can export all blobs and snapshots in the storage account.

- For more information about specifying blobs to export, see the [Put Job](../importexport/Put-Job.md) operation.
+ For more information about specifying blobs to export, see the [Put Job](/rest/api/storageservices/importexport/Put-Job) operation.

## Obtaining Your Shipping Location
- Before creating an export job, you need to obtain a shipping location name and address by calling the [List Locations](../importexport/List-Locations2.md) operation. `List Locations` will return a list of locations and their mailing addresses based on the location of your storage account. You can select a location from the returned list and ship your hard drives to that address. Note that `List Locations` may return only one possible shipping location.
+ Before creating an export job, you need to obtain a shipping location name and address by calling the [List Locations](/rest/api/storageservices/importexport/List-Locations2) operation. `List Locations` will return a list of locations and their mailing addresses based on the location of your storage account. You can select a location from the returned list and ship your hard drives to that address. Note that `List Locations` may return only one possible shipping location.

Follow the steps below to obtain the shipping location:

- - Identify the name of the location of your storage account. This value can be found under the **Location** field on the storage account’s **Dashboard** in the classic portal or queried for by using the service management API operation [Get Storage Account Properties](../fileservices/Get%20Storage%20Account%20Properties1.md).
+ - Identify the name of the location of your storage account. This value can be found under the **Location** field on the storage account’s **Dashboard** in the classic portal, or queried by using the service management API operation [Get Storage Account Properties](/rest/api/storagerp/storageaccounts#StorageAccounts_GetProperties).

- Retrieve a list of locations that are available to process this storage account by calling the `List Locations` operation with the query parameter `originlocation=<location-name>`. The list returned will contain one or more locations to which you can ship your drives.

@@ -65,7 +65,7 @@ Creating an export job for the Microsoft Azure Import/Export service using the R
> For the preview release, `List Locations` will return one location, or none. If no locations are returned, the service is not yet available for your storage account.
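The `originlocation` query described above might be assembled like this. The base URI is a placeholder; only the query parameter name comes from the text:

```python
from urllib.parse import urlencode

# Sketch: build a List Locations request URL for a storage account's region.
# Substitute the real Import/Export service endpoint for your subscription.
def list_locations_url(base_uri, origin_location):
    return f"{base_uri}/locations?{urlencode({'originlocation': origin_location})}"

print(list_locations_url("https://<service-endpoint>", "West US"))
# https://<service-endpoint>/locations?originlocation=West+US
```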
## Creating the Export Job
- To create the export job, call the [Put Job](../importexport/Put-Job.md) operation. You will need to provide the following information:
+ To create the export job, call the [Put Job](/rest/api/storageservices/importexport/Put-Job) operation. You will need to provide the following information:

- A name for the job.

@@ -90,10 +90,10 @@ Creating an export job for the Microsoft Azure Import/Export service using the R
> You must ship your drives via a supported carrier service, which will provide a tracking number for your package.
## Updating the Export Job with Your Package Information
- After you have your tracking number, call the [Update Job Properties](../importexport/Update-Job-Properties.md) operation to updated the carrier name and tracking number for the job. You can optionally specify the number of drives, the return address, and the shipping date as well.
+ After you have your tracking number, call the [Update Job Properties](/rest/api/storageservices/importexport/Update-Job-Properties) operation to update the carrier name and tracking number for the job. You can optionally specify the number of drives, the return address, and the shipping date as well.

## Receiving the Package
- After your export job has been processed, your drives will be returned to you with your encrypted data. You can retrieve the BitLocker key for each of the drives by calling the [Get Job](../importexport/Get-Job3.md) operation. You can then unlock the drive using the key. The drive manifest file on each drive contains the list of files on the drive, as well as the original blob address for each file.
+ After your export job has been processed, your drives will be returned to you with your encrypted data. You can retrieve the BitLocker key for each of the drives by calling the [Get Job](/rest/api/storageservices/importexport/Get-Job3) operation. You can then unlock the drive using the key. The drive manifest file on each drive contains the list of files on the drive, as well as the original blob address for each file.
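A sketch of collecting per-drive BitLocker keys from a `Get Job` response. The sample XML, its nesting, and the key values are illustrative, not a captured response; `Drive`, `DriveId`, and `BitLockerKey` mirror the property names the text mentions:

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment of a Get Job response, for illustration only.
sample = """<DriveList>
  <Drive><DriveId>WD0123</DriveId><BitLockerKey>400000-...</BitLockerKey></Drive>
  <Drive><DriveId>WD0124</DriveId><BitLockerKey>401111-...</BitLockerKey></Drive>
</DriveList>"""

# Map each drive ID to its BitLocker key.
keys = {d.findtext("DriveId"): d.findtext("BitLockerKey")
        for d in ET.fromstring(sample).iter("Drive")}
print(keys["WD0123"])
# 400000-...
```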

## See Also
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
@@ -56,11 +56,11 @@ Creating an import job for the Microsoft Azure Import/Export service using the R
- The action to take if a blob that is being uploaded has the same name as an existing blob in the container. Possible options are: a) overwrite the blob with the file, b) keep the existing blob and skip uploading the file, or c) append a suffix to the name so that it does not conflict with other files.
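The three dispositions can be sketched as a decision helper. The option labels (`overwrite`, `skip`, `rename`) and the suffix scheme are illustrative, not actual API values, and the function performs no upload:

```python
def resolve_name(name, existing_blobs, disposition):
    # (a) upload, replacing any existing blob with the same name
    if name not in existing_blobs or disposition == "overwrite":
        return name
    # (b) keep the existing blob and skip uploading the file
    if disposition == "skip":
        return None
    # (c) append a suffix until the name no longer conflicts
    suffix = 1
    while f"{name}({suffix})" in existing_blobs:
        suffix += 1
    return f"{name}({suffix})"

print(resolve_name("photo.jpg", {"photo.jpg"}, "rename"))
# photo.jpg(1)
```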

## Obtaining Your Shipping Location
- Before creating an import job, you need to obtain a shipping location name and address by calling the [List Locations](../importexport/List-Locations2.md) operation. `List Locations` will return a list of locations and their mailing addresses based on the location of your storage account. You can select a location from the returned list and ship your hard drives to that address. Note that `List Locations` returns only one possible shipping location.
+ Before creating an import job, you need to obtain a shipping location name and address by calling the [List Locations](/rest/api/storageservices/importexport/List-Locations2) operation. `List Locations` will return a list of locations and their mailing addresses based on the location of your storage account. You can select a location from the returned list and ship your hard drives to that address. Note that `List Locations` returns only one possible shipping location.

Follow the steps below to obtain the shipping location:

- - Identify the name of the location of your storage account. This value can be found under the **Location** field on the storage account’s **Dashboard** in the classic portal or queried for by using the service management API operation [Get Storage Account Properties](../fileservices/Get%20Storage%20Account%20Properties1.md).
+ - Identify the name of the location of your storage account. This value can be found under the **Location** field on the storage account’s **Dashboard** in the classic portal, or queried by using the service management API operation [Get Storage Account Properties](/rest/api/storagerp/storageaccounts#StorageAccounts_GetProperties).

- Retrieve a list of locations that are available to process this storage account by calling the `List Locations` operation with the query parameter `originlocation=<location-name>`. The list returned will contain one or more locations to which you can ship your drives.

@@ -70,7 +70,7 @@ Creating an import job for the Microsoft Azure Import/Export service using the R
> `List Locations` either returns one possible location, or none. If no locations are returned, the service is not yet available for your storage account.
## Creating the Import Job
- To create the import job, call the [Put Job](../importexport/Put-Job.md) operation. You will need to provide the following information:
+ To create the import job, call the [Put Job](/rest/api/storageservices/importexport/Put-Job) operation. You will need to provide the following information:

- A name for the job.

@@ -101,7 +101,7 @@ Creating an import job for the Microsoft Azure Import/Export service using the R
> You must ship your drives via a supported carrier service, which will provide a tracking number for your package.
## Updating the Import Job with Your Shipping Information
- After you have your tracking number, call the [Update Job Properties](../importexport/Update-Job-Properties.md) operation to update the shipping carrier name, the tracking number for the job, and the carrier account number for return shipping. You can optionally specify the number of drives and the shipping date as well.
+ After you have your tracking number, call the [Update Job Properties](/rest/api/storageservices/importexport/Update-Job-Properties) operation to update the shipping carrier name, the tracking number for the job, and the carrier account number for return shipping. You can optionally specify the number of drives and the shipping date as well.

## See Also
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
@@ -19,11 +19,11 @@ ms.author: renash
---

# Diagnostics and Error Recovery for Import-Export Jobs
- For each drive processed, the Azure Import/Export service creates an error log in the associated storage account. You can also enable verbose logging by setting the `EnableVerboseLog` property to `true` when calling the [Put Job](../importexport/Put-Job.md) or [Update Job Properties](../importexport/Update-Job-Properties.md) operations.
+ For each drive processed, the Azure Import/Export service creates an error log in the associated storage account. You can also enable verbose logging by setting the `EnableVerboseLog` property to `true` when calling the [Put Job](/rest/api/storageservices/importexport/Put-Job) or [Update Job Properties](/rest/api/storageservices/importexport/Update-Job-Properties) operations.

By default, logs are written to a container named `waimportexport`. You can specify a different name by setting the `ImportExportStatesPath` property when calling the `Put Job` or `Update Job Properties` operations. The logs are stored as block blobs with the following naming convention: `waies/jobname_driveid_timestamp_logtype.xml`.
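Conversely, a log blob name of the documented form can be split back into its parts. This is a hypothetical helper that assumes the individual fields contain no underscores; only the `waies/jobname_driveid_timestamp_logtype.xml` convention comes from the text:

```python
def parse_log_blob_name(blob_name):
    # Strip the documented prefix and extension, then split the four fields.
    stem = blob_name.removeprefix("waies/").removesuffix(".xml")
    job_name, drive_id, timestamp, log_type = stem.split("_")
    return {"job": job_name, "drive": drive_id,
            "timestamp": timestamp, "type": log_type}

print(parse_log_blob_name("waies/importjob1_WD0123_20161214T120000Z_error.xml"))
# {'job': 'importjob1', 'drive': 'WD0123', 'timestamp': '20161214T120000Z', 'type': 'error'}
```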

- You can retrieve the URI of the logs for a job by calling the [Get Job](../importexport/Get-Job3.md) operation. The URI for the verbose log is returned in the `VerboseLogUri` property for each drive, while the URI for the error log is returned in the `ErrorLogUri` property.
+ You can retrieve the URI of the logs for a job by calling the [Get Job](/rest/api/storageservices/importexport/Get-Job3) operation. The URI for the verbose log is returned in the `VerboseLogUri` property for each drive, while the URI for the error log is returned in the `ErrorLogUri` property.

You can use the logging data to identify the following issues:

2 changes: 1 addition & 1 deletion articles/storage/storage-import-export-enumerating-jobs.md
@@ -19,7 +19,7 @@ ms.author: renash
---

# Enumerating Jobs
- To enumerate all jobs in a subscription, call the [List Jobs](../importexport/List-Jobs3.md) operation. `List Jobs` returns a list of jobs as well as the following attributes:
+ To enumerate all jobs in a subscription, call the [List Jobs](/rest/api/storageservices/importexport/List-Jobs3) operation. `List Jobs` returns a list of jobs as well as the following attributes:

- The type of job (Import or Export)

6 changes: 3 additions & 3 deletions articles/storage/storage-import-export-file-format-log.md
@@ -25,7 +25,7 @@ There are two logs that may be written by the Import/Export service:

- The error log is always generated in the event of an error.

- - The verbose log is not enabled by default, but may be enabled by setting the `EnableVerboseLog` property on a [Put Job](../importexport/Put-Job.md) or [Update Job Properties](../importexport/Update-Job-Properties.md) operation.
+ - The verbose log is not enabled by default, but may be enabled by setting the `EnableVerboseLog` property on a [Put Job](/rest/api/storageservices/importexport/Put-Job) or [Update Job Properties](/rest/api/storageservices/importexport/Update-Job-Properties) operation.

## Log File Location
The logs are written to block blobs in the container or virtual directory specified by the `ImportExportStatesPath` setting, which you can set on a `Put Job` operation. The location to which the logs are written depends on how authentication is specified for the job, together with the value specified for `ImportExportStatesPath`. Authentication for the job may be specified via a storage account key, or a container SAS (shared access signature).
@@ -41,7 +41,7 @@ The table below shows the possible options:
|Container SAS|Default value|A virtual directory named `waimportexport`, which is the default name, beneath the container specified in the SAS.<br /><br /> For example, if the SAS specified for the job is `https://myaccount.blob.core.windows.net/mylogcontainer?sv=2012-02-12&se=2015-05-22T06%3A54%3A55Z&sr=c&sp=wl&sig=sigvalue`, then the log location would be `https://myaccount.blob.core.windows.net/mylogcontainer/waimportexport`|
|Container SAS|User-specified value|A virtual directory named by the user, beneath the container specified in the SAS.<br /><br /> For example, if the SAS specified for the job is `https://myaccount.blob.core.windows.net/mylogcontainer?sv=2012-02-12&se=2015-05-22T06%3A54%3A55Z&sr=c&sp=wl&sig=sigvalue`, and the specified virtual directory is named `mylogblobs`, then the log location would be `https://myaccount.blob.core.windows.net/mylogcontainer/waimportexport/mylogblobs`.|
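The container-SAS rows of the table above can be sketched as a hypothetical helper that simply reproduces the table's examples, dropping the SAS query string and appending the states-path directory:

```python
def log_location(container_sas_url, states_path="waimportexport"):
    # Logs land in a virtual directory named by ImportExportStatesPath
    # (default "waimportexport") under the container named in the SAS.
    container = container_sas_url.partition("?")[0]  # drop the SAS query string
    return f"{container}/{states_path}"

print(log_location(
    "https://myaccount.blob.core.windows.net/mylogcontainer?sv=2012-02-12&sig=sigvalue"))
# https://myaccount.blob.core.windows.net/mylogcontainer/waimportexport
```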

- You can retrieve the URL for the error and verbose logs by calling the [Get Job](../importexport/Get-Job3.md) operation. The logs are available after processing of the drive is complete.
+ You can retrieve the URL for the error and verbose logs by calling the [Get Job](/rest/api/storageservices/importexport/Get-Job3) operation. The logs are available after processing of the drive is complete.

## Log File Format
The format for both logs is the same: a blob containing XML descriptions of the events that occurred while copying blobs between the hard drive and the customer's account.
@@ -355,4 +355,4 @@ The following error log for an export job indicates that the blob content has be
```

## See Also
- [Storage Import/Export REST](../importexport/Storage-Import-Export-Service-REST-API-Reference.md)
+ [Storage Import/Export REST](/rest/api/storageservices/importexport/Storage-Import-Export-Service-REST-API-Reference)