[stg] link pass MicrosoftDocs#1: update local conceptual links
mmacy committed Dec 13, 2016
1 parent e07274d commit a6e5253
Showing 19 changed files with 70 additions and 70 deletions.
@@ -24,4 +24,4 @@ Drive manifests can be automatically backed up to blobs by setting the `BackupDr
You can retrieve the URI of the backup drive manifests for a job by calling the [Get Job](../importexport/Get-Job3.md) operation. The blob URI is returned in the `ManifestUri` property for each drive.

## See Also
[Using the Import/Export Service REST API](../importexport/Using-the-Azure-Import-Export-Service-REST-API.md)
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
@@ -28,4 +28,4 @@ You can request that a job be cancelled before it is in the `Packaging` state by
To delete a job, call the [Delete Job](../importexport/Delete-Job1.md) operation before the job has shipped (*i.e.*, while the job is in the `Creating` state). You can also delete a job when it is in the `Completed` state. After a job has been deleted, its information and status are no longer accessible via the REST API or the Azure Management Portal.

## See Also
[Using the Import/Export Service REST API](../importexport/Using-the-Azure-Import-Export-Service-REST-API.md)
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
@@ -82,7 +82,7 @@ Creating an export job for the Microsoft Azure Import/Export service using the R
- The list of blobs (or blob prefixes) to be exported.

## Shipping Your Drives
Next, use the Azure Import/Export tool to determine the number of drives you need to send, based on the blobs you have selected to be exported and the drive size. See the [Azure Import-Export Tool Reference](../importexport/Azure-Import-Export-Tool-Reference.md) for details.
Next, use the Azure Import/Export tool to determine the number of drives you need to send, based on the blobs you have selected to be exported and the drive size. See the [Azure Import-Export Tool Reference](storage-import-export-tool-how-to-v1.md) for details.

Pack the drives in a single package and ship them to the address obtained in the earlier step. Note the tracking number of your package for the next step.

@@ -96,4 +96,4 @@ Creating an export job for the Microsoft Azure Import/Export service using the R
After your export job has been processed, your drives will be returned to you with your encrypted data. You can retrieve the BitLocker key for each of the drives by calling the [Get Job](../importexport/Get-Job3.md) operation. You can then unlock the drive using the key. The drive manifest file on each drive contains the list of files on the drive, as well as the original blob address for each file.

## See Also
[Using the Import/Export Service REST API](../importexport/Using-the-Azure-Import-Export-Service-REST-API.md)
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
@@ -35,7 +35,7 @@ Creating an import job for the Microsoft Azure Import/Export service using the R
## Preparing Drives with the Azure Import/Export Tool
The steps to prepare drives for an import job are the same whether you create the job via the portal or via the REST API.

Below is a brief overview of drive preparation. Refer to the [Azure Import-Export Tool Reference](../importexport/Azure-Import-Export-Tool-Reference.md) for complete instructions. You can download the Azure Import/Export tool [here](http://go.microsoft.com/fwlink/?LinkID=301900).
Below is a brief overview of drive preparation. Refer to the [Azure Import-Export Tool Reference](storage-import-export-tool-how-to-v1.md) for complete instructions. You can download the Azure Import/Export tool [here](http://go.microsoft.com/fwlink/?LinkID=301900).

Preparing your drive involves:

@@ -104,4 +104,4 @@ Creating an import job for the Microsoft Azure Import/Export service using the R
After you have your tracking number, call the [Update Job Properties](../importexport/Update-Job-Properties.md) operation to update the shipping carrier name, the tracking number for the job, and the carrier account number for return shipping. You can optionally specify the number of drives and the shipping date as well.

## See Also
[Using the Import/Export Service REST API](../importexport/Using-the-Azure-Import-Export-Service-REST-API.md)
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
@@ -51,7 +51,7 @@ For each drive processed, the Azure Import/Export service creates an error log i

- Incorrect schema for the blob properties and/or metadata files

There may be cases where some parts of an import or export job do not complete successfully, while the overall job still completes. In this case, you can either upload or download the missing pieces of the data over the network, or you can create a new job to transfer the data. See the [Azure Import-Export Tool Reference](../importexport/Azure-Import-Export-Tool-Reference.md) to learn how to repair the data over the network.
There may be cases where some parts of an import or export job do not complete successfully, while the overall job still completes. In this case, you can either upload or download the missing pieces of the data over the network, or you can create a new job to transfer the data. See the [Azure Import-Export Tool Reference](storage-import-export-tool-how-to-v1.md) to learn how to repair the data over the network.

## See Also
[Using the Import/Export Service REST API](../importexport/Using-the-Azure-Import-Export-Service-REST-API.md)
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
2 changes: 1 addition & 1 deletion articles/storage/storage-import-export-enumerating-jobs.md
@@ -30,4 +30,4 @@ To enumerate all jobs in a subscription, call the [List Jobs](../importexport/Li
- The time the job was created

## See Also
[Using the Import/Export Service REST API](../importexport/Using-the-Azure-Import-Export-Service-REST-API.md)
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
@@ -87,4 +87,4 @@ You can call the [Get Job](../importexport/Get-Job3.md) operation to retrieve in
When a job or drive fails to progress normally through its expected life cycle, the job or drive will be moved into a `Faulted` state. At that point, the operations team will contact the customer by email or phone. Once the issue is resolved, the faulted job or drive will be taken out of the `Faulted` state and moved into the appropriate state.

## See Also
[Using the Import/Export Service REST API](../importexport/Using-the-Azure-Import-Export-Service-REST-API.md)
[Using the Import/Export Service REST API](storage-import-export-using-the-rest-api.md)
12 changes: 6 additions & 6 deletions articles/storage/storage-import-export-tool-how-to-v1.md
@@ -22,17 +22,17 @@ ms.author: renash
You use the Microsoft Azure Import/Export tool to prepare hard drives for an import job, to repair an import job, or to repair an export job.

## In This Section
[Setting Up the Azure Import-Export Tool](../importexport/Setting-Up-the-Azure-Import-Export-Tool.md)
[Setting Up the Azure Import-Export Tool](storage-import-export-tool-setup-v1.md)

[Preparing Hard Drives for an Import Job](../importexport/Preparing-Hard-Drives-for-an-Import-Job.md)
[Preparing Hard Drives for an Import Job](storage-import-export-tool-preparing-hard-drives-import-v1.md)

[Reviewing Job Status with Copy Log Files](../importexport/Reviewing-Job-Status-with-Copy-Log-Files.md)
[Reviewing Job Status with Copy Log Files](storage-import-export-tool-reviewing-job-status-v1.md)

[Repairing an Import Job](../importexport/Repairing-an-Import-Job.md)
[Repairing an Import Job](storage-import-export-tool-repairing-an-import-job-v1.md)

[Repairing an Export Job](../importexport/Repairing-an-Export-Job.md)
[Repairing an Export Job](storage-import-export-tool-repairing-an-export-job-v1.md)

[Troubleshooting the Azure Import-Export Tool](../importexport/Troubleshooting-the-Azure-Import-Export-Tool.md)
[Troubleshooting the Azure Import-Export Tool](storage-import-export-tool-troubleshooting-v1.md)

## See Also
[Storage Import/Export REST](../importexport/Storage-Import-Export-Service-REST-API-Reference.md)
@@ -68,10 +68,10 @@ To prepare one or more hard drives for an import job, follow these steps:

- A copy session can copy either a single directory or a single blob to the drive. If you are copying multiple directories, multiple blobs, or a combination of both, you'll need to create multiple copy sessions.

- You can specify properties and metadata that will be set on the blobs imported as part of an import job. The properties or metadata that you specify for a copy session will apply to all blobs specified by that copy session. If you want to specify different properties or metadata for some blobs, you'll need to create a separate copy session. See [Setting Properties and Metadata during the Import Process](../importexport/Setting-Properties-and-Metadata-during-the-Import-Process.md) for more information.
- You can specify properties and metadata that will be set on the blobs imported as part of an import job. The properties or metadata that you specify for a copy session will apply to all blobs specified by that copy session. If you want to specify different properties or metadata for some blobs, you'll need to create a separate copy session. See [Setting Properties and Metadata during the Import Process](storage-import-export-tool-setting-properties-metadata-import-v1.md) for more information.

> [!NOTE]
> If you have multiple machines that meet the requirements outlined in [Setting Up the Azure Import-Export Tool](../importexport/Setting-Up-the-Azure-Import-Export-Tool.md), you can copy data to multiple hard drives in parallel by running an instance of this tool on each machine.
> If you have multiple machines that meet the requirements outlined in [Setting Up the Azure Import-Export Tool](storage-import-export-tool-setup-v1.md), you can copy data to multiple hard drives in parallel by running an instance of this tool on each machine.
For each hard drive that you prepare with the Azure Import/Export tool, the tool will create a single journal file. You will need the journal files from all of your drives to create the import job. The journal file can also be used to resume drive preparation if the tool is interrupted.

@@ -135,8 +135,8 @@ To prepare one or more hard drives for an import job, follow these steps:
|**/dstdir:**<DestinationBlobVirtualDirectory\>|`Required.` The path to the destination virtual directory in your Windows Azure storage account. The virtual directory may or may not already exist.<br /><br /> You can specify a container, or a blob prefix like `music/70s/`. The destination directory must begin with the container name, followed by a forward slash "/", and optionally may include a virtual blob directory that ends with "/".<br /><br /> When the destination container is the root container, you must explicitly specify the root container, including the forward slash, as `$root/`. Since blobs under the root container cannot include "/" in their names, any subdirectories in the source directory will not be copied when the destination directory is the root container.<br /><br /> Be sure to use valid container names when specifying destination virtual directories or blobs. Keep in mind that container names must be lowercase. For container naming rules, see [Naming and Referencing Containers, Blobs, and Metadata](../fileservices/Naming%20and%20Referencing%20Containers,%20Blobs,%20and%20Metadata.md).|
|**/Disposition:**<rename&#124;no-overwrite&#124;overwrite>|`Optional.` Specifies the behavior when a blob with the specified address already exists. Valid values for this parameter are: `rename`, `no-overwrite` and `overwrite`. Note that these values are case-sensitive. If no value is specified, the default is `rename`.<br /><br /> The value specified for this parameter affects all the files in the directory specified by the `/srcdir` parameter.|
|**/BlobType:**<BlockBlob&#124;PageBlob>|`Optional.` Specifies the blob type for the destination blobs. Valid values are: `BlockBlob` and `PageBlob`. Note that these values are case-sensitive. If no value is specified, the default is `BlockBlob`.<br /><br /> In most cases, `BlockBlob` is recommended. If you specify `PageBlob`, the length of each file in the directory must be a multiple of 512, the size of a page for page blobs.|
|**/PropertyFile:**<PropertyFile\>|`Optional.` Path to the property file for the destination blobs. See [Import-Export Service Metadata and Properties File Format](../importexport/Import-Export-Service-Metadata-and-Properties-File-Format.md) for more information.|
|**/MetadataFile:**<MetadataFile\>|`Optional.` Path to the metadata file for the destination blobs. See [Import-Export Service Metadata and Properties File Format](../importexport/Import-Export-Service-Metadata-and-Properties-File-Format.md) for more information.|
|**/PropertyFile:**<PropertyFile\>|`Optional.` Path to the property file for the destination blobs. See [Import-Export Service Metadata and Properties File Format](storage-import-export-file-format-metadata-and-properties.md) for more information.|
|**/MetadataFile:**<MetadataFile\>|`Optional.` Path to the metadata file for the destination blobs. See [Import-Export Service Metadata and Properties File Format](storage-import-export-file-format-metadata-and-properties.md) for more information.|
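
To illustrate how the directory-copy parameters above fit together, here is a minimal sketch of a copy-session command. The `/j:` journal-file and `/id:` session parameters mirror the abort example later in this article; any drive- or account-level parameters that a first copy session may also require (such as the storage account key or target drive letter) are omitted, and the session name, paths, and property file are hypothetical.

```
rem Hedged sketch: copies one local directory into a blob virtual directory in a single copy session.
WAImportExport.exe PrepImport /j:FirstDrive.jrn /id:session#2 /srcdir:d:\Photos\Summer /dstdir:photos/summer/ /Disposition:no-overwrite /BlobType:BlockBlob /PropertyFile:d:\Photos\props.xml
```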

### Parameters for Copying a Single File
When copying a single file, the following required and optional parameters apply:
@@ -147,8 +147,8 @@ To prepare one or more hard drives for an import job, follow these steps:
|**/dstblob:**<DestinationBlobPath\>|`Required.` The path to the destination blob in your Windows Azure storage account. The blob may or may not already exist.<br /><br /> Specify the blob name beginning with the container name. The blob name cannot start with "/" or the storage account name. For blob naming rules, see [Naming and Referencing Containers, Blobs, and Metadata](../fileservices/Naming%20and%20Referencing%20Containers,%20Blobs,%20and%20Metadata.md).<br /><br /> When the destination container is the root container, you must explicitly specify `$root` as the container, such as `$root/sample.txt`. Note that blobs under the root container cannot include "/" in their names.|
|**/Disposition:**<rename&#124;no-overwrite&#124;overwrite>|`Optional.` Specifies the behavior when a blob with the specified address already exists. Valid values for this parameter are: `rename`, `no-overwrite` and `overwrite`. Note that these values are case-sensitive. If no value is specified, the default is `rename`.|
|**/BlobType:**<BlockBlob&#124;PageBlob>|`Optional.` Specifies the blob type for the destination blobs. Valid values are: `BlockBlob` and `PageBlob`. Note that these values are case-sensitive. If no value is specified, the default is `BlockBlob`.<br /><br /> In most cases, `BlockBlob` is recommended. If you specify `PageBlob`, the length of each file in the directory must be a multiple of 512, the size of a page for page blobs.|
|**/PropertyFile:**<PropertyFile\>|`Optional.` Path to the property file for the destination blobs. See [Import-Export Service Metadata and Properties File Format](../importexport/Import-Export-Service-Metadata-and-Properties-File-Format.md) for more information.|
|**/MetadataFile:**<MetadataFile\>|`Optional.` Path to the metadata file for the destination blobs. See [Import-Export Service Metadata and Properties File Format](../importexport/Import-Export-Service-Metadata-and-Properties-File-Format.md) for more information.|
|**/PropertyFile:**<PropertyFile\>|`Optional.` Path to the property file for the destination blobs. See [Import-Export Service Metadata and Properties File Format](storage-import-export-file-format-metadata-and-properties.md) for more information.|
|**/MetadataFile:**<MetadataFile\>|`Optional.` Path to the metadata file for the destination blobs. See [Import-Export Service Metadata and Properties File Format](storage-import-export-file-format-metadata-and-properties.md) for more information.|
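
A companion sketch for the single-file case follows, using the `/dstblob:` parameter from the table above. The `/srcfile:` parameter naming the local source file is an assumption not shown in this excerpt, and the session name and paths are hypothetical.

```
rem Hedged sketch: copies a single local file to a specific destination blob.
WAImportExport.exe PrepImport /j:FirstDrive.jrn /id:session#3 /srcfile:d:\Video\intro.mp4 /dstblob:video/2016/intro.mp4 /Disposition:rename /BlobType:BlockBlob
```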

### Resuming an Interrupted Copy Session
If a copy session is interrupted for any reason, you can resume it by running the tool with only the journal file specified:
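
A minimal sketch of that resume command is shown below; the `/ResumeSession` switch is an assumption, named by analogy with the `/AbortSession` switch that follows.

```
rem Hedged sketch: resumes the interrupted copy session recorded in the journal file.
WAImportExport.exe PrepImport /j:<JournalFile> /id:<SessionId> /ResumeSession
```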
@@ -172,11 +172,11 @@ WAImportExport.exe PrepImport /j:<JournalFile> /id:<SessionId> /AbortSession
Only the last copy session, if terminated abnormally, can be aborted. Note that you cannot abort the first copy session for a drive. Instead, you must restart the copy session with a new journal file.

## See Also
[Setting Up the Azure Import-Export Tool](../importexport/Setting-Up-the-Azure-Import-Export-Tool.md)
[Setting Properties and Metadata during the Import Process](../importexport/Setting-Properties-and-Metadata-during-the-Import-Process.md)
[Setting Up the Azure Import-Export Tool](storage-import-export-tool-setup-v1.md)
[Setting Properties and Metadata during the Import Process](storage-import-export-tool-setting-properties-metadata-import-v1.md)
[Sample Workflow to Prepare Hard Drives for an Import Job](../importexport/Sample-Workflow-to-Prepare-Hard-Drives-for-an-Import-Job.md)
[Quick Reference for Frequently Used Commands](../importexport/Quick-Reference-for-Frequently-Used-Commands-for-Import-Jobs.md)
[Reviewing Job Status with Copy Log Files](../importexport/Reviewing-Job-Status-with-Copy-Log-Files.md)
[Repairing an Import Job](../importexport/Repairing-an-Import-Job.md)
[Repairing an Export Job](../importexport/Repairing-an-Export-Job.md)
[Troubleshooting the Azure Import-Export Tool](../importexport/Troubleshooting-the-Azure-Import-Export-Tool.md)
[Quick Reference for Frequently Used Commands](storage-import-export-tool-quick-reference-v1.md) 
[Reviewing Job Status with Copy Log Files](storage-import-export-tool-reviewing-job-status-v1.md)
[Repairing an Import Job](storage-import-export-tool-repairing-an-import-job-v1.md)
[Repairing an Export Job](storage-import-export-tool-repairing-an-export-job-v1.md)
[Troubleshooting the Azure Import-Export Tool](storage-import-export-tool-troubleshooting-v1.md)
@@ -71,4 +71,4 @@ Number of drives needed: 3
```

## See Also
[Azure Import-Export Tool Reference](../importexport/Azure-Import-Export-Tool-Reference.md)
[Azure Import-Export Tool Reference](storage-import-export-tool-how-to-v1.md)
@@ -25,7 +25,7 @@ translation.priority.mt:
- zh-tw
---
# Quick Reference for Frequently Used Commands for Import Jobs
This section provides a quick reference for some frequently used commands. For detailed usage, see [Preparing Hard Drives for an Import Job](../importexport/Preparing-Hard-Drives-for-an-Import-Job.md).
This section provides a quick reference for some frequently used commands. For detailed usage, see [Preparing Hard Drives for an Import Job](storage-import-export-tool-preparing-hard-drives-import-v1.md).

## Copy a Single Directory to a Hard Drive
Here is a sample command to copy a single source directory to a hard drive that hasn't yet been encrypted with BitLocker:
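
The sketch below is illustrative only; the storage account key (`/sk:`), target drive letter (`/t:`), and the `/format` and `/encrypt` switches, along with the session name and paths, are assumptions not shown in this excerpt.

```
rem Hedged sketch: formats and BitLocker-encrypts drive x, then copies d:\Video into the "video" container.
WAImportExport.exe PrepImport /j:FirstDrive.jrn /id:session#1 /sk:<StorageAccountKey> /t:x /format /encrypt /srcdir:d:\Video /dstdir:video/
```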
