---
title: Move data from SAP Business Warehouse using Azure Data Factory | Microsoft Docs
description: Learn about how to move data from SAP Business Warehouse using Azure Data Factory.
services: data-factory
documentationcenter: ''
author: linda33wj
manager: craigg
editor: ''
ms.service: data-factory
ms.workload: data-services
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: conceptual
ms.date: 01/10/2018
ms.author: jingwang
robots: noindex
---
> [!NOTE]
> This article applies to version 1 of Data Factory. If you are using the current version of the Data Factory service, see SAP Business Warehouse connector in V2.
This article explains how to use the Copy Activity in Azure Data Factory to move data from an on-premises SAP Business Warehouse (BW). It builds on the Data Movement Activities article, which presents a general overview of data movement with the copy activity.
You can copy data from an on-premises SAP Business Warehouse data store to any supported sink data store. For a list of data stores supported as sinks by the copy activity, see the Supported data stores table. Data Factory currently supports only moving data from SAP Business Warehouse to other data stores, not moving data from other data stores to SAP Business Warehouse.
This connector supports SAP Business Warehouse version 7.x. It supports copying data from InfoCubes and QueryCubes (including BEx queries) using MDX queries.
To enable the connectivity to the SAP BW instance, install the following components:
- Data Management Gateway: The Data Factory service supports connecting to on-premises data stores (including SAP Business Warehouse) by using a component called Data Management Gateway. To learn about Data Management Gateway and for step-by-step instructions on setting up the gateway, see the Moving data between on-premises locations and cloud article. The gateway is required even if the SAP Business Warehouse is hosted in an Azure IaaS virtual machine (VM). You can install the gateway on the same VM as the data store or on a different VM, as long as the gateway can connect to the database.
- SAP NetWeaver library on the gateway machine: You can get the SAP NetWeaver library from your SAP administrator, or directly from the SAP Software Download Center. Search for SAP Note #1025361 to get the download location for the most recent version. Make sure that the architecture of the SAP NetWeaver library (32-bit or 64-bit) matches your gateway installation, and then install all files included in the SAP NetWeaver RFC SDK according to the SAP Note. The SAP NetWeaver library is also included in the SAP Client Tools installation.
> [!TIP]
> Place the DLLs extracted from the SAP NetWeaver RFC SDK in the system32 folder.
You can create a pipeline with a copy activity that moves data from an on-premises SAP Business Warehouse data store by using different tools/APIs.
- The easiest way to create a pipeline is to use the Copy Wizard. See Tutorial: Create a pipeline using Copy Wizard for a quick walkthrough on creating a pipeline using the Copy data wizard.
- You can also use the following tools to create a pipeline: Azure portal, Visual Studio, Azure PowerShell, Azure Resource Manager template, .NET API, and REST API. See Copy activity tutorial for step-by-step instructions to create a pipeline with a copy activity.
Whether you use the tools or APIs, you perform the following steps to create a pipeline that moves data from a source data store to a sink data store:
- Create linked services to link input and output data stores to your data factory.
- Create datasets to represent input and output data for the copy operation.
- Create a pipeline with a copy activity that takes a dataset as an input and a dataset as an output.
When you use the wizard, JSON definitions for these Data Factory entities (linked services, datasets, and the pipeline) are automatically created for you. When you use tools/APIs (except .NET API), you define these Data Factory entities by using the JSON format. For a sample with JSON definitions for Data Factory entities that are used to copy data from an on-premises SAP Business Warehouse, see JSON example: Copy data from SAP Business Warehouse to Azure Blob section of this article.
The following sections provide details about JSON properties that are used to define Data Factory entities specific to an SAP BW data store:
The following table describes the JSON elements that are specific to the SAP Business Warehouse (BW) linked service.
Property | Description | Allowed values | Required |
---|---|---|---|
server | Name of the server on which the SAP BW instance resides. | string | Yes |
systemNumber | System number of the SAP BW system. | Two-digit decimal number represented as a string. | Yes |
clientId | Client ID of the client in the SAP BW system. | Three-digit decimal number represented as a string. | Yes |
username | Name of the user who has access to the SAP server. | string | Yes |
password | Password for the user. | string | Yes |
gatewayName | Name of the gateway that the Data Factory service should use to connect to the on-premises SAP BW instance. | string | Yes |
encryptedCredential | The encrypted credential string. | string | No |
For a full list of sections & properties available for defining datasets, see the Creating datasets article. Sections such as structure, availability, and policy of a dataset JSON are similar for all dataset types (Azure SQL, Azure blob, Azure table, etc.).
The typeProperties section is different for each type of dataset and provides information about the location of the data in the data store. There are no type-specific properties supported for the SAP BW dataset of type RelationalTable.
For a full list of sections & properties available for defining activities, see the Creating Pipelines article. Properties such as name, description, input and output tables, and policies are available for all types of activities.
The properties available in the typeProperties section of the activity vary with each activity type. For the Copy activity, they vary depending on the types of sources and sinks.
When the source in a copy activity is of type RelationalSource (which includes SAP BW), the following properties are available in the typeProperties section:
Property | Description | Allowed values | Required |
---|---|---|---|
query | Specifies the MDX query to read data from the SAP BW instance. | MDX query. | Yes |
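For example, a copy activity source that uses an MDX query might look like the following snippet. This is only an illustrative sketch: the cube and dimension names ($0D_NW_C01, 0MATERIAL) are placeholders, so substitute the identifiers exposed by your own SAP BW system.
"source": {
    "type": "RelationalSource",
    "query": "SELECT [Measures].MEMBERS ON COLUMNS, NON EMPTY [0MATERIAL].MEMBERS ON ROWS FROM [$0D_NW_C01]"
}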
The following example provides sample JSON definitions that you can use to create a pipeline by using the Azure portal, Visual Studio, or Azure PowerShell. This sample shows how to copy data from an on-premises SAP Business Warehouse to Azure Blob storage. However, data can be copied directly to any of the supported sinks by using the Copy Activity in Azure Data Factory.
> [!IMPORTANT]
> This sample provides JSON snippets. It does not include step-by-step instructions for creating the data factory. For step-by-step instructions, see the Moving data between on-premises locations and cloud article.
The sample has the following data factory entities:
- A linked service of type SapBw.
- A linked service of type AzureStorage.
- An input dataset of type RelationalTable.
- An output dataset of type AzureBlob.
- A pipeline with Copy Activity that uses RelationalSource and BlobSink.
The sample copies data from an SAP Business Warehouse instance to an Azure blob hourly. The JSON properties used in these samples are described in sections following the samples.
As a first step, set up the Data Management Gateway. The instructions are in the Moving data between on-premises locations and cloud article.
This linked service links your SAP BW instance to the data factory. The type property is set to SapBw. The typeProperties section provides connection information for the SAP BW instance.
{
"name": "SapBwLinkedService",
"properties":
{
"type": "SapBw",
"typeProperties":
{
"server": "<server name>",
"systemNumber": "<system number>",
"clientId": "<client id>",
"username": "<SAP user>",
"password": "<Password for SAP user>",
"gatewayName": "<gateway name>"
}
}
}
This linked service links your Azure Storage account to the data factory. The type property is set to AzureStorage. The typeProperties section provides connection information for the Azure Storage account.
{
"name": "AzureStorageLinkedService",
"properties": {
"type": "AzureStorage",
"typeProperties": {
"connectionString": "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<accountkey>"
}
}
}
This dataset defines the SAP Business Warehouse dataset. You set the type of the Data Factory dataset to RelationalTable. Currently, you do not specify any type-specific properties for an SAP BW dataset. The query in the Copy Activity definition specifies what data to read from the SAP BW instance.
Setting the external property to true informs the Data Factory service that the dataset is external to the data factory and is not produced by an activity in the data factory.
The frequency and interval properties define the schedule. In this case, the data is read from the SAP BW instance hourly.
{
"name": "SapBwDataset",
"properties": {
"type": "RelationalTable",
"linkedServiceName": "SapBwLinkedService",
"typeProperties": {},
"availability": {
"frequency": "Hour",
"interval": 1
},
"external": true
}
}
This dataset defines the output Azure Blob dataset. The type property is set to AzureBlob. The typeProperties section specifies where the data copied from the SAP BW instance is stored. The data is written to a new blob every hour (frequency: Hour, interval: 1). The folder path for the blob is dynamically evaluated based on the start time of the slice that is being processed, using the year, month, day, and hour parts of the start time. For example, the slice that starts at 2017-03-01 18:00 UTC is written to mycontainer/sapbw/yearno=2017/monthno=03/dayno=01/hourno=18.
{
"name": "AzureBlobDataSet",
"properties": {
"type": "AzureBlob",
"linkedServiceName": "AzureStorageLinkedService",
"typeProperties": {
"folderPath": "mycontainer/sapbw/yearno={Year}/monthno={Month}/dayno={Day}/hourno={Hour}",
"format": {
"type": "TextFormat",
"rowDelimiter": "\n",
"columnDelimiter": "\t"
},
"partitionedBy": [
{
"name": "Year",
"value": {
"type": "DateTime",
"date": "SliceStart",
"format": "yyyy"
}
},
{
"name": "Month",
"value": {
"type": "DateTime",
"date": "SliceStart",
"format": "MM"
}
},
{
"name": "Day",
"value": {
"type": "DateTime",
"date": "SliceStart",
"format": "dd"
}
},
{
"name": "Hour",
"value": {
"type": "DateTime",
"date": "SliceStart",
"format": "HH"
}
}
]
},
"availability": {
"frequency": "Hour",
"interval": 1
}
}
}
The pipeline contains a Copy Activity that is configured to use the input and output datasets and is scheduled to run every hour. In the pipeline JSON definition, the source type is set to RelationalSource (for the SAP BW source) and the sink type is set to BlobSink. The MDX query specified for the query property selects the data from the past hour to copy.
{
"name": "CopySapBwToBlob",
"properties": {
"description": "pipeline for copy activity",
"activities": [
{
"type": "Copy",
"typeProperties": {
"source": {
"type": "RelationalSource",
"query": "<MDX query for SAP BW>"
},
"sink": {
"type": "BlobSink",
"writeBatchSize": 0,
"writeBatchTimeout": "00:00:00"
}
},
"inputs": [
{
"name": "SapBwDataset"
}
],
"outputs": [
{
"name": "AzureBlobDataSet"
}
],
"policy": {
"timeout": "01:00:00",
"concurrency": 1
},
"scheduler": {
"frequency": "Hour",
"interval": 1
},
"name": "SapBwToBlob"
}
],
"start": "2017-03-01T18:00:00Z",
"end": "2017-03-01T19:00:00Z"
}
}
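In practice, the placeholder MDX query is often parameterized so that each slice reads only its own window of data. The following sketch shows one way to do that; it assumes your InfoCube exposes a calendar-day characteristic such as 0CALDAY and uses the Data Factory $$Text.format function with the SliceStart system variable to filter to the slice's calendar day. The cube and member names are placeholders, and you may need to adjust the MDX and the filter granularity to match your own system.
"source": {
    "type": "RelationalSource",
    "query": "$$Text.format('SELECT [Measures].MEMBERS ON COLUMNS, NON EMPTY [0MATERIAL].MEMBERS ON ROWS FROM [$0D_NW_C01] WHERE ( [0CALDAY].[{0:yyyyMMdd}] )', SliceStart)"
}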
As mentioned in the data movement activities article, Copy activity performs automatic type conversions from source types to sink types with the following two-step approach:
- Convert from native source types to .NET types
- Convert from .NET types to native sink types
When moving data from SAP BW, the following mappings are used from SAP BW types to .NET types.
Data type in the ABAP Dictionary | .NET data type |
---|---|
ACCP | Int |
CHAR | String |
CLNT | String |
CURR | Decimal |
CUKY | String |
DEC | Decimal |
FLTP | Double |
INT1 | Byte |
INT2 | Int16 |
INT4 | Int |
LANG | String |
LCHR | String |
LRAW | Byte[] |
PREC | Int16 |
QUAN | Decimal |
RAW | Byte[] |
RAWSTRING | Byte[] |
STRING | String |
UNIT | String |
DATS | String |
NUMC | String |
TIMS | String |
> [!NOTE]
> To map columns from the source dataset to columns in the sink dataset, see Mapping dataset columns in Azure Data Factory.
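If you need explicit column mapping (for example, when the MDX result columns do not match the sink column names), you can add a translator section alongside the source and sink in the copy activity's typeProperties, as in the following sketch. The column names shown here are placeholders.
"translator": {
    "type": "TabularTranslator",
    "columnMappings": "SourceColumn1: SinkColumn1, SourceColumn2: SinkColumn2"
}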
When copying data from relational data stores, keep repeatability in mind to avoid unintended outcomes. In Azure Data Factory, you can rerun a slice manually. You can also configure a retry policy for a dataset so that a slice is rerun when a failure occurs. When a slice is rerun in either way, make sure that the same data is read no matter how many times the slice is run. See Repeatable read from relational sources.
See Copy Activity Performance & Tuning Guide to learn about key factors that impact performance of data movement (Copy Activity) in Azure Data Factory and various ways to optimize it.
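As a simple illustration, one lever available in version 1 is the concurrency setting in the activity policy, which controls how many slices are processed in parallel. The value below is only an example, not a recommendation; a higher setting also increases the load on the SAP BW system and on the gateway machine.
"policy": {
    "timeout": "01:00:00",
    "concurrency": 4
}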