---
title: How to create a blob in Azure Storage using the Node.js SDK v2
description: Create a storage account and a container in object (Blob) storage. Then use the Azure Storage client library for Node.js v2 to upload a blob to Azure Storage, download a blob, and list the blobs in a container.
services: storage
author: tamram
ms.custom: mvc
ms.service: storage
ms.topic: conceptual
ms.date: 11/14/2018
ms.author: tamram
---

How to upload, download, and list blobs using Node.js SDK v2

In this how-to guide, you learn how to use Node.js to upload, download, and list blobs and manage containers with Azure Blob storage.

Prerequisites

If you don't have an Azure subscription, create a free account before you begin.

Create an Azure storage account in the Azure portal. For help creating the account, see Create a storage account.

Download the sample application

The sample application is a simple Node.js console application. To begin, clone the repository to your machine using the following command:

git clone https://github.com/Azure-Samples/storage-blobs-node-quickstart.git

To open the application, locate the storage-blobs-node-quickstart folder and open it in your favorite code editing environment.

[!INCLUDE storage-copy-connection-string-portal]

Configure your storage connection string

Before running the application, you must provide the connection string for your storage account. The sample repository includes a file named .env.example. Rename this file to .env by removing the .example extension. Then, inside the .env file, add your connection string value after the AZURE_STORAGE_CONNECTION_STRING key.
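For example, the finished .env file looks similar to the following, with placeholder values substituted for your own account name and key:

AZURE_STORAGE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=<your-account-name>;AccountKey=<your-account-key>;EndpointSuffix=core.windows.net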

Install required packages

In the application directory, run npm install to install the required packages for the application.

npm install

Run the sample

Now that the dependencies are installed, you can run the sample by issuing the following command:

npm start

The output from the script will be similar to the following:

Containers:
 - container-one
 - container-two
Container "demo" is created
Blob "quickstart.txt" is uploaded
Local file "./readme.md" is uploaded
Blobs in "demo" container:
 - quickstart.txt
 - readme.md
Downloaded blob content: "hello Node SDK"
Blob "quickstart.txt" is deleted
Container "demo" is deleted
Done

If you are using a new storage account for this example, you may not see any container names listed under the "Containers" label.

Understanding the code

The first expression is used to load values into environment variables.

if (process.env.NODE_ENV !== 'production') {
    require('dotenv').load();
}

The dotenv module loads environment variables when running the app locally for debugging. Values are defined in a file named .env and loaded into the current execution context. In production contexts, the server configuration provides these values, which is why this code runs only when the script is not running in a "production" context. (Newer versions of the dotenv package use require('dotenv').config() instead of load().)

const path = require('path');
const storage = require('azure-storage');

The purpose of the modules is as follows:

  • path is required to determine the absolute path of the file to upload to Blob storage
  • azure-storage is the Azure Storage SDK module for Node.js

Next, the blobService variable is initialized as a new instance of the Azure Blob service.

const blobService = storage.createBlobService();

In the following implementation, each of the blobService functions is wrapped in a Promise, which allows them to be used with JavaScript's async functions and the await operator, streamlining the callback-based nature of the Azure Storage API. When a successful response returns for each function, the promise resolves with relevant data along with a message specific to the action.
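If you prefer not to write these wrappers by hand, a minimal alternative (not part of the sample) is to let Node.js generate them with util.promisify. The listContainersAsync name below is hypothetical and reuses the blobService instance created earlier.

const { promisify } = require('util');

// Generate a Promise-returning wrapper from the callback-based API.
// Binding to blobService preserves the "this" context the SDK expects.
const listContainersAsync = promisify(blobService.listContainersSegmented).bind(blobService);

// Resolves with the result object (entries and continuationToken).
listContainersAsync(null).then(data => console.log(`${data.entries.length} containers`));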

List containers

The listContainers function calls listContainersSegmented, which returns the containers in groups (segments).

const listContainers = async () => {
    return new Promise((resolve, reject) => {
        blobService.listContainersSegmented(null, (err, data) => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `${data.entries.length} containers`, containers: data.entries });
            }
        });
    });
};

The size of the groups is configurable via ListContainersOptions. Calling listContainersSegmented returns container metadata as an array of ContainerResult instances. Results are returned in batches (segments) of up to 5,000. If there are more than 5,000 containers in the storage account, then the results include a value for continuationToken. To list subsequent segments, you can pass the continuation token back into listContainersSegmented as the first argument.
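For example, here's a minimal sketch (not part of the sample) that follows the continuation token to collect every container. The listAllContainers helper name is hypothetical and reuses the blobService instance created earlier.

const listAllContainers = () => {
    return new Promise((resolve, reject) => {
        const entries = [];
        const onSegment = (err, data) => {
            if (err) {
                return reject(err);
            }
            entries.push(...data.entries);
            if (data.continuationToken) {
                // More segments remain; pass the token back as the first argument.
                blobService.listContainersSegmented(data.continuationToken, onSegment);
            } else {
                resolve(entries);
            }
        };
        blobService.listContainersSegmented(null, onSegment);
    });
};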

Create a container

The createContainer function calls createContainerIfNotExists and sets the appropriate access level for the blob.

const createContainer = async (containerName) => {
    return new Promise((resolve, reject) => {
        blobService.createContainerIfNotExists(containerName, { publicAccessLevel: 'blob' }, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Container '${containerName}' created` });
            }
        });
    });
};

The second parameter (options) for createContainerIfNotExists accepts a value for publicAccessLevel. The value blob for publicAccessLevel specifies that the data of individual blobs is exposed to the public. This setting is in contrast to container-level access, which grants the ability to list the contents of the container.

The use of createContainerIfNotExists allows the application to run the createContainer command multiple times without returning errors when the container already exists. In a production environment, you often only call createContainerIfNotExists once as the same container is used throughout the application. In these cases, you can create the container ahead of time through the portal or via the Azure CLI.
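For instance, a container can be created ahead of time with a single Azure CLI command similar to the following, assuming you're signed in with az login and substitute your own account name (you may also need to supply --account-key or a connection string):

az storage container create --name demo --account-name <your-account-name> --public-access blob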

Upload text

The uploadString function calls createBlockBlobFromText to write (or overwrite) an arbitrary string to the blob container.

const uploadString = async (containerName, blobName, text) => {
    return new Promise((resolve, reject) => {
        blobService.createBlockBlobFromText(containerName, blobName, text, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Text "${text}" is written to blob storage` });
            }
        });
    });
};

Upload a local file

The uploadLocalFile function uses createBlockBlobFromLocalFile to upload and write (or overwrite) a file from the file system into blob storage.

const uploadLocalFile = async (containerName, filePath) => {
    return new Promise((resolve, reject) => {
        const fullPath = path.resolve(filePath);
        const blobName = path.basename(filePath);
        blobService.createBlockBlobFromLocalFile(containerName, blobName, fullPath, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Local file "${filePath}" is uploaded` });
            }
        });
    });
};

Other approaches available to upload content into blobs include working with text and streams. To verify the file is uploaded to your blob storage, you can use the Azure Storage Explorer to view the data in your account.
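As a rough sketch of the stream-based approach (not part of the sample, and assuming the blobService and path objects created earlier), you can pipe a read stream into a writable blob stream. The uploadStream helper name is hypothetical.

const fs = require('fs');

const uploadStream = (containerName, filePath) => {
    return new Promise((resolve, reject) => {
        const blobName = path.basename(filePath);
        // createWriteStreamToBlockBlob returns a writable stream backed by the blob.
        const writeStream = blobService.createWriteStreamToBlockBlob(containerName, blobName, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Stream uploaded to blob "${blobName}"` });
            }
        });
        fs.createReadStream(filePath).pipe(writeStream);
    });
};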

List the blobs

The listBlobs function calls the listBlobsSegmented method to return a list of blob metadata in a container.

const listBlobs = async (containerName) => {
    return new Promise((resolve, reject) => {
        blobService.listBlobsSegmented(containerName, null, (err, data) => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `${data.entries.length} blobs in '${containerName}'`, blobs: data.entries });
            }
        });
    });
};

Calling listBlobsSegmented returns blob metadata as an array of BlobResult instances. Results are returned in batches (segments) of up to 5,000. If there are more than 5,000 blobs in a container, then the results include a value for continuationToken. To list subsequent segments from the container, you can pass the continuation token back into listBlobsSegmented as the second argument.
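For illustration (not part of the sample), the segment size can also be reduced via the options parameter, and the next page requested with the token. The listBlobsPage helper name is hypothetical.

// Fetch one page of up to 10 blobs, starting from an optional continuation token.
const listBlobsPage = (containerName, currentToken) => {
    return new Promise((resolve, reject) => {
        blobService.listBlobsSegmented(containerName, currentToken, { maxResults: 10 }, (err, data) => {
            if (err) {
                reject(err);
            } else {
                // data.continuationToken is set when more blobs remain.
                resolve({ blobs: data.entries, nextToken: data.continuationToken });
            }
        });
    });
};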

Download a blob

The downloadBlob function uses getBlobToText to download the contents of the blob and return it as a string.

const downloadBlob = async (containerName, blobName) => {
    return new Promise((resolve, reject) => {
        blobService.getBlobToText(containerName, blobName, (err, data) => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Blob downloaded "${data}"`, text: data });
            }
        });
    });
};

The implementation shown here returns the contents of the blob as a string. You can also download the blob as a stream, or directly to a local file.
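As a minimal sketch of the file-based variant (not part of the sample, and reusing the blobService instance from earlier), the hypothetical downloadBlobToFile helper writes the blob straight to disk.

const downloadBlobToFile = (containerName, blobName, downloadFilePath) => {
    return new Promise((resolve, reject) => {
        blobService.getBlobToLocalFile(containerName, blobName, downloadFilePath, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Blob downloaded to "${downloadFilePath}"` });
            }
        });
    });
};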

Delete a blob

The deleteBlob function calls the deleteBlobIfExists function. As the name implies, this function does not return an error if the blob is already deleted.

const deleteBlob = async (containerName, blobName) => {
    return new Promise((resolve, reject) => {
        blobService.deleteBlobIfExists(containerName, blobName, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Block blob '${blobName}' deleted` });
            }
        });
    });
};

Delete a container

Containers are deleted by calling the deleteContainer method on the blob service and passing in the container name.

const deleteContainer = async (containerName) => {
    return new Promise((resolve, reject) => {
        blobService.deleteContainer(containerName, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Container '${containerName}' deleted` });
            }
        });
    });
};

Calling code

To support JavaScript's async/await syntax, all the calling code is wrapped in a function named execute. Then execute is called and handled as a promise.

async function execute() {
	// commands 
}

execute().then(() => console.log("Done")).catch((e) => console.log(e));

All of the following code runs inside the execute function where the // commands comment is placed.

First, the relevant variables are declared to assign names, sample content, and the path to the local file to upload to Blob storage.

const containerName = "demo";
const blobName = "quickstart.txt";
const content = "hello Node SDK";
const localFilePath = "./readme.md";
let response;

To list the containers in the storage account, the listContainers function is called and the returned list of containers is logged to the output window.

console.log("Containers:");
response = await listContainers();
response.containers.forEach((container) => console.log(` -  ${container.name}`));

Once the list of containers is available, you can use the Array findIndex method to see whether the container you want to create already exists. If the container does not exist, it is created.

const containerDoesNotExist = response.containers.findIndex((container) => container.name === containerName) === -1;

if (containerDoesNotExist) {
    await createContainer(containerName);
    console.log(`Container "${containerName}" is created`);
}
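Alternatively (not part of the sample), the service can be asked directly whether a container exists instead of scanning the full list. The containerExists helper name is hypothetical and reuses the blobService instance from earlier.

const containerExists = (containerName) => {
    return new Promise((resolve, reject) => {
        blobService.doesContainerExist(containerName, (err, result) => {
            if (err) {
                reject(err);
            } else {
                // result.exists is true when the container is already present.
                resolve(result.exists);
            }
        });
    });
};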

Next, a string and a local file are uploaded to Blob storage.

await uploadString(containerName, blobName, content);
console.log(`Blob "${blobName}" is uploaded`);

response = await uploadLocalFile(containerName, localFilePath);
console.log(response.message);

The process of listing blobs is the same as listing containers. The call to listBlobs returns an array of blobs in the container, which are then logged to the output window.

console.log(`Blobs in "${containerName}" container:`);
response = await listBlobs(containerName);
response.blobs.forEach((blob) => console.log(` - ${blob.name}`));

To download a blob, the response is captured and used to access the value of the blob. The text property of the response is then logged to the output window.

response = await downloadBlob(containerName, blobName);
console.log(`Downloaded blob content: "${response.text}"`);

Finally, the blob and container are deleted from the storage account.

await deleteBlob(containerName, blobName);
console.log(`Blob "${blobName}" is deleted`);

await deleteContainer(containerName);
console.log(`Container "${containerName}" is deleted`);

Clean up resources

All data written to the storage account is automatically deleted at the end of the code sample.

Resources for developing Node.js applications with blobs

See these additional resources for Node.js development with Blob storage:

Binaries and source code

Client library reference and samples

Next steps

This article demonstrates how to upload, download, and list blobs in Azure Blob storage using Node.js. To learn more about working with Blob storage, continue to the GitHub repository.

[!div class="nextstepaction"] Azure Storage SDK for JavaScript repository