Microsoft Azure Storage Client Module for Node.js v. 0.2.0

The Microsoft Azure Storage team is pleased to announce the CTP release of our first dedicated client library module for Node.js. This new release incorporates significant improvements over the existing node module and will be the Storage client module that is actively developed and supported moving forward. It contains many improvements in the programming model and architecture, and includes support for the 2014-02-14 REST Storage service version. The list of improvements is long, but in summary the 0.2 release includes: support for RA-GRS, expanded Table Payload Protocol (JSON) support, Table Insert optimization and query helpers, Shared Access Signature support for all services, and a maximum client execution timeout across retries.

This is currently a CTP release, meaning that this is the best time to give feedback. The source code is available via Github (note the updated location). You can download the package and its dependencies from npm using the following:

npm install azure-storage

 

Samples

For information on setting up the Node.js environment for development, see the Resources section at the bottom of this document. We have also provided a series of Getting Started articles on the Azure Node.js developer center (Blobs, Tables, Queues) and samples on Github to help clients get up and running with Azure Storage and to illustrate some key scenarios.

To run a given sample or test, the connection information for the storage account needs to be set up. This can be provided using:

  • Environment variables - AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_ACCESS_KEY, or AZURE_STORAGE_CONNECTION_STRING.
  • Constructors - For example, var tableSvc = azure.createTableService(accountName, accountKey);

To use a proxy such as Fiddler, an additional environment variable should be set:

HTTP_PROXY

Depending on the platform being used, these variables can be set as follows:

  • Windows (in a command prompt) -

set AZURE_STORAGE_CONNECTION_STRING=DefaultEndpointsProtocol=http;AccountName=<account name>;AccountKey=<account key>
set HTTP_PROXY=http://127.0.0.1:8888

  • Linux (in a shell) -

export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=http;AccountName=<account name>;AccountKey=<account key>"
export HTTP_PROXY=http://127.0.0.1:8888

  • Mac (edit the .bashrc or .bash_profile file and add the following lines) -

export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=http;AccountName=<account name>;AccountKey=<account key>"
export HTTP_PROXY=http://127.0.0.1:8888

Users developing against the Storage Emulator should make sure they are using Emulator version 3.2 or later, which supports the 2014-02-14 REST version, and should set the environment variable EMULATED to true.
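
For example, in a Windows command prompt:

set EMULATED=true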

The unit tests can be run from the module's root directory using:

npm test

 

A Note about Packaging

We have moved the Storage module out of the Microsoft Azure SDK for Node into its own module. Developers using the existing module will need to update their dependencies accordingly: the previous storage implementation (azure-storage-legacy) will be deprecated in the future, and the Microsoft Azure SDK for Node will include this module instead. A migration guide for users moving from the existing module will be made available in the future. To begin using the storage module, install 'azure-storage' via npm and require it in your .js file:

var azure = require('azure-storage');

 

What’s New

This Storage Client Module provides support for the 2014-02-14 REST service version, as well as key client side improvements across Blobs, Queues and Tables listed below.

 

Support for Read Access Geo Redundant Storage

This release has full support for Read Access Geo-Redundant Storage (RA-GRS), which allows reading the storage account's data from the secondary region. This functionality needs to be enabled via the portal for a given storage account.

Applications may retrieve the secondary replication status and last sync time for the [Blob|Table|Queue]Service using getServiceStats. Setting the location mode on the service object and invoking getServiceStats is shown in the example below. The locationMode can also be configured on a per request basis by setting it on the options object (options.locationMode). By default, the location mode is set to PRIMARY_ONLY.

var azure = require('azure-storage');
var blobService = azure.createBlobService();
// getServiceStats is supported only on secondary endpoints.
blobService.defaultLocationMode = azure.StorageUtilities.LocationMode.SECONDARY_ONLY;
blobService.getServiceStats(function (error, serviceStats) {
  if (!error) {
    console.log('Replication status = %s and LastSyncTime = %s',
      serviceStats.GeoReplication.Status, serviceStats.GeoReplication.LastSyncTime);
  }
});
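
The location mode can also be supplied per call through the options object instead of as a service-wide default. The following is a minimal sketch of that pattern, reusing getServiceStats purely for illustration:

var azure = require('azure-storage');
var blobService = azure.createBlobService();
// Override the location mode for this call only; the service default is left unchanged.
var options = { locationMode: azure.StorageUtilities.LocationMode.SECONDARY_ONLY };
blobService.getServiceStats(options, function (error, serviceStats) {
  if (!error) {
    console.log('LastSyncTime = %s', serviceStats.GeoReplication.LastSyncTime);
  }
});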

 

Expanded Table Protocol Support (JSON)

This release adds support for JSON payloads on the table service. Instead of AtomPub, the default protocol is now JSON minimal metadata. (You can read more details regarding these protocols as well as view sample payloads here.) This change dramatically reduces the payload size of the request and the CPU required to process it, allowing applications to scale higher and realize lower overall latencies for table operations. The AtomPub protocol is no longer supported by the client. Setting the payloadFormat on the service object is shown in the example below. The payloadFormat can also be configured on a per request basis by setting it on the options object.

var azure = require('azure-storage');
var tableService = azure.createTableService();
// Set the payload format to JsonNoMetadata.
tableService.defaultPayloadFormat = azure.TableUtilities.PayloadFormat.NO_METADATA;
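
The payload format can likewise be overridden for a single request through the options object. The sketch below assumes a hypothetical table named 'mytable' and uses FULL_METADATA purely for illustration:

// Request full metadata for this query only; the service default remains JSON minimal metadata.
var options = { payloadFormat: azure.TableUtilities.PayloadFormat.FULL_METADATA };
var query = new azure.TableQuery().top(5);
tableService.queryEntities('mytable', query, null, options, function (error, result) {
  if (!error) {
    console.log('Retrieved %d entities', result.entries.length);
  }
});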

 

In some scenarios, notably when using JsonNoMetadata, clients may wish to provide the property type information at runtime for entities returned by queries. To support this scenario, the user may implement a property resolver which returns the EdmType for each property. The sample below illustrates a propertyResolver implementation.

var propertyResolver = function (pk, rk, name, value) {
  if (name.indexOf('BinaryField') !== -1) {
    return 'Edm.Binary';
  } else if (name.indexOf('GuidField') !== -1) {
    return 'Edm.Guid';
  } else if (name.indexOf('DateField') !== -1) {
    return 'Edm.DateTime';
  } else if (name.indexOf('DoubleField') !== -1) {
    return 'Edm.Double';
  }
  return 'Edm.String';
};

 

This propertyResolver is set on the options object as shown below.

options.propertyResolver = propertyResolver; 
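
For instance, the resolver can be passed along with a query, as in the following minimal sketch; the table name 'mytable' is an illustrative assumption and tableService is the service object created earlier:

var options = { propertyResolver: propertyResolver };
// The resolver is consulted for each property returned by the query, so typed
// values (Edm.Binary, Edm.Guid, etc.) are reconstructed on the client.
tableService.queryEntities('mytable', new azure.TableQuery(), null, options, function (error, result) {
  if (!error) {
    console.log('Retrieved %d entities', result.entries.length);
  }
});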

Note that due to JavaScript’s inability to differentiate between an int and a double whose decimal part is 0 (5 versus 5.0) while parsing numbers, both of these are returned without the decimal component and a type is not specified. To preserve the type information while round tripping data from the server, please use a property resolver as described above to differentiate between these.

For more information about table queries, please check out the documentation in tableservice.js.

 

 

Table Insert Optimizations

In previous versions, the service would "echo" back the entity content in the response body. With this release, all Table Insert operations, including those executed as part of a batch, send the Prefer: return-no-content header to avoid this behavior. This optimization can dramatically reduce latencies for insert operations. Please note that as a result the HTTP status code on the insertEntityResponse for successful inserts will be 204 (No Content) rather than 201 (Created), and the entity is not returned in the insertEntityResult object. The echo behavior can be re-enabled by setting echoContent to true in the options object as shown below:

options.echoContent = true;
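
The following minimal sketch shows an insert with echo re-enabled; the table name 'mytable' is an illustrative assumption and entity is an entity descriptor such as the one shown in the Table Entity Creation Helpers section below:

// With echoContent set to true the inserted entity is returned in the result
// and the response status code is 201 (Created) rather than 204 (No Content).
var options = { echoContent: true };
tableService.insertEntity('mytable', entity, options, function (error, result, response) {
  if (!error) {
    console.log('Insert returned HTTP %d', response.statusCode);
  }
});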

 

Table Query Helpers

TableQuery provides helper methods to create filter strings to use with the where clause for a query. Query strings may include type specifiers where necessary. For example, to query on a long value, users could do the following:

var tableQuery = new azure.TableQuery().where(azure.TableQuery.int64Filter('Int64Field', azure.TableUtilities.QueryComparisons.EQUAL, '4294967296'));

or

var tableQuery = new azure.TableQuery().where('Int64Field == ?int64?', '4294967296');
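
Either form produces the same filter string. A minimal sketch of executing the query, assuming a hypothetical table named 'mytable' and the tableService object from the earlier examples:

tableService.queryEntities('mytable', tableQuery, null, function (error, result) {
  if (!error) {
    result.entries.forEach(function (entity) {
      // Returned properties use the { _: value, $: type } shape.
      console.log('PartitionKey=%s RowKey=%s', entity.PartitionKey._, entity.RowKey._);
    });
  }
});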

 

Table Entity Creation Helpers

We have added entity-property-creation helper methods to aid users. Instead of using the more verbose '_' and '$' format, users can use the entityGenerator helper provided in TableUtilities and create entity properties as follows:

 
var entGen = azure.TableUtilities.entityGenerator;
var entity = {
  PartitionKey: entGen.String('part2'),
  RowKey: entGen.String('row1'),
  boolValueTrue: entGen.Boolean(true),
  intValue: entGen.Int32(42),
  dateValue: entGen.DateTime(new Date(Date.UTC(2011, 10, 25)))
};
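
A minimal usage sketch, inserting this entity into a hypothetical table named 'mytable':

tableService.insertEntity('mytable', entity, function (error, result, response) {
  if (!error) {
    console.log('Entity inserted with status %d', response.statusCode);
  }
});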

 

We have also removed one additional layer of addressing when adding type information to an entity. A typed property can now be specified using:

 

DateValue: {
  _: dateVal,
  $: 'Edm.DateTime'
}

 

Other notable new features

We have expanded the API surface to add additional conveniences, including:

  • Shared Access Signature support for Containers, Queues and Tables (a brief container SAS sketch follows this list)
  • Anonymous Access support for Blobs
  • Leasing support for Containers
  • Ability to set the Maximum Execution Timeout across retries for every API
  • MD5 validation for blob upload/download
  • Table Batch re-design to construct and execute Table Batch requests instead of turning the service batch mode on/off
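
As an example of the first item above, the following is a minimal sketch of generating a read-only Shared Access Signature for a container; the container name, the one-hour expiry window, and the use of getUrl to build the full URL are illustrative assumptions:

var azure = require('azure-storage');
var blobService = azure.createBlobService();

// Build an access policy granting read permission for roughly the next hour.
var startDate = new Date();
var expiryDate = new Date(startDate);
expiryDate.setMinutes(startDate.getMinutes() + 60);

var sharedAccessPolicy = {
  AccessPolicy: {
    Permissions: azure.BlobUtilities.SharedAccessPermissions.READ,
    Start: startDate,
    Expiry: expiryDate
  }
};

// 'mycontainer' is a placeholder container name; pass null for the blob name
// to scope the signature to the container.
var sasToken = blobService.generateSharedAccessSignature('mycontainer', null, sharedAccessPolicy);
var sasUrl = blobService.getUrl('mycontainer', null, sasToken);
console.log('Container SAS URL: %s', sasUrl);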

 

ChangeLog

This release includes several client side changes and features other than the notable ones mentioned above. You can view the complete ChangeLog on Github.

 

What’s next

We are adding support for Files in an upcoming release. You can find more information about the File service support in Azure Storage and sign-up details here.

 

Summary

We are continuously making improvements to the developer experience for Azure Storage and very much value your feedback in the comments section below, the MSDN or StackOverflow forums, or GitHub. If you hit any issues, filing them on GitHub will also allow you to track the resolution.

Veena Udayabhanu, Emily Gerner, Adam Sorrin, and Vinay Shah

 

Resources

Source (Github)

npm

How to Use the Blob Service from Node.js

How to Use the Queue Service from Node.js

How to Use the Table Service from Node.js

Azure Storage Release - Introducing CORS, JSON, Minute Metrics, and More

Azure Tables: Introducing JSON

Azure Storage Redundancy Options and Read Access Geo Redundant Storage

Introduction to Node.js

Install Node.js

Sublime Text Environment

Debugging Node.js applications using node-inspector

