Windows Azure Storage: Introducing CORS


We recently released CORS (Cross-Origin Resource Sharing) for Windows Azure Storage. CORS is supported for the Blob, Table and Queue services and can be enabled for each service through the Windows Azure Storage Client Library 3.0. In this blog, we will explain CORS and how Windows Azure Storage supports it, with an example. The complete sample code can be found here.

What is CORS

Web Browsers, also referred to as User Agents, commonly apply a same-origin policy to network requests. For security reasons, this policy prevents client-side web applications running in one domain from issuing requests to another domain. As an example, JavaScript code loaded as part of http://www.contoso.com cannot issue requests at will to any other domain, such as http://www.northwindtraders.com, because the Web Browser will prevent such requests from executing.

CORS is a mechanism that relaxes this restriction and allows domains to grant each other permission to access each other’s resources. The User Agent sends extra headers to ensure that JavaScript code loaded from one domain, say www.contoso.com, is allowed to access resources located at www.northwindtraders.com. The latter domain can then reply with extra headers allowing or denying www.contoso.com access to its resources.

CORS extends the HTTP specification; more details can be found at http://www.w3.org/TR/cors/.

Why CORS in Windows Azure Storage

In the absence of CORS, websites that use the Windows Azure Storage service need to act as a proxy for storage calls. All requests to storage must be proxied through the service, so the service has to scale out its servers whenever load increases. As an example, assume your http://www.contoso.com website has a feature where users can upload blobs, and you want to persist them in your Windows Azure Storage account contoso.blob.core.windows.net. In this case, the upload has to go from the browser to your service, which in turn PUTs the data into the storage account. This means you need to scale out your service whenever there is an increase in your website's upload traffic.

With CORS, you can eliminate the need to have your service in the upload path. In this case, you would enable CORS on your contoso.blob.core.windows.net storage account and allow access from contoso.com. Your JavaScript code can then upload blobs directly to your storage account using shared access signatures, without a proxy service. You therefore benefit from the massive scale of the Windows Azure Storage service without needing to scale out a service to handle increases in upload traffic to your website.

Please note that CORS is not an authentication mechanism. Any request made against a storage resource when CORS is enabled must either have a proper authentication signature or be made against a public resource.

For more details on CORS for Windows Azure Storage, please refer to the following MSDN documentation http://msdn.microsoft.com/en-us/library/windowsazure/dn535601.aspx.

How Does CORS Work

Typically, CORS is accomplished in two phases: a preflight request (an OPTIONS request), followed by the actual request (GET, PUT, POST, etc.).

In some cases, User Agents will not issue a preflight request; simple requests issued with simple methods such as GET are dispatched directly.

Preflight Request

A preflight request is a mechanism to query the CORS restrictions imposed by the service. The User Agent issues such a request whenever JavaScript code attempts to send a request to another domain. The preflight request is an OPTIONS request that carries the actual request headers, the intended HTTP method and the originating domain. The recipient service then evaluates the intended operation against a pre-configured set of rules that impose restrictions on the origin, requested headers, methods, etc., and either accepts or rejects the request. In the case of Windows Azure Storage, CORS rules are configured per service (Blob, Table or Queue), and preflight requests are evaluated against those service-level rules rather than against the individual resource.

Preflight requests are typically cached by the browser, so subsequent requests skip this step and the actual request is dispatched directly, as long as the cached preflight request had succeeded. The TTL (time to live) of this cache is configurable.

Actual Request

The actual request is the request that the client intended to make. The User Agent adds an additional header, Origin, to such requests.

Successful CORS requests have valid Access-Control response headers. If no Access-Control headers are returned, the browser prevents the request issuer from seeing the response. Note that the request is still processed by the service regardless of whether the Origin header matches a configured rule; the Web Browser decides what part, if any, of the response is exposed to the JavaScript code based on the Access-Control response header values.

There are some scenarios where the OPTIONS request is not required and therefore is not issued by the Web Browser. Instead, the actual request is dispatched directly; such requests are referred to as simple requests (see the sketch below).
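To make the flow concrete, below is a minimal browser-side sketch of both cases (the SAS URL and payload are hypothetical, not taken from the sample). Note that the JavaScript never sets any CORS headers itself; the Web Browser adds and checks them:

// Hypothetical blob SAS URL in another domain than the page issuing the requests.
var blobUrl = 'https://contoso.blob.core.windows.net/images/photo.jpg?<sas-token>';

// A plain GET with no custom headers is a "simple request": the browser dispatches
// it directly, adding only an Origin header (e.g. Origin: http://www.contoso.com),
// and then checks the Access-Control-Allow-Origin header on the response.
var getRequest = new XMLHttpRequest();
getRequest.open('GET', blobUrl, true);
getRequest.send();

// A PUT with custom headers (such as x-ms-blob-type) is not a simple request: the
// browser first issues an OPTIONS preflight carrying Origin,
// Access-Control-Request-Method: PUT and Access-Control-Request-Headers, and only
// dispatches the PUT if the service's CORS rules allow it.
var putRequest = new XMLHttpRequest();
putRequest.open('PUT', blobUrl, true);
putRequest.setRequestHeader('x-ms-blob-type', 'BlockBlob');
putRequest.send('example content');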

CORS Usage scenarios for Windows Azure Storage

As mentioned, a typical usage of CORS for Blobs is to allow file upload from the Web Browser directly against a Windows Azure Storage account. In this case, you can use CORS along with the SAS (Shared Access Signature) authentication mechanism to grant the Web Browser write privileges to your storage account. Whenever a user is ready to upload, the JavaScript code requests a blob SAS URL from your service and then performs a Put Blob request against storage. It is recommended that you limit the SAS token both in time (to the duration actually needed) and in scope (to the specific container and/or blob being uploaded) in order to limit any security risks. Best practices for SAS usage can be found here.

Another usage of CORS, for Windows Azure Tables, is to display table data and possibly manipulate that data from the browser. A typical scenario is an admin page where a user can change configuration settings for your website that you persist in a Windows Azure Table. Here, you can use CORS, SAS and JSON to implement such a capability, as shown in the code sample. JSON is a natural fit in this case, since you can use jQuery in JavaScript to issue Windows Azure Table requests and manipulate the data.
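As a rough illustration (this is not the sample's exact code), the snippet below sketches how such an admin page could query configuration entities with jQuery, assuming tableSasUrl is a Table SAS URL that your own service handed to the browser and that the filter matches your own key scheme:

// Query entities from a Windows Azure Table directly from the browser.
// tableSasUrl: a Table SAS URL (table URI plus SAS token) obtained from your own service.
function queryConfigurationEntities(tableSasUrl) {
    $.ajax({
        type: 'GET',
        // The filter is illustrative; adjust it to your own partition/row key scheme
        url: tableSasUrl + "&$filter=" + encodeURIComponent("PartitionKey eq 'config'"),
        headers: {
            // Request a JSON response (supported with storage version 2013-08-15)
            'Accept': 'application/json;odata=nometadata'
        },
        success: function (data) {
            // data.value contains the returned entities as plain JSON objects
            $.each(data.value, function (index, entity) {
                console.log(entity.RowKey, entity);
            });
        },
        error: function (xhr) {
            alert("Can't query the table: " + xhr.status);
        }
    });
}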

We have provided sample code that demonstrates JavaScript code for Blob upload and listing, as well as Insert Entity and Query Entities operations against Windows Azure Tables using jQuery.

How to Enable CORS in Windows Azure Storage

To take advantage of CORS, you first need to enable and configure it on each desired service (Blob, Table, or Queue) separately. You have to use the latest storage version, “2013-08-15”, to set CORS rules, either through the REST APIs or through the Windows Azure Storage Client (.NET, Java, C++). Once enabled, you can use any other storage version to issue REST requests against your storage account, such as Put Blob, Query Entities, etc., while staying compliant with CORS.

The MSDN documentation here explains in detail how to configure CORS rules on your account via the REST APIs Set Blob Service Properties (REST API), Set Queue Service Properties (REST API), and Set Table Service Properties (REST API).

In this section, we briefly explain the different CORS rule configuration options and show how to use the Windows Azure Storage Client Library 3.0 for .NET to enable and configure CORS.

You can configure up to 5 CORS rules for each storage service (i.e. Blob, Table or Queue). An example of a single CORS rule is shown below:

<CorsRule>
  <AllowedOrigins>http://www.contoso.com,http://www.fabrikam.com</AllowedOrigins>
  <AllowedMethods>PUT,GET</AllowedMethods>
  <AllowedHeaders>x-ms-meta-data*,x-ms-meta-target,x-ms-meta-source</AllowedHeaders>
  <ExposedHeaders>x-ms-meta-*</ExposedHeaders>
  <MaxAgeInSeconds>200</MaxAgeInSeconds>
</CorsRule>

For a detailed explanation of each of the xml elements, please refer to this MSDN documentation.

The above rule permits CORS requests that satisfy the following conditions:

  • Originating from either http://www.contoso.com or http://www.fabrikam.com (AllowedOrigins)
  • And have an HTTP request method of either PUT or GET (AllowedMethods)

Note that you can set both AllowedOrigins and AllowedMethods to *, which means the service will send the required CORS headers for all methods (PUT, POST, GET, etc.) when invoked from any origin; this is the most typical usage. This does not mean that anyone can access your account: requests still need a proper authentication signature, as mentioned before.

AllowedHeaders lets you configure which request headers the client (JavaScript code) can send to your storage account. This is enforced by the Web Browser, which learns about the allowed headers through either the preflight request or the actual request, where the Windows Azure Storage service returns the list as part of the Access-Control-Allow-Headers response header. Assume that you don’t want the client to store arbitrary metadata when calls are issued from the browser, and in particular you want to prevent it from storing any information under the x-ms-meta-reserved key. With the rule above, you would be limiting metadata to key names that either start with x-ms-meta-data or exactly match x-ms-meta-target or x-ms-meta-source. Typically, you would configure this to *, i.e. allow all request headers.
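For example, with the rule above, browser-side upload code could only attach metadata headers covered by the AllowedHeaders list. This is a minimal sketch; the function name and header values are hypothetical:

// Headers that match an AllowedHeaders entry are passed through by the browser.
function uploadWithMetadata(blobSasUrl, fileDataAsArrayBuffer) {
    var request = new XMLHttpRequest();
    request.open('PUT', blobSasUrl, true);
    request.setRequestHeader('x-ms-meta-source', 'browser-upload'); // allowed: exactly matches x-ms-meta-source
    request.setRequestHeader('x-ms-meta-data-origin', 'contoso');   // allowed: matches the x-ms-meta-data* prefix
    // request.setRequestHeader('x-ms-meta-reserved', 'value');     // not covered by the rule; the browser would fail the request
    // Any other non-simple header the upload needs (e.g. x-ms-blob-type) would also
    // have to appear in AllowedHeaders, which is why * is the typical setting.
    request.send(fileDataAsArrayBuffer);
}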

ExposedHeaders indicates which response headers can be exposed to the client JavaScript code. This is also enforced by the Web Browser, which learns about these headers through the actual request, where the Windows Azure Storage service returns the list as part of the Access-Control-Expose-Headers response header. Typically, you would configure this to *, i.e. allow all response headers.

MaxAgeInSeconds indicates the maximum amount of time that a browser can cache the preflight OPTIONS request. In the example above, it is set to 200 seconds. If your rules do not change frequently, it is recommended to set MaxAgeInSeconds to a relatively large number, since this increases the responsiveness of your web application by reducing roundtrips and also reduces your storage billing charges, because preflight requests are billed.

Note that CORS rules are evaluated in the order they appear in the XML passed to the Set Service Properties REST API. This means that the more restrictive rules should appear first and any * rule should appear last. For more details on CORS rule evaluation, please refer to the Understanding CORS Rule Evaluation Logic MSDN documentation.

Code Sample for Enabling CORS using Windows Azure Storage Client Library 3.0

Below is a code sample that enables CORS on a Windows Azure Storage account for the Blob and Table services. The code is a snippet from the sample code provided here.

The code first GETs the current service properties, enables CORS with a single rule allowing all origins and the HTTP methods PUT, GET, HEAD and POST, and then commits the modified service properties for both the Blob and Table services.

private static void InitializeCors()
{
    // CORS should be enabled once at service startup.
    // Given the Blob and Table service clients, download the current Service Properties.
    ServiceProperties blobServiceProperties = BlobClient.GetServiceProperties();
    ServiceProperties tableServiceProperties = TableClient.GetServiceProperties();

    // Enable and configure CORS
    ConfigureCors(blobServiceProperties);
    ConfigureCors(tableServiceProperties);

    // Commit the CORS changes to the Service Properties
    BlobClient.SetServiceProperties(blobServiceProperties);
    TableClient.SetServiceProperties(tableServiceProperties);
}

private static void ConfigureCors(ServiceProperties serviceProperties)
{
    serviceProperties.Cors = new CorsProperties();
    serviceProperties.Cors.CorsRules.Add(new CorsRule()
    {
        AllowedHeaders = new List<string>() { "*" },
        AllowedMethods = CorsHttpMethods.Put | CorsHttpMethods.Get | CorsHttpMethods.Head | CorsHttpMethods.Post,
        AllowedOrigins = new List<string>() { "*" },
        ExposedHeaders = new List<string>() { "*" },
        MaxAgeInSeconds = 1800 // 30 minutes
    });
}

Sample JavaScript Code for uploading a Blob to Windows Azure Storage

The sample below is a code snippet that lets you upload a file directly from the Web Browser into Windows Azure Storage using JavaScript. The full code is available here; the snippet is taken from the UploadImage.cshtml file.

// Uploads a blob to Azure Storage by using a Blob SAS URL.
// The Web Browser will add the necessary CORS headers and issue a preflight request if needed.
// blobSasUrl: Blob SAS URL already obtained through an Ajax call to your own service
// fileDataAsArrayBuffer: an ArrayBuffer (byte array) containing the raw data of the file to be uploaded
function uploadImage(blobSasUrl, fileDataAsArrayBuffer) {
    var ajaxRequest = new XMLHttpRequest();

    // Once the image is successfully uploaded, renderImage is called to show the uploaded image
    ajaxRequest.onreadystatechange = function () { return renderImage(ajaxRequest, blobSasUrl); };

    try {
        // Performing a Put Blob (BlockBlob) against storage
        ajaxRequest.open('PUT', blobSasUrl, true);
        ajaxRequest.setRequestHeader('Content-Type', 'image/jpeg');
        ajaxRequest.setRequestHeader('x-ms-blob-type', 'BlockBlob');
        ajaxRequest.send(fileDataAsArrayBuffer);
    }
    catch (e) {
        alert("Can't upload the image to the server.\n" + e.toString());
    }
}
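The renderImage callback referenced above is part of the same sample file and is not reproduced here. A minimal sketch of what such a callback could look like follows; the element id is hypothetical, and it assumes the uploaded blob is readable at its URL (for example through a public container or a read SAS):

// Called on every readystatechange of the upload request.
function renderImage(ajaxRequest, blobSasUrl) {
    if (ajaxRequest.readyState !== 4) {
        return; // upload still in progress
    }
    if (ajaxRequest.status === 201) {
        // Put Blob returns 201 Created; strip the SAS query string to get the blob URL
        var blobUrl = blobSasUrl.split('?')[0];
        $('#uploadedImages').append($('<img/>', { src: blobUrl }));
    } else {
        alert("Upload failed with status " + ajaxRequest.status);
    }
}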

The code first calls into your ASP.NET service to retrieve the Blob SAS URL. Below is the code that does this and eventually calls into the uploadImage method above: it first loads the selected files, then retrieves the Blob SAS URL, and finally calls uploadImage to upload the image into Azure Storage.

// This method is called after the user selects the images to upload.
// It loops over all selected files and uploads them one by one into Azure Storage.
function handleFileSelect(evt) {
    var files = evt.target.files; // all selected files (FileList object)

    // Loop through the FileList, save the files and render image files as thumbnails.
    for (var i = 0, file; file = files[i]; i++) {
        // Create a reader that will read the entire file data as an ArrayBuffer
        var reader = new FileReader();

        // AJAX URL that will be used to request an Azure Blob SAS URL
        file.getSasUri = "/Home/GetBlobSasUrl" + "?blobName=" + file.name;

        reader.onloadend = (function (theFile) {
            return function (e) {
                // Once the reader is done reading the file bytes,
                // we issue an AJAX call against our service to get the SAS URL
                $.ajax({
                    type: 'GET',
                    url: theFile.getSasUri,
                    success: function (res, status, xhr) {
                        // GetBlobSasUrl generated the SAS for the required blob
                        var blobSasUrl = xhr.responseText;
                        // Now we have a SAS URL that we will use to upload the image.
                        // Pass in the SAS URL and the ArrayBuffer to be uploaded.
                        uploadImage(blobSasUrl, e.target.result);
                    },
                    error: function (res, status, xhr) {
                        alert("Can't get the SAS from the server.");
                    }
                });
            };
        })(file);

        // Read in the image file as an ArrayBuffer. Once done, the reader.onloadend event is raised.
        reader.readAsArrayBuffer(file);
    }
}

Sample jQuery Code for accessing Windows Azure Tables

Please refer to the sample code provided here for the jQuery Insert Entity and Query Entities code. A good starting point is the InsertTableEntities.cshtml and QueryTableEntities.cshtml files.

CORS Sample Code

The CORS code sample can be found here. Please ensure that you have the latest version of the NuGet Package Manager installed in order to get access to the referenced libraries.

By default, the sample is configured to use the Windows Azure Storage Emulator, which can be downloaded from here. You can use your own Windows Azure Storage account by configuring AzureCommon.StorageAccount in the AzureCommon.cs file.

Summary

To summarize, support for CORS in Windows Azure Storage allows application developers to use scripts in their web pages to communicate directly with Windows Azure Storage resources (blobs, tables and queues) without having to proxy the calls through their own web service. The Windows Azure Storage Client Library makes it simple to configure CORS.

Wael Abdelghani and Jean Ghanem


Resources

CORS Sample code on MSDN

Windows Azure Storage Release - Introducing CORS, JSON, Minute Metrics, and More

2013-08-15 Storage REST version release notes

Cross-Origin Resource Sharing (CORS) Support for the Windows Azure Storage Services

Windows Azure Storage Libraries



Comments (6)

  1. Gangsir says:

    Great introduction of CORS

  2. Ahmet Alp Balkan says:

    Finally, thanks! Looking forward to see index.html support for containers!

  3. Sam K says:

    While the CORS functionality of the blob store itself works fine, and it adds the correct Access-Control and Vary: Origin headers in the response, there are still some issues that don't seem to be addressed.

    * Adding CORS policy seems to only affect files created AFTER the policy was added. Any existing files will behave as if CORS does not exist.

    * The Azure CDN which is running off a blob store will strip out the Vary: Origin and Access-Control headers entirely. It'll serve the same response whatever the origin is. This makes the CDN unable to serve static website files, namely fonts, as browsers block them by default (ironically a policy Microsoft adopted first in IE).

    Is there a team member I can make contact with to discuss these limitations, I have a test case that shows how the CDN doesn't work.

  4. jaidevh1@hotmail.com says:

    @sam,

    CORS should impact all files – even old ones too. Are you sure it is not the caching layer serving the cached content since TTL is set to large number?

    About the CDN, we can work with Azure CDN team to figure this one out. Can you please email jharidas at Microsoft dot com with details?

    Thanks,

    Jai

  5. Sam K says:

    @Jai,

    yes you're right the CDN does cache the response for old files. I've written to you separately about the CDN stripping out the CORS headers.

    Thanks.

  6. olympus@huge.it says:

    so is it possible to host fonts on Azure CDN? I tried, but still seems that CORS doesn't work…