The Azure File Service

At TechEd 2014 in Houston, Microsoft announced preview availability of a much-anticipated feature, Azure Files. It is a platform capability to easily expose SMB file shares that multiple instances can use to read and write files, without having to build the infrastructure yourself (as I described in my blog post about building file shares using DFS). It is a native Azure service built on top of the same architecture as the other storage services (blobs, tables, queues), so it offers the same characteristics in terms of (geo-)redundancy, high availability and scalability. Apart from providing an SMB 2.1 interface (which is only accessible to Azure VMs hosted in the same geographic region, or from on-premises environments via a VPN connection), shares in Azure Files can also be accessed remotely via REST from anywhere in the world, provided that a client knows the storage account credentials.

This post gives an introduction to getting started with Azure Files and shows how to automatically mount a share when provisioning a new Windows VM, using the new Custom Script configuration extension (also announced at TechEd).

Getting Started

The first thing you need to do as the owner of an Azure subscription is to sign up for the Azure Files preview. Once you receive approval via email, you will have to create a new storage account, as the feature is currently only available for new accounts. The dashboard view of that storage account in the management portal should show the newly provisioned service:

File Service

Before we can actually create a share, we have to make sure the right PowerShell environment is in place. To do that, check the version of the Azure PowerShell cmdlets and update your installation if necessary. At the time of this writing, the cmdlets for Azure Files were not integrated into the main Azure PowerShell distribution, so you might have to download them separately from here.

The download is a ZIP file that you should save and unpack to a local directory. Do not store the content in C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\ServiceManagement\Azure (i.e. the default directory of the Azure PowerShell installation), as this will result in versioning issues. In our example, let's say you extract the files to c:\AzureFiles.

Next, you will need to unblock Microsoft.WindowsAzure.Commands.Storage.File.dll in that directory:
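If you prefer to do this from PowerShell rather than through the file properties dialog, the built-in Unblock-File cmdlet (available since PowerShell 3.0) can clear the download marker on everything you extracted; this sketch assumes you unpacked to c:\AzureFiles as above:

```powershell
# Remove the "downloaded from the internet" zone marker from all extracted
# files, so the unsigned preview module can be imported without prompts.
Get-ChildItem c:\AzureFiles -Recurse | Unblock-File
```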


Open a PowerShell console (e.g. Windows PowerShell ISE) as administrator and execute the following statement in order to import the Azure Files cmdlets:

import-module c:\AzureFiles\AzureStorageFile.psd1

Create a Share

After having created the storage account and set up our PowerShell environment we’re good to go. Let’s create an Azure File share in our new storage account:

$storageAccountName = "name"
$storageAccountKey = "key"
$context = New-AzureStorageContext $storageAccountName $storageAccountKey

$shareName = "myshare"
$share = New-AzureStorageShare $shareName -Context $context

Make sure to put in the proper storage account name and access key of your storage account and pick a share name of your choice. With the following statement you can retrieve information about the share:

Get-AzureStorageShare -Name $shareName -Context $context

The result looks like this:


Mount the Share Automatically in a VM

Next, we are going to provision a brand-new Windows VM and leverage the Custom Script extension to mount the share automatically at instance startup.

First, let’s create the PowerShell script that we will use to mount the file share. Open a text editor and save the following content to a new file c:\AzureFiles\ConnectShareStartup.ps1:


param($AccountName, $AccountKey, $ShareName)

$cmd = "net use z: \\$AccountName.file.core.windows.net\$ShareName /u:$AccountName $AccountKey"
$cmd | Set-Content "c:\ProgramData\Microsoft\Windows\Start Menu\Programs\StartUp\ConnectShare.cmd"

Restart-Computer -Force

This script creates a cmd-file in the Windows startup folder, so whenever a user logs on to the machine, the file is executed. It contains a simple net use statement to mount your Azure File share. The script takes the storage account name, key and share name as parameters (provided at provisioning time) and writes the corresponding values into the cmd-file. When mounting Azure File shares this way, the storage account name acts as the username and the access key as the password.
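For reference, mounting the share manually from a VM in the same region boils down to a single net use command; the account name, share name and key below are placeholders you would replace with your own values:

```powershell
# Mount the share as drive z:. The storage account name serves as the
# username and the storage account access key as the password.
net use z: \\mystorageaccount.file.core.windows.net\myshare /u:mystorageaccount <storage-account-key>
```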

So, create a new Windows VM in the portal (version shouldn’t matter) and make sure to deploy it to the same region as your storage account with the file share.
Note: If you deploy to a different region, you will not be able to mount the file share, as SMB traffic is only routed within a datacenter. Using different storage accounts or even different Azure subscriptions within the same region is not an issue.

Now, when coming to the final step of the provisioning wizard in the portal, select the checkboxes for Install the VM Agent and Custom Script. In the Custom Script Configuration section, select the PowerShell script you have created above via the From Local button and enter the following string into the Arguments field:

-AccountName name -AccountKey key -ShareName myshare

Again, use your specific values for storage account name, key and share name. The wizard should look as follows:

VM Config

Finalize the wizard and wait for the VM to come up. Be aware that the script is going to restart the instance. RDP into the VM and open file explorer. You should see the share mounted as drive z: and ready to go. The share will survive reboots and even de-provisioning and re-creation of the VM from the VHD file.


You can use the Azure File share as any regular SMB file share and mount it in multiple VMs for read and write access. Now go ahead, create a folder (e.g. myfolder) and add a file to that folder (e.g. myfile.txt).
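Assuming the share is mounted as drive z: as described above, those two steps can be done right from the VM, for example:

```powershell
# Create a folder and a small text file on the mounted share; both become
# visible to every VM that has mounted the same share.
New-Item -ItemType Directory -Path z:\myfolder
"Hello from Azure Files" | Set-Content z:\myfolder\myfile.txt
```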


Remote Access to the Share

Now, let’s see if we can access the share remotely (i.e. from outside the datacenter) via REST and HTTP. The easiest and quickest way to do that is again PowerShell. The following script will retrieve the share content (note you will need the $context object used above):

$dirName = "myfolder"
$share = Get-AzureStorageShare -Name $shareName -Context $context
Get-AzureStorageFile  -Share $share -Path $dirName


In order to download a file you can use the following statement:

Get-AzureStorageFileContent -ShareName $shareName -Destination "." -Path "myfolder\myfile.txt" -Context $context

You can also upload a local file to the share like this:

Set-AzureStorageFileContent -ShareName $shareName -Source .\sample.txt -Path "myfolder\sample.txt" -Context $context

Final Thoughts

The classical tools for working with Azure storage (Visual Studio, Cerebrata Management Studio, ClumsyLeaf CloudXplorer, etc.) do not yet support Azure Files. This basically means that you will have to work with PowerShell or the REST API directly. Another option is to use AzCopy for moving data between Azure File shares and your environment, using the file endpoint URL of your storage account (https://&lt;accountname&gt;.file.core.windows.net).
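As a rough sketch (the exact syntax varies between AzCopy releases, so check the help of your installed version), uploading a local folder recursively to a share could look like this, with placeholder account, share and key values:

```shell
AzCopy C:\data https://mystorageaccount.file.core.windows.net/myshare/ /DestKey:<storage-account-key> /S
```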

In terms of sizing and scalability, the preview of Azure Files provides the following characteristics:

  • max. file size: 1 TB
  • max. share size: 5 TB
  • max. share I/O: 1,000 IOPS (8 KB), 60 MB/s data throughput