I’ve been taking various grumpy tentative steps towards using Azure, but my experience has been made a delight by a new free tool from Cerebrata, now part of Redgate. I’d better explain.
Because it is now possible to run SQL Server in a VM in Azure, Microsoft added the means to do backups into an Azure Storage account to SQL Server 2012 SP1 CU2. In order to try this out, I decided that I needed an Azure Storage account, and a means to access, check and maintain my backups. It was a harder process than I'd hoped, but fortunately, at the point of frustration, my pains eased considerably when I laid hands on a copy of a new (and free) Azure Explorer tool from Cerebrata.
Armed with decent bandwidth, Azure Storage seemed a good general option for offsite backup storage and a convenient way to store files, scripts and data if you are moving about a lot. You can store any data in Azure Storage and access it via the ordinary HTTP or HTTPS protocols. This data can be either private or public, but you can't choose to make individual files public or private: only Azure containers, and therefore all their contents.
First things first, I needed to set up a Windows Azure Storage Account.
Setting up the Windows Azure Storage Account
The storage account provides the namespace for accessing data. An account can store up to 100TB, and can contain an unlimited number of containers. Each container can store an unlimited number of blobs. Each block blob can store up to 200GB of data. Page blobs can be up to 1TB in size, and are better for frequently updated data.
Once you've signed up for a Windows Azure account, which is a slightly awkward process, you need to find your way to the Windows Azure Management Portal. Now, whatever you do, if you have an MSDN trial account, don't do what I did and create a SQL Azure Database with the default settings. It means that when your subscription reaches its spending limit, you are locked out for a month. Yes, this system isn't exactly friendly.
Firstly, log into the Portal. Then, at the bottom of the navigation pane, click NEW. Then click on DATA SERVICES | STORAGE | QUICK CREATE. You’ll see something like this:
In the URL field, you'll need to type a subdomain name to use in the URL for the storage account. The entry can contain from 3 to 24 lowercase letters and numbers. Lowercase, remember: this isn't Windows but Unix. Within the URL, the subdomain name becomes the host name used to address the Blob, Queue, or Table resources for the subscription. It is essentially the name of the account.
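If you want to check a candidate name before the portal rejects it, the rule is simple enough to test with a regular expression. A quick sketch in Python (the account names here are made up):

```python
import re

# A storage account name must be 3-24 characters, lowercase letters and
# digits only, because it becomes the host name in URLs such as
# https://<name>.blob.core.windows.net/
ACCOUNT_NAME_RE = re.compile(r"^[a-z0-9]{3,24}$")

def is_valid_account_name(name):
    """Return True if 'name' is a legal Windows Azure storage account name."""
    return bool(ACCOUNT_NAME_RE.match(name))

print(is_valid_account_name("mybackups01"))  # True
print(is_valid_account_name("MyBackups"))    # False: uppercase not allowed
print(is_valid_account_name("db"))           # False: too short
```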
You'll need to choose a Region/Affinity Group in which to locate the storage. Click on the geo-replication tick-box if you'd like to spread your bet by replicating the data across regions. All you need now is to get the access keys, which Windows Azure generates and uses for authentication when accessing the storage account. Through some twisted logic, to get these, you have to click on a letter 'i' next to the delete button. Fortunately, the designers put a 'Manage Access Keys' label next to the 'i'. (Odd choice of name: "What shall we call these keys? Let's name them after one of our most famous products!")
Managing Data and Backups in Azure Data Storage
Having completed this stage, I got busy downloading the tools I needed to manage my data and backups, copy files locally, perform restores and so on. This entailed forty minutes of downloading and installing an SDK, a PowerShell pack and a VS2012 add-in pack, followed by a reboot. About halfway through, I had my first twinge of doubt. Why on earth would I want to do all this? I'm not thinking of building an application; I'm a user. This is primitive stuff.
My frustration grew as I searched in vain for an easy way to manage my files. All I want is a way of treating an Azure Storage account as if it were a file system, with Azure containers as the root directories. I know it isn't really like that, but let's pretend it is a remote file system, since all I want to do is to securely store or retrieve files and maintain a hierarchical directory structure in which to store them. I just want to drag and drop, or copy 'n' paste, files between my Azure Storage file system and my local machine, or from one Azure 'directory' to another. Sure, there are plenty of other things I'd like to do with them, and it is frustrating to have to spend time doing things in Azure that can be done simply with a mouse-click in a Windows domain.
I hadn’t a clue what I wanted specifically from an Azure tool: something maybe like Skydrive or Dropbox, but for real Azure Storage. I just knew that I didn’t want to engage in a huge cultural shift just to manage my Azure storage. I popped over to see Cerebrata to ask them what they recommended. They listened patiently as I poured out my frustration, but then looked smug and gave me a copy of Azure Explorer (Yup, gave me).
Within Azure Explorer, you can create a new Azure Storage Account, equivalent to a drive, simply by giving the name of your account and the key. You can test it before you commit. That’s it. The only hassle is on Microsoft’s side of the fence setting up the account.
Azure Explorer looks just like Windows Explorer, and you can work the two applications together: you can drag and drop files between them, or copy 'n' paste. You can load or run files directly from Azure. Within a couple of minutes, I was using it like an old friend. It transformed my experience of working with Azure. In its design, it is so close to Windows Explorer that there is no point in giving you a screen dump. It even allows you to access your local files in much the same way as you're used to. It doesn't have all the features of Windows Explorer, and I wouldn't want them; it provides the essentials without the frilly stuff. Thankfully, it uses the list view, not the 'home' thumbnail images. It works fine, and you soon forget that you're getting data from Azure storage.
I gather from what they told me that Azure Explorer started out life as a test-bed to implement ‘explorer’-like features into Cerebrata Azure Studio, but it took on a life of its own. Sure, all the technology behind it is going into the other Cerebrata tools, but after a lot of heart-searching, they decided to give it away on their website free as well. I like ‘free’, especially a proper professionally written tool that is full of goodness. A lot of work has gone into this tool. A great deal of the smoothness of the tool has come from the fact that they clearly listened to the suggestions of their many beta-testers.
To access a file directly via scripting, you right-click on it and click 'copy path' to get hold of the URL. I can use PowerShell to do…
If you make the container ‘public’, this is available to anyone.
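Since a blob in a public container is just an ordinary HTTP resource, any scripting language can build and fetch that URL. Here's a sketch in Python; the account, container and file names are invented for illustration:

```python
from urllib.parse import quote

def blob_url(account, container, blob_name):
    """Build the HTTP URL for a blob, of the form 'copy path' gives you."""
    # Blob names may contain '/' path separators, which must survive quoting.
    return "https://%s.blob.core.windows.net/%s/%s" % (
        account, container, quote(blob_name, safe="/"))

# Hypothetical account/container/file names, for illustration only.
url = blob_url("mybackups01", "backups", "sql/AdventureWorks.bak")
print(url)
# To actually fetch a blob from a *public* container:
#   import urllib.request
#   urllib.request.urlretrieve(url, "AdventureWorks.bak")
```

A private container would instead need each request signed with one of the account's access keys, which is exactly the plumbing a tool like Azure Explorer hides from you.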
Obviously, Azure is very different from real file storage, but Azure Explorer keeps these differences in the background. In Azure storage, files equate to block blobs, but you can create page blobs too in Azure Explorer, by clicking on the drop-down under 'Upload' in the ribbon bar and selecting the menu item for uploading page blobs.
One other difference that can cause slight puzzlement is that the paths in Azure blob storage are really just part of the name you give the blob: containers don't contain directories. Azure Explorer masks this cleverly from the user whilst conforming to the way in which Azure arranges things. Within Azure Explorer, the container is the root directory, containing a series of files (blobs). When using Azure storage for backups, you can ensure that the backups go into the required directory just by specifying the path, just as you always have, except that you don't have to check that all the intervening directories in the path already exist. It doesn't matter if another application uploads the backup, because the directories will appear in Azure Explorer anyway. The only time you might come unstuck is when you create directories with nothing in them: they'd vanish if you exited Azure Explorer before creating files in them.
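The flat namespace is easy to demonstrate. Here's a sketch in Python, with invented blob names, that recovers the virtual 'directories' a tool like Azure Explorer would display; note that an empty directory has no blob to carry its name, which is why it vanishes:

```python
# A container holds a flat list of blobs; the '/' characters in blob names
# merely suggest directories. Hypothetical blob names, for illustration:
blobs = [
    "full/Monday/AdventureWorks.bak",
    "full/Tuesday/AdventureWorks.bak",
    "log/Monday/AdventureWorks.trn",
]

def virtual_dirs(blob_names):
    """Recover the 'directory' tree that an Explorer-style tool displays."""
    dirs = set()
    for name in blob_names:
        parts = name.split("/")[:-1]       # drop the file name itself
        for i in range(1, len(parts) + 1):
            dirs.add("/".join(parts[:i]))  # every prefix is a virtual dir
    return sorted(dirs)

print(virtual_dirs(blobs))
# ['full', 'full/Monday', 'full/Tuesday', 'log', 'log/Monday']
```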
Here’s how to get the free copy of Azure Explorer.