
Tony Davis is an Editor with Red Gate Software, based in Cambridge (UK), specializing in databases, and especially SQL Server. He edits articles and writes editorials for both the Simple-talk.com and SQLServerCentral.com websites and newsletters, with a combined audience of over 1.5 million subscribers. You can sample his short-form writing at either his Simple-Talk.com blog or his SQLServerCentral.com author page. As the editor behind most of the SQL Server books published by Red Gate, he spends much of his time helping others express what they know about SQL Server. He is also the lead author of the book, SQL Server Transaction Log Management. In his spare time, he enjoys running, football, contemporary fiction and real ale.

Head in the Clouds

Published 7 January 2011 10:16 am

We’re just past the second anniversary of the launch of Windows Azure. A couple of years’ experience with Azure in the industry has provided some obvious success stories, but has deflated some of the initial marketing hyperbole.

As a general principle, Azure seems to work well in providing a Service-Oriented Architecture for enterprises whose services suffer wide fluctuations in demand. Instead of being obliged to provide hardware sufficient for the occasional peaks in demand, one can hire capacity only when it is needed, so the cost of hosting an application is no longer a capital cost. It enables companies to avoid scaling out hardware for peak periods only to see it sit underused for the rest of the time. A customer-facing application such as a concert ticketing system, which sees high demand in short, predictable bursts of activity, is a great example of an application that would work well in Azure.

However, moving existing applications to Azure isn’t something to be done on impulse. Unless your application is .NET-based, and consists of ‘stateless’ components that communicate via queues, you are probably in for a lot of redevelopment work. It makes most sense for IT departments who are already deep in this .NET mindset, and who also want ‘grown-up’ methods of staging, testing, and deployment. Azure fits well with this culture and offers, as a bonus, good Visual Studio integration.
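
To make the ‘stateless components communicating via queues’ shape concrete, here is a minimal sketch of the kind of worker that tends to port to Azure with the least friction. It is illustrative only: OrderMessage, IOrderQueue and TicketOrderWorker are hypothetical names invented for this example, standing in for whichever queue client you actually use (Azure Queue storage in the cloud, MSMQ or a service bus on-premises); this is not Azure SDK code.

    using System;
    using System.Threading;

    public class OrderMessage
    {
        public string Body { get; set; }
    }

    public interface IOrderQueue
    {
        OrderMessage Dequeue();            // returns null when the queue is empty
        void Delete(OrderMessage message); // removes a message once it is processed
    }

    public class TicketOrderWorker
    {
        private readonly IOrderQueue _queue;

        public TicketOrderWorker(IOrderQueue queue)
        {
            _queue = queue;
        }

        // No state is held between iterations, so any number of worker instances
        // can run side by side, and instances can be added or removed as demand
        // rises and falls; every piece of context travels inside the message.
        public void Run(CancellationToken cancel)
        {
            while (!cancel.IsCancellationRequested)
            {
                OrderMessage message = _queue.Dequeue();
                if (message == null)
                {
                    Thread.Sleep(TimeSpan.FromSeconds(1)); // back off while idle
                    continue;
                }

                ProcessOrder(message.Body);
                _queue.Delete(message); // delete only after successful processing
            }
        }

        private void ProcessOrder(string orderPayload)
        {
            // Validate and persist the order; deliberately left as a stub here.
            Console.WriteLine("Processed order: " + orderPayload);
        }
    }

An application already built from components of this shape needs comparatively little rework; one that keeps session state in memory or talks directly to local resources is where the redevelopment effort mounts up.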

The most commonly stated barrier to porting these applications to Azure is the difficulty of reconciling use of the cloud with legislation on data privacy and security. Putting databases in the cloud is a sticky issue for many organisations, and impossible for some, because of compliance and security requirements, the need for direct control over data, and so on. In response to feedback from the early adopters of Azure, Microsoft has broadened the architectural choices to cater for a wide range of requirements. Windows Azure now offers several storage options: SQL Azure Database (SAD); Azure storage, the unstructured ‘BLOB and Entity-Attribute-Value’ NoSQL alternative (which equates more closely to folders and files than to a database); and services such as OData. Developers programming for Windows Azure can simply choose the option most appropriate to their needs. Secondly, and crucially, the Windows Azure architecture gives you the freedom to build hybrid applications, in which only those parts that need cloud-based hosting are deployed to Azure, while those parts that must unavoidably be hosted in a corporate datacenter can stay there.

By using a hybrid architecture, it will seldom, if ever, be necessary to move an entire application to the cloud, along with its personal and financial data. For example, we could port to Azure only those parts of our ticketing application that capture and process ticket orders. Once an order is captured, the financial side can be processed in our own data center.
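
As a sketch of that split, the cloud-hosted piece below captures the order and hands over only an order reference, while card details and settlement never leave the corporate data center. Every type here (ISharedQueue, IPaymentVault, IPaymentGateway and so on) is a hypothetical name invented for illustration, not an Azure or .NET API.

    using System;

    // The contract shared by both halves; in practice this might sit over an
    // Azure queue, a service-bus relay, or plain HTTPS. The point is only that
    // an order reference, not financial data, crosses the boundary.
    public interface ISharedQueue<T>
    {
        void Enqueue(T item);
        bool TryDequeue(out T item);
    }

    public interface IPaymentVault           // lives on-premises only
    {
        string LookUpCardTokenFor(Guid orderId);
    }

    public interface IPaymentGateway         // lives on-premises only
    {
        void Charge(string cardToken, int seats);
    }

    public class CapturedOrder
    {
        public Guid OrderId { get; set; }
        public string EventCode { get; set; }
        public int Seats { get; set; }
    }

    // Hosted in Azure: public-facing, scales with demand, holds no financial data.
    public class CloudOrderCapture
    {
        private readonly ISharedQueue<CapturedOrder> _handoff;

        public CloudOrderCapture(ISharedQueue<CapturedOrder> handoff)
        {
            _handoff = handoff;
        }

        public Guid Capture(string eventCode, int seats)
        {
            var order = new CapturedOrder
            {
                OrderId = Guid.NewGuid(),
                EventCode = eventCode,
                Seats = seats
            };
            _handoff.Enqueue(order); // only the order reference leaves the cloud tier
            return order.OrderId;
        }
    }

    // Hosted in the corporate data center: settles payment where the data lives.
    public class OnPremisesSettlement
    {
        private readonly ISharedQueue<CapturedOrder> _handoff;
        private readonly IPaymentVault _vault;
        private readonly IPaymentGateway _gateway;

        public OnPremisesSettlement(ISharedQueue<CapturedOrder> handoff,
                                    IPaymentVault vault,
                                    IPaymentGateway gateway)
        {
            _handoff = handoff;
            _vault = vault;
            _gateway = gateway;
        }

        public void SettleNext()
        {
            CapturedOrder order;
            if (_handoff.TryDequeue(out order))
            {
                string cardToken = _vault.LookUpCardTokenFor(order.OrderId);
                _gateway.Charge(cardToken, order.Seats);
            }
        }
    }

The design point is simply that the queue carries a reference to the order and never the card data, so the regulated data stays where the regulators expect to find it.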

In short, Windows Azure seems to be a very effective way of providing services that are subject to wide but predictable fluctuations in demand. Have you come to the same conclusions, or do you think I’ve got it wrong? If you’ve had experience with Azure, would you recommend it? It would be great to hear from you.

Cheers,

Tony.

7 Responses to “Head in the Clouds”

  1. gandip says:

    I don’t expect much more from it. I am happy with my on-premises hosting. Rewriting code for Azure is hectic.

  2. Keith Rowley says:

    The problem with Azure is that it is one more different technology that we database professionals need to learn. It would be much more useful if it provided the option to just put a SQL Server database in the cloud, with some kind of secure connection back to our application, which might or might not also be hosted in the cloud. Then we could use our existing code and knowledge to write apps for both cloud and non-cloud environments, and transition between them as needed. Right now, from what I have read, I would essentially have to tailor-make two versions of at least the DAL in order to have cloud and non-cloud versions of my applications. TOO MUCH WORK! Make it simple for us to use a cloud-based version of SQL Server and we will love you for it.

  3. anantham says:

    We are an ISV (independent software vendor).
    Moving to the cloud does not give us or our clients any significant cost advantage (as of now). In some cases, shared server hosting or VPS hosting works out cheaper.
    But it was beneficial for one of our clients, whose product required a lot of processing power on only three days of the month (the 14th, 15th and 16th). So we ported (read: re-wrote) the number-crunching DLL alone to Azure. The application itself is still hosted on-premises, but the heavy lifting is left to the cloud. The customer pays for a higher-powered Azure instance on those three days alone; on the other days it runs on lower-powered instances, thereby saving him money.

  4. IowaWebDave says:

    I totally agree with Phil Factor in the article you referenced – I just don’t see how “the cloud” is a viable option for customer private data when WE are responsible for it as a financial services provider. There is so much compliance and government regulation that we simply have to keep that data within our control, and I don’t think that’s feasible – at least not yet – when the data is “out there”.

  5. BuggyFunBunny says:

    My principal concern with the cloud (Azure or otherwise) is the tendency to build it from the Cheapest Parts Available, which either makes it impossible to specify what the machine you’re “renting” actually is (and which of its resources are the “flexible” ones, by how much, and all that), or means the “rent” will end up being your cost plus the Cloud Vendor’s overhead and profit.

    Cloud (and the IBM Service Bureau, which started this meme 45 years ago) works only if the clients’ demands for resources are inversely correlated and matched on resource type. For that to happen, the scale has to be kinda big. It’s easy for a given client (one which fits the global demand curve nicely) to get lost. A client that’s big enough to get preferential treatment 1) could get the economies of scale on its own and 2) will skew the service against the small clients. Happens all the time: the squeaky wheel gets all the grease.

  6. DBA Dave says:

    SQL Azure, and perhaps other similar cloud-based data services, is not ready for private personal data, for all the reasons mentioned. Yet…

    I am just about to jump into the cloud via a POC using SQL Azure. The company for which I work is on an expansion program, merging with other large firms in different markets. Part of this naturally involves data sharing. Until now it has been done with a custom-rolled transaction log shipping solution. From the start this has meant the exchange of large data volumes, say 100GB each way per week, when less than 2% of that is actually used. Such are the pressures of getting things done, and getting them done now! Needless to say, this is not an ideal solution.

    The POC will be used to assess the feasibility of using the cloud; it may or may not end up being SQL Azure. The data is publicly available data about companies that are clients in each market. Market A needs to be aware that Market B has Client C on its books. This type of simple data sharing looks to be an ideal use of cloud data services. Each market will supply data to the cloud and have a local full copy sourced from the cloud.

    If it proves to be efficient (and, let’s face it, it won’t be hard to be better than the current solution), several problems will be solved. Data volume will be down, it will easily cope with new markets, it will be more robust, and the problems associated with time zones versus backup windows will be totally removed.

  7. timothyawiseman@gmail.com says:

    Probably the biggest issue that many companies have with moving to any cloud-based solution is the loss of control. The regulatory issues mentioned in this post are one specific aspect of it, as regulations require you to control certain data in certain ways. Yet, even beyond that, there is the knowledge that you are largely at the mercy of the provider. If they go out of business, you may have to start over on code developed for that provider; if they are infiltrated by someone with ill intent, it is hard (though possible in at least certain cases, with strong end-to-end encryption) to protect yourself from that. In fact, if you have a strong lock-in to a particular vendor, they even have great ability to raise your rates, with little recourse unless you protected yourself with well-written long-term contracts.

    Some of these factors can be mitigated through proper planning, management, and technology, as well as careful choice of the vendor that will be used. But even a vendor as technologically savvy and stable as Microsoft can have problems, as was shown with the temporarily lost data for the Sidekick phones in 2009.

    No matter how well you attempt to protect yourself, using the cloud entails losing some control over your IT operations. This may be a very good trade-off in some situations, but it is one that can rightfully make management uncomfortable, especially for something that is mission-critical.
