Azure Table Storage, what a pain in the ass.

Lately I’ve been playing with Microsoft’s Azure service.  Tonight in particular I was attempting to use the Table Storage service.  Table Storage is a simple REST-based object persistence system.  Microsoft have wrapped this in the ADO.NET Data Services API, so it looks fairly full featured.  However, it is not.  At almost every turn I have ended up bashing my head against a Table Storage limitation, and debugging these problems has been a bit of a nightmare.

The things I have learned are as follows.

Development Table Storage is Arse

The local Development Table Storage service (built on top of SQL Server Express) has limitations and idiosyncrasies that the full cloud-hosted service does not, as outlined by Microsoft.

In particular, the fact that “in development storage, querying on a property that does not exist in the table returns an error” caused me a bit of a problem.  When my table was empty I could not execute simple queries such as:

var q = from v in context.CreateQuery<Vehicle>(VehicleTableName) where v.Id == id select v;

Doing so with an empty table would generate cryptic and unhelpful exceptions with messages along the lines of “one of the request inputs is not valid”.  With some furious googling I discovered that, with the development table storage service, one has to incant the following at service start-up to ensure that table storage knows about the structure of your objects.

// Touch the table once so development storage sees the entity type.
var query = from x in context.CreateQuery<Vehicle>(VehicleTableName) select x;
var l = query.ToList();

// Insert and immediately delete a throwaway entity so the table’s schema is registered.
var v = new Vehicle();
context.AddVehicle(v);
context.SaveChanges();
context.DeleteObject(v);
context.SaveChanges();

This insert/delete voodoo ensures that the SchemaXml column of the TableContainer table for my “Vehicles” table is populated with the appropriate XML definition of my Vehicle class.  You have to do this for each of your tables/classes every time you start up your service, which is idiotic to say the least.
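Rather than repeat that dance by hand for every table, it can be wrapped up in a small helper that runs once per table at service start-up.  The sketch below is just the approach described above made reusable; the DevStorageBootstrap and EnsureTableSchema names are mine, and it assumes new T() produces an entity with a usable PartitionKey and RowKey.

using System.Linq;
using System.Data.Services.Client;

// Development-storage-only workaround: automate the query/insert/delete voodoo
// so the SchemaXml for each table gets populated at start-up.
public static class DevStorageBootstrap
{
    public static void EnsureTableSchema<T>(DataServiceContext context, string tableName)
        where T : class, new()
    {
        // Touch the (possibly empty) table once...
        var existing = context.CreateQuery<T>(tableName).ToList();

        // ...then insert and immediately delete a throwaway entity so that
        // development storage records the schema of T for this table.
        var dummy = new T();
        context.AddObject(tableName, dummy);
        context.SaveChanges();
        context.DeleteObject(dummy);
        context.SaveChanges();
    }
}

// At service start-up, once per table/class:
// DevStorageBootstrap.EnsureTableSchema<Vehicle>(context, VehicleTableName);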

You Can’t Store Classes with Decimal Members.

It took me a while, after many more “one of the request inputs is not valid” style exceptions, to figure out that my Vehicle class was being rejected because it had a property, Price, of type Decimal.  That type is simply not supported by Table Storage, and I don’t think this limitation is documented anywhere.
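The property types that do round-trip are string, byte[], bool, DateTime, double, Guid, Int32 and Int64, so a price has to be mapped onto one of those.  Below is a sketch of the kind of workaround I mean; the PriceInCents name and the cut-down Vehicle shape are only illustrative.

using System;

// Illustrative only: store money as whole cents in an Int64 (supported)
// instead of a Decimal property (rejected by Table Storage).
public class Vehicle
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }

    public string Id { get; set; }
    public DateTime ExpiryDate { get; set; }

    // Persisted as a supported type; convert at the edges of the application.
    public long PriceInCents { get; set; }
}

// Storing:  vehicle.PriceInCents = (long)(price * 100m);
// Reading:  decimal price = vehicle.PriceInCents / 100m;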

DateTimes Must Be UTC.

After yet more “one of the request inputs is not valid” exceptions, I guessed why the following was failing.

var q = context.CreateQuery<Vehicle>(VehicleTableName)
.Where(v => v.PartitionKey == Vehicle.Partition && v.ExpiryDate >= DateTime.Now.Date);

I needed to add the magic UTC characters so that it read as follows.

var q = context.CreateQuery<Vehicle>(VehicleTableName)
.Where(v => v.PartitionKey == Vehicle.Partition && v.ExpiryDate >= DateTime.UtcNow.Date);
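The same discipline presumably applies on the way in: converting any DateTime to UTC before the entity is saved keeps comparisons against DateTime.UtcNow honest.  A minimal sketch, where localExpiry is a hypothetical local-time value from user input:

// Normalise to UTC before the entity ever reaches Table Storage.
var localExpiry = new DateTime(2009, 6, 30, 17, 0, 0, DateTimeKind.Local);

var vehicle = new Vehicle
{
    // ToUniversalTime actually shifts the value from local time to UTC
    // (unlike DateTime.SpecifyKind, which only relabels it).
    ExpiryDate = localExpiry.ToUniversalTime()
};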

So my journey so far into Azure’s data storage APIs has been somewhat less enjoyable than I had hoped.  I just hope my luck improves as I delve deeper into its mysteries.

4 thoughts on “Azure Table Storage, what a pain in the ass.”

  1. Oliver,

    With relational development storage mimicking Azure storage, there are definitely differences, but it seems like all of your travails were documented per the link you referenced above and the one here –
    http://msdn.microsoft.com/en-us/library/dd179338.aspx (in which Decimal is notably absent and the UTC aspects of dates are covered). I know that’s not all that helpful now, and I have taken the point that we need to emphasize the storage differences more clearly to lessen the aggravation of learning this new platform.

    If you aren’t already, I might recommend using Fiddler as well to help better diagnose the rather cryptic errors you see in the .NET debugger. The returned HTTP stream will often have additional information in it that will prove more helpful, and Fiddler is a free tool for inspecting the HTTP traffic.

    Lastly, feel free to contact me directly if you run into some problems. I’m a Developer Evangelist with Microsoft in the Northeast US and have spent a bit of time working with Azure and presenting on it to developers across the region, so consider it all part of a day’s work :)

    Good luck going forward!

    • Yeah, I subsequently found additional docs (white papers, MSDN articles and such) that noted the reasons behind some of the troubles I was having.

      While I’m sure if I had done all the reading up front I would have had an easier time of it, that is not my usual MO when I’m learning something new. I actually prefer to just dive straight in and give it a crack and solve the problems as they appear. I find I ultimately get a better understanding that way.

      Where this technique was failing me, however, was in the error messages reported by the Azure table development storage service. There just wasn’t any detail about what I was doing wrong, so it was harder to find solutions with Google or by looking in the docs.

      I’ll definitely give that tool you mentioned a go in future and perhaps that will improve my ability to diagnose things. I’m also now going in and reading some more of the background documentation to fill in the gaps of my knowledge.
