web 2.0

Dates and Time Zones in Javascript, C#, and SQL Server

An issue that always seems to plague developers who write software used in multiple geographically dispersed locations is that of dates and time zones. If a user in the Eastern time zone updates a record that contains a date and time field (say, 1/15/2014 4:35 PM), and then a user in the Pacific time zone views that record, what date and time are they supposed to see? The time as it was entered in Eastern time, or their local time instead (1/15/2014 1:35 PM)?

Generally speaking (there are, of course, exceptions) date and time information should be stored in a database as a UTC (Coordinated Universal Time) time, and automatically converted and displayed in the user's current time zone in the client application. But that is often not as easy as it seems to implement. Below, I'll outline a process that has worked well for me in several apps.

Step 1 - Getting data from the client to the server

In the past, web apps collected data in an HTML <form> element that was submitted to the server, causing a full page refresh. Today, I prefer to use single-page application frameworks where almost all client-server communication is done using the JavaScript XMLHttpRequest (XHR) object. For me, jQuery nicely abstracts away the details of XHR with its $.ajax() API method. But how do we wrap up user-entered date-time data in an XHR request so that time zone data is preserved and can be interpreted by our server?

The JavaScript Date object does have time zone support, but it's not immediately obvious how to make that work properly with the server. When sending data in an XHR request, it's usually JSON-formatted, meaning just about everything is a string - there are no data types. JavaScript Date objects have a handy toString() method that outputs a date like this:

Sun Feb 23 2014 15:33:09 GMT-0400 (Atlantic Standard Time)

This would seem to contain all the information we require, but alas, the default ASP.NET WebAPI configuration cannot translate this to a C# DateTime instance. But there is a solution - the ISO-8601 date format, which looks like this:

2014-02-23T15:33:09-04:00

In fact, most modern browsers have the ability to output this format using the toISOString() method, but there are a couple of drawbacks. First, toISOString() is not supported by older browsers, and second, this built-in method discards the original time zone offset, outputting the value converted to UTC (such as 2014-02-23T19:33:09Z). Sometimes it is desirable to preserve the originating time zone data.
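
To see that second drawback concretely, parse a date carrying a -04:00 offset and call toISOString() on it:

```javascript
// The original instant carries a -04:00 offset when it is parsed...
var entered = new Date("2014-02-23T15:33:09-04:00");

// ...but toISOString() always renders it as UTC; the offset is gone
// (note the built-in method also appends milliseconds).
var roundTripped = entered.toISOString(); // "2014-02-23T19:33:09.000Z"
```

The instant in time is preserved, but you can no longer tell which time zone the user entered it in.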

Thus, I prefer to use the excellent moment.js library for formatting date and time values. It is incredibly flexible and powerful, and you'll find yourself using it everywhere in your JavaScript code when dealing with dates. The default format() method of moment.js outputs dates in ISO-8601 format with the local UTC offset preserved.
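
To illustrate what that default format looks like, here is a plain-JavaScript sketch of the same idea - an ISO-8601 string built from local time components with the UTC offset appended (moment.js does all of this, and much more, for you; the function name here is my own):

```javascript
// Sketch of what moment's default format() emits: an ISO-8601 string with
// the local UTC offset preserved rather than normalized to "Z".
function toLocalISOString(date) {
  var pad = function (n) { return (n < 10 ? "0" : "") + n; };
  var offsetMinutes = -date.getTimezoneOffset(); // minutes east of UTC
  var sign = offsetMinutes >= 0 ? "+" : "-";
  var abs = Math.abs(offsetMinutes);
  return date.getFullYear() +
    "-" + pad(date.getMonth() + 1) +
    "-" + pad(date.getDate()) +
    "T" + pad(date.getHours()) +
    ":" + pad(date.getMinutes()) +
    ":" + pad(date.getSeconds()) +
    sign + pad(Math.floor(abs / 60)) + ":" + pad(abs % 60);
}
```

Calling toLocalISOString(new Date()) in the Atlantic time zone yields something like 2014-02-23T15:33:09-04:00.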

Step 2 - Processing on the Server

Actually, with WebAPI and its default use of Newtonsoft's Json.NET library, there is nothing else you have to do on the server. Using the default model binder, any date that is submitted in ISO-8601 format will be automatically converted to local server time and, most importantly, the Kind property of the DateTime instance will be set properly to DateTimeKind.Local.

Step 3 - Saving in SQL Server

Now here's one step I see a lot of people miss - saving the datetime properly in SQL Server. We want the date to be stored as a UTC value so that it can be converted to any time zone for display purposes. To do this, you must convert the C# DateTime instance to UTC when adding it to the parameters of your stored procedure, like so:

parameters.Add("myDateField", myDateTime.ToUniversalTime());

Step 4 - Retrieving from SQL Server

Another step often missed by developers is the requirement to properly stamp the Kind property of the C# DateTime instance when retrieving it from the database. You must mark the value as UTC like this:

myDateField = DateTime.SpecifyKind((DateTime)dbReader["myDateField"], DateTimeKind.Utc);

Step 5 - Displaying on the Web Page

Actually, there is no step 5. The Json.NET serializer will format the date value in ISO-8601 format, which the client JavaScript code converts to local time when it parses the string into a Date object. If you output this value to the web page, it will be displayed in the client's local time.
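
Concretely, when the serialized UTC value reaches the browser, constructing a Date from it gives you an object whose display methods render in the user's zone:

```javascript
// A UTC value in the ISO-8601 form the server sends down.
var serialized = "2014-02-23T19:33:09Z";

// Parsing captures the absolute instant; display methods such as
// toString() and getHours() then render it in the browser's local zone.
var local = new Date(serialized);
```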

By following this process you can ensure that all date and time data in your database is consistently represented as UTC, and can be reliably converted to the time zone being used by the client application. One gotcha to be careful of, however, is when filtering date values. Just like storing a date value in the database, the client specified date filter must be converted to UTC time before the comparison against SQL data is performed.
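
To sketch that gotcha (the variable names here are illustrative, not from any particular app): if the user filters on 1/15/2014, the local midnight boundary must become a UTC instant before it is compared against the stored values:

```javascript
// Local midnight on the user's chosen filter date.
var filterStart = new Date(2014, 0, 15, 0, 0, 0);

// Convert to a UTC ISO string before sending it to the server; the SQL
// comparison then happens in UTC, matching how the dates were stored.
var utcFilter = filterStart.toISOString();
```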

Happy coding!


Fields Not Updating When Using Salesforce API with .NET / C#

I just spent the last few hours tearing my hair out trying to determine why a Contact record in Salesforce wasn't being updated from their API. I had successfully created the record using Visual Studio to generate a series of proxy classes from Salesforce's WSDL file, but subsequent updates didn't appear to be working at all even though the API reported a successful update transaction.

It turns out that all the fields I was trying to update were non-string fields. A few of them were Boolean fields (true/false), some were number fields, and others were date fields. Apparently, when using .NET, you have to explicitly tell Salesforce that you've changed the value of any non-string field by using the associated "Specified" field, like so:

myContactRecord.CustomDateField__c = DateTime.Now;
myContactRecord.CustomDateField__cSpecified = true;

A thousand thank-yous to the fine fellow (RobD) who answered this question on StackOverflow.


Intel NUC HTPC, XBMC, and Harmony Ultimate Remote Control

I recently upgraded my home theatre PC (HTPC) to an Intel NUC with a built-in infrared receiver. My previous HTPC had an outboard IR receiver that I bought as part of a Windows Media Center remote control kit, and after a fashion it worked very well with XBMC. I say "after a fashion" because XBMC has matured over the years and its remote-control integration has evolved along with it. In the early years, I had even installed a program called EventGhost that would interpret IR signals and issue XBMC commands. When it came time to use the Intel NUC with a remote control, I was no longer sure how I had gotten it to work so well with the previous machine.

Thankfully, for the most part the Intel NUC's IR receiver works very well with my Logitech Harmony remote. However, one command no longer functioned - the context menu when I clicked the "Info" button. This had worked without issue on the previous HTPC, and I don't recall having performed any special configuration to enable it. To resolve the problem, I resorted to several different methods.

First, I had set up the HTPC remote control on my Harmony as a "Media Center Special Edition". The "Special Edition" version of the Harmony device profile enables some very important functions that the regular Media Center device profile does not. Chief among these are the ability to send Windows-N and Alt-N keystrokes, which in my case I used to send a Windows-4 keystroke to load XBMC (XBMC was the fourth icon in my Windows Task Bar) and an Alt-F4 keystroke to quit XBMC. This made it very easy to restart XBMC without requiring me to haul out my wireless keyboard. And these commands still worked with the Intel NUC, just the "Info" button no longer functioned.

I tried reverting to the general Media Center device profile, but this didn't work either. Back to the Special Edition profile, and I attempted to enhance XBMC's built-in remote control support with an enhanced keyboard map file that I found here. Still no luck. For whatever reason, I couldn't get the remote's "Info" IR signal to trigger any action. In the end, I remapped the Info button to issue an Alt-N keystroke instead, and I modified the keyboard map file to look for this keystroke and interpret it as an XBMC Context Menu command. This worked great, so I took the opportunity to map some more buttons, and create a few Harmony "soft" buttons that I didn't have with my previous HTPC build:

	  <one mod="alt">Info</one>
	  <two mod="alt">ShowSubtitles</two>
	  <three mod="alt">NextSubtitle</three>
	  <four mod="alt">OSD</four>
	  <five mod="alt">ContextMenu</five>
	  <six mod="alt">audionextlanguage</six>

It was a roundabout way to get things working (and likely not the most efficient), but the Harmony configuration is now very powerful. I expect the only reason I'll need to use the wireless keyboard anymore is to install Windows Updates once a month.


HTPC Build 2014 - Intel Haswell NUC

For several years back in the early 2000's, I had an original XBox that was modded so it could run XBMC (XBox Media Center). XBMC on the original XBox was awesome - it had a great user interface and would play every video file format known to man. But the original XBox wasn't high-def, so about 6 years ago I upgraded to a home theater PC built around an AMD Athlon 64 X2 5200 with a Radeon video card placed in an Antec NSK2480 HTPC case. Again, XBMC (on Windows) was my media player software and it has worked mostly great right up to now. I say "mostly" because XBMC has always been a little twitchy on this machine, requiring a restart every time the machine came out of sleep. But a small price to pay for the amazing flexibility it offers.

My old Antec NSK2480

Well, six years is a long time for a PC and the Antec case is a little too big for the new equipment stand I recently purchased, so I decided it was time for an upgrade. I set my sights on Intel's NUC (Next Unit of Computing) machines because they have Core i3, i5, and i7 processors and are amazingly diminutive. The most recent iteration of the NUC has a few killer features that make it absolutely ideal as an HTPC:

  • The integrated GPU on Haswell Core CPUs can now output reliably at 24 fps (ideal for film material)
  • Because the NUC uses laptop-grade parts, it is amazingly power-efficient
  • It has a built-in infrared receiver

In addition to its size, the above three features were key to my decision to go with the Intel NUC. I looked briefly at a competing NUC unit from Gigabyte which includes Wi-Fi and Bluetooth for the same price as the Intel (you have to buy your own PCIe Wi-Fi card for the Intel if you want it), but foregoes an integrated IR port. I didn't need Wi-Fi or Bluetooth, and an outboard IR receiver is much less tidy. I'm also an Intel Ethernet bigot and the Gigabyte offering uses a Broadcom part instead, so the Intel NUC won out on several counts.

I ordered the i3 version of the NUC from Newegg.ca, along with 4 GB of RAM and a 120 GB SSD. Note that the RAM is a 1.35 volt SO-DIMM and the SSD is an mSATA drive - be careful when ordering these parts as the NUC doesn't use standard desktop memory and you can't fit a regular 2.5 inch SSD in this case (though Intel now has a slightly larger NUC that will accommodate 2.5 inch drives). I usually order my parts from NCIX, but in a rare instance Newegg was actually cheaper this time around and had all the parts in stock. The NUC shipped from Canada, but the RAM and SSD shipped from the States. In a happy twist of fate, both shipments arrived on the same day (3 days after I ordered, and using ground shipping, too - kudos to Newegg on this one). I should also mention that I had ordered a mini-HDMI to HDMI cable from Amazon.ca a few weeks prior, since the Intel NUC doesn't have a standard HDMI port.

  • Intel NUC Kit D34010WYK - $315
  • Crucial M500 120GB SATA mSATA Internal Solid State Drive - $95
  • G.SKILL 4GB 204-Pin DDR3 SO-DIMM DDR3 1600 - $44

One thing I found amusing about Intel's packaging for the NUC was the Intel jingle that played when I opened the box, much like those greeting cards that play a tune when you unfold them. The NUC was also much smaller than what I expected - only a little over 4 inches square, not much larger than Apple's current Apple TV box. I was less impressed that I had to unscrew the four large Phillips foot-screws in order to install the RAM and SSD, but that was hardly a big deal. Once everything was installed and the NUC was reassembled, I was pleasantly surprised to see a nice GUI-based UEFI BIOS screen. It is very easy to use and I had the latest NUC BIOS upgraded in no time.

I used Microsoft's Windows USB Boot Tool to put Windows 8.1 on a bootable USB flash drive and had it installed on the SSD in less than 15 minutes. Another plus for Intel was the fact that you can download the entire set of Windows drivers in one ZIP file - a nice time-saver. I'll say this for Windows 8.1 on Intel's i3-4010 and the Crucial SSD - the machine boots wicked-fast in less than 15 seconds! This is a welcome change from my old HTPC which took well over a minute to boot.

I soon had XBMC installed and was ready for some testing. First, I wanted to check the power draw of this feisty little NUC. Man, this thing barely uses any juice. While playing a 1080P video file with DTS-HD Master Audio, it barely hits 15 watts. I measured idle power draw at around 9 watts, and it sips a measly 2 watts while in sleep mode. I was absolutely floored by these numbers. I also had some concerns that the integrated graphics in the Core i3 chip wouldn't be able to handle Blu-Ray quality 1080P video and high-def audio (I seriously considered spending an extra $100 for the i5 version of the NUC), but the machine barely breaks a sweat. Well under 20% CPU time while playing such a clip - amazing.

The last thing to mention is the fan noise. This is the one quibble I have with the NUC as the fan is clearly audible even from several feet away. It isn't necessarily loud, but if you had to listen to it that close for an extended period of time I think it could get very annoying. Thankfully I sit about 12 feet away from it which makes it barely audible, and you certainly can't hear it when watching something. Even a whisper is enough to drown it out.

Not directly related to the NUC itself but still pertinent to the HTPC experience as a whole is how I control the unit from way across the living room. I have a Logitech K400 wireless keyboard with an integrated track pad that works smashingly. But I hate to haul it out just to select a movie to play so instead I've programmed my Logitech Harmony Ultimate remote to work with Windows 8 and XBMC. It works really well, especially after making a few tweaks to the remote configuration to add some important functions. I'll write about those in a separate post.

Overall, I am immensely pleased with the Intel Haswell NUC. It's blazingly fast, amazingly energy-efficient, performs magnificently, and is nice and tiny. In fact, this would make a great general-purpose PC for anyone who didn't need an optical drive or required any expansion room. And while Intel's integrated graphics have come a long way and are well-suited for video tasks like this, their performance in today's top-tier games is underwhelming so gamers should look elsewhere. For an HTPC it's a tad pricey, true, but its tiny footprint makes it incredibly flexible. It even comes with a VESA mounting plate to attach it to the back of a monitor for a truly out of sight configuration. All in all, this is a great little device and I expect it will serve my HTPC needs for years to come. Highly recommended.


Entertainment | Hardware | Tech | Toys

Core i7-4770K PC Build 2013

A few months ago I decided to retire my 4-year-old i7-860 PC and build a new machine based on Intel's Haswell platform. I had skipped Intel's previous Sandy- and Ivy-Bridge systems and was looking forward to modernizing my ageing PC, not only with a new, power-efficient Core i7 processor, but also with chipset improvements such as USB 3 and SATA III (and because of the motherboard I selected, Thunderbolt).

Here's a list of the components I ended up using in my new build:

  • Motherboard: Asus Z87 Deluxe/Dual
  • CPU: Intel i7-4770K
  • CPU Cooler: Be Quiet! Dark Rock Pro 2 Silentwings 
  • RAM: 16 GB Kingston HyperX 10th Anniversary Edition (2 x 8 GB, 1.5v, 1600 MHz, CL9)
  • Video Card: Sapphire Radeon HD7970 3GB OC with Boost
  • Boot Drive: Crucial M4 256GB (carried over from previous build)
  • Data Drive: Western Digital Black 3TB WD3003FZEX
  • Optical Drive: LG BH10LS30 Bluray Writer (carried over from previous build)
  • Chassis: Antec P280 (carried over from previous build)
  • PSU: Seasonic Platinum 860 (carried over from previous build)
  • OS: Windows 8.1 Pro

Like the i7-860 system before it, my new Z87/4770K system is rock-solid reliable. No weird BSODs or driver issues to deal with - everything just works. This is a relief, since 20 years of custom PC builds have taught me that you never know what to expect when assembling your own system. But with each iteration, PC components and drivers have thankfully gotten more trustworthy. And more efficient. The i7-860 system idled at about 140 watts while the new i7-4770K build runs 110 watts at idle.

I can't say, however, that this new i7-4770K system feels significantly faster than the i7-860 it replaced. Oh, the benchmarks might tell you it is, but I find it hard to notice mostly because, as I've stated in the past, I don't do a lot of PC gaming and this is used primarily for general purpose PC tasks like web browsing. However, being an enthusiast means I'm not content with merely good enough hardware and I'm compelled to build high-performance machines.

What is noticeable, however, is the system noise or rather the lack of it. This is hands-down the quietest machine I've ever built. While not completely silent, it is difficult enough to hear under my desk that I'm never sure if the system is on or not. This can be attributed to the following:

  • Antec's P280 is designed to be quiet,
  • The Dark Rock Pro 2 is an exceedingly efficient CPU cooler,
  • Asus's Fan Xpert 2 is excellent at keeping system fans at low RPM when excessive cooling is not required,
  • The Sapphire Radeon dual-fan cooling solution is remarkably quiet.

Fire up a modern first-person shooter game (like Tomb Raider), however, and you'll have no trouble hearing it anymore. Every fan in the system steps it up a few notches to provide adequate cooling to the CPU and GPU. I wouldn't exactly call it noisy at this point, but you can definitely hear it.

The last thing I wanted to make note of is the NFC reader that Asus bundles with the Z87 Deluxe/Dual board. In order to justify the higher cost of this particular motherboard, Asus had to add something. Frequently, these "value-add" items are gimmicky and while they may sound cool, they are of little real-world use. For me, however, the NFC reader is extremely useful. It provides two very welcome features: the ability to log into my computer without typing a password, and a two-port USB 3 hub. The USB hub is self-explanatory, but not many people are aware of NFC (near-field communication) as it is primarily a smart-phone technology that hasn't really caught hold yet. In this case, it allows me to place a small NFC tag (included with the Deluxe/Dual, or you can use your phone if it supports NFC) on the NFC reader which then auto-logs me into my machine. Anything that keeps my computer secure without requiring me to type passwords all the time gets high marks in my book.


Hardware | Tech

Single Page Application Frameworks

Several months ago I started work on a new project that required a web-based user-interface. I had been out of the HTML game for several years, spending time instead on Windows Forms and Flex-based applications. For this new web project, however, I wanted to retain some of the user-experience and development features that I came to enjoy working with traditional "thick-client" technologies.

My investigations led me to discover a technique now called the JavaScript "Single Page Application," or SPA for short. I am no stranger to some of the paradigms of a SPA, having spent the first ten years of my career developing web apps with rich user-interfaces that used AJAX-like techniques before AJAX was even a term. But SPAs have evolved way beyond simple AJAX calls and the now clunky-feeling ASP.NET "Update Panel." The architecture of a SPA more closely resembles that of full-blown client-server apps written in traditional technologies like Windows Forms, bringing with it a more developer-friendly environment and a better end-user experience. For that I think we can thank Google Chrome for escalating the browser wars, which has resulted in JavaScript performance roughly doubling every nine months over the past five years or so.

After much research (including much material from John Papa), I began to develop my own SPA framework to serve the needs of my project. Keep in mind that a SPA relies on a single HTML page as the application container and the browser makes no further full-page HTTP requests once that page is loaded. Instead, the SPA loads and renders content on-demand as it is required. To make this happen, the SPA has to perform much of the work that has traditionally been performed by the web server, such as view routing and data-binding. While SPAs as an all-inclusive framework are fairly new, many of the functions it must perform are available in the form of individual JavaScript libraries, some of which have been around for years.

And so, I set out to assemble my own framework using several of these individual libraries. In the meantime, months after I began my own efforts, an all-inclusive SPA platform called Durandal became available. Though I had my own framework mostly developed by this point, I wanted to see what Durandal offered and how it compared to the features I had put in my own framework. To my surprise, the two frameworks were amazingly similar. This is most likely because the problem domain was identical, and many of the libraries Durandal uses are the same ones I chose:

  • jQuery for DOM manipulation. This is a no-brainer for me, though there are reasons to choose other DOM frameworks.
  • SammyJS for in-page "hash"-style routing. I have also implemented routing using Path.js.
  • RequireJS for dynamic on-demand loading of resources, whether it's HTML, JavaScript, CSS, or localization resources.
  • Knockout for data-binding. This is turning out to be a pretty core piece of the developer experience since it abstracts away a lot of old-world direct DOM manipulation for form input values and label display.
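
To give a feel for what the routing piece does, here is a bare-bones sketch of hash-style routing (the real libraries' APIs differ; this only shows the core idea of mapping URL fragments to handler functions, and createRouter is a name of my own invention):

```javascript
// A toy hash router: register handlers for URL fragments, with an
// optional "#/404" handler as the fallback for unknown routes.
function createRouter() {
  var routes = {};
  return {
    add: function (hash, handler) { routes[hash] = handler; },
    dispatch: function (hash) {
      var handler = routes[hash] || routes["#/404"];
      return handler ? handler() : undefined;
    }
  };
}
```

In a browser you would call dispatch(location.hash) from a window.onhashchange listener.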

While all these libraries (and many more ancillary libraries I haven't mentioned) are important in assembling a SPA framework, they don't solve the entire problem. While a routing library helps with determining the view the user wants to see, it doesn't actually help with view management per se. So I developed a view framework where each view has an event lifecycle: initialize(), beforeShow(), show(), hide(), afterHide(). Not surprisingly, Durandal offers a similar view lifecycle. What appears to be missing from the Durandal framework (though, honestly, I haven't spent more than a few hours examining it) is an integrated user-authentication mechanism, as well as global error handling and the ubiquitous "404" handling for undefined view routes.
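
A minimal sketch of that lifecycle, using the event names above (the createView and activateView helpers are hypothetical, for illustration only):

```javascript
// A view exposes the lifecycle hooks the framework will call.
function createView(name) {
  return {
    name: name,
    initialize: function () { this.initialized = true; },
    beforeShow: function () {},
    show: function () { this.visible = true; },
    hide: function () { this.visible = false; },
    afterHide: function () {}
  };
}

// A tiny view manager drives the lifecycle when routing between views:
// hide the old view, lazily initialize the new one, then show it.
function activateView(current, next) {
  if (current) { current.hide(); current.afterHide(); }
  if (!next.initialized) { next.initialize(); }
  next.beforeShow();
  next.show();
  return next;
}
```

A router handler would typically call activateView with the view resolved from the URL hash.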

If I had to do it over again, I would definitely pick the Durandal framework, however. Their JavaScript-fu is at least an order of magnitude better than mine, and Durandal is already beginning to accumulate a large community of developers who are using it. This can only mean the product will continue to evolve and grow with a large user base reporting and fixing bugs, making it far more robust than any feeble framework I could manage to tack together. Still, it was gratifying to see that I, a self-proclaimed "hack," was able to assemble a SPA framework that performed many of the tasks Durandal does (though undoubtedly not nearly as well), and maybe even one or two that it doesn't. And it sure is a blast to be back in the web world again. :-)


Tech | Web Development

Windows Server 2012 Essentials - Essential Tips

I recently replaced my aging HP EX470 Windows Home Server (v1) with a custom-built machine running Windows Server 2012 Essentials (WSE). There are already several articles online that go into detail on the new server OS from Microsoft (like here), so I won't spend much time repeating it. However, what is interesting to note is that WSE is not really a replacement for Windows Home Server. It does many of the same things, but Microsoft targets WSE at small businesses, not home users. To that end, WSE sets up its own Active Directory domain and client computers are automatically added to that domain when they run through the connection routine.

As a home user, I don't really have any need for my machines to be connected to a domain. Not only do I not have a need, I'd actually prefer that they're not on a domain. WSE has no option to support such a scenario, but fortunately there is a Registry setting you can specify on the client before running the WSE connection setup that will prevent the setup routine from adding the client to the domain.

Simply open a command window (run as Administrator) and execute this command:

reg add "HKLM\SOFTWARE\Microsoft\Windows Server\ClientDeployment" /v SkipDomainJoin /t REG_DWORD /d 1

See this TechNet forum post for more details: How to skip domain joining during client deployment in a Windows Server 2012 Essentials network

That works great, but I've recently come across another problem. It turns out that the WSE Connector software alters the network settings on the client computer to route all DNS requests through the WSE server. I generally wouldn't mind this so much, but I frequently get name resolution errors in my web browser on my client machine when the DNS settings are pointing to the WSE server. Manually changing the DNS settings back to their defaults works for a while, but I found something would reset the DNS setting back to the WSE machine a short time later.

Today, I finally found a blog post that outlines how to fix this behavior. Paul Braren at TinkerTry spells it all out in his article Windows Server 2012 Release Candidate Essentials remote client loses its Internet connection, here’s your DNS-related fix. In short, the problem is caused by the Windows Server LAN Configuration service on the client computer. Disabling this service prevents the DNS settings from being altered to point to the WSE server.

All is well again.





Still Stuck With An iPhone

I recently upgraded my cell phone from an iPhone 3GS to an iPhone 5. I had REALLY wanted to get a Windows Phone 8 device, but the killer feature for me was iPod integration in my car, and my car doesn't support streaming over Bluetooth. I have an AUX jack, but that just seems so primitive. So I was "stuck" with the good ol' iPhone.

As it turns out, the iPhone 5 isn't so bad, though I still think the screen is too small. There are several advantages to Apple's little iOS device, and near the top of that list is the fact that everyone gets the latest OS release at the same time. There is no "oh, well, carrier X is going to release it today but carrier Y won't release it for another two months." This is an endemic problem in the Windows Phone world, and possibly even a worse problem in the Android world. For example, last month Microsoft released their "Portico" update to Windows Phone 8 and to their credit Rogers in Canada pushed out this update almost immediately to their HTC 8x customers. However, 40 days later and Bell (my preferred carrier) is still sitting on this update and they're not saying when they'll release it. (I understand Verizon in the US is also dragging their feet on this.)

In addition to the immediate global availability of iOS updates, I have to give Apple credit for still supporting iPhones as far back as the 3GS. Sure, it doesn't have all the latest features (like Siri) but they're still supporting their loyal customers with older devices. Kudos. Another reason that Apple has kept me as a customer is the fact that I can buy a phone directly from them, unlocked. Buying any other manufacturer's phone in an unlocked state is often not straightforward. I can't buy a Lumia 920 directly from Nokia. I can't buy an HTC 8x from HTC. I can't even buy a Galaxy SIII from Samsung. Though there are some vendors who sell these items unlocked, they are not major retailers and I don't know whether to trust them or not.

One more thing that Apple has in their favour is the available AppleCare+ warranty and insurance for the iPhone. I never buy extended warranties for anything but I paid an extra $100 for this. Why? It's not so much the additional one-year warranty (for a total of two years) but rather the "accidental damage" insurance. If I drop my iPhone and break the glass, for example, I pay Apple $49 and they give me a brand new iPhone. I can do that twice within the two-year AppleCare+ coverage period. I don't know of any other smart phone that offers that optional coverage.

So, yes, I'm "stuck" with the iPhone. I really prefer the Windows Phone 8 operating system and I would like a bigger screen, too. But, it turns out being stuck in the iPhone world isn't so bad after all, and I haven't even mentioned the ubiquity of iOS apps. I guess I can be satisfied with my situation, for now.


Hardware | Tech | Toys

Visual Studio 2012, Extension Updates, and (417) Expectation failed

While running Visual Studio 2012 behind a proxy, I received the following error message when checking for extension updates in the Visual Studio Gallery:

The remote server returned an unexpected response: (417) Expectation failed

A look around Google finally revealed a solution here:


Basically, he's saying the fix is to edit your devenv.exe.config file and, within the <settings> element under <system.net>, change this:

     <ipv6 enabled="true"/>

to this:

     <ipv6 enabled="true"/>
     <servicePointManager expect100Continue="false"/>

Thanks, Jeff!


Web Development

JavaScript: Rise of the Single Page Application

Last month I wrote about the current state of HTML application development and how it has changed drastically over the past five years. In that post I mentioned a lot of technologies that have evolved, but one area in particular that is in the midst of an amazing sea change is where the user-interface logic is executed.

Historically, most web applications have used a server-templating model where an application server such as Microsoft's ASP.NET processes data and emits it as HTML which is rendered on the client. If the user clicks a button or link on that web page, an HTTP POST or GET operation is performed on the application server and a completely brand-new HTML page is returned to the client browser to be rendered again. The problem with the server-templating model is the annoying screen refresh that occurs between postbacks and page re-draws, and the amount of time required to re-render the new page, often when very little actual data on the page has changed. AJAX techniques can aid immensely with this problem, only updating the areas on the page that have changed. However, this is only half of the solution and a further step must be taken to bring a truly rich user experience to a web-based application.

Enter the JavaScript "single-page application" (SPA). This is a relatively new technique (well, Google's GMail has been using it since 2004 but it's only now starting to catch on in the mainstream) where only a single HTML page is downloaded to the client browser and all further updates are performed by JavaScript code. In fact, the single page that is downloaded is often no more than a shell used to bootstrap the JavaScript application. Once initialized, the application runs completely in the browser with no further HTML pages generated by the server. Instead, the JavaScript code is responsible for creating the HTML DOM elements and binding data to them. This reflects a truer image of the client-server architecture that has existed for decades and is the model most frequently used in Flex and Silverlight applications that run in a browser plug-in (no HTML).

This is not to say that the application server no longer serves any data, it simply no longer wraps that data in HTML. That is the job of the SPA running in JavaScript code on the browser. JavaScript will issue a remoting call to the server for data, which is most often returned in JSON format. Using a client-side databinding framework (such as Knockout.js) this data is then bound to an HTML construct. Any changes are sent back to the application server, once again in JSON format, where the server then performs operations on the data, such as updating a database. The result is a more responsive user-interface that works much like any other rich application developed in more traditional client-side technologies.
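
The databinding half of that story can be sketched without a browser. Knockout's ko.observable works along these lines (this toy version is mine, not Knockout's code): an observable holds a value, is read and written through a function, and notifies subscribers - such as DOM-updating bindings - whenever the value changes.

```javascript
// A toy observable in the spirit of ko.observable: call with no
// arguments to read, with one argument to write and notify subscribers.
function observable(initial) {
  var value = initial;
  var subscribers = [];
  function accessor(newValue) {
    if (arguments.length === 0) return value;        // read
    value = newValue;                                 // write
    subscribers.forEach(function (fn) { fn(value); });
  }
  accessor.subscribe = function (fn) { subscribers.push(fn); };
  return accessor;
}
```

In a real SPA, the subscriber would be a binding that updates an HTML element, so the page stays in sync with the JSON data received from the server.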

From where I'm sitting, 2012 is shaping up to be the year of the SPA. There has been a tremendous explosion in the number of JavaScript libraries and frameworks designed to provide more power to the client-side developer. Which is truly fantastic - front-end developers have more capabilities at their fingertips than ever before, and it just keeps expanding every month. However, the exploding popularity of JavaScript components is also part of the problem. Many, like me, have no idea where to start. There certainly isn't a clear winner in the race to establish the de-facto framework, and in fact many existing frameworks are happy to just provide what they believe is necessary to fill a given need. Assembling all the pieces to create a complete framework for a web application project is still a task left to the developer.

Which is not so much a bad thing, but compare this with the RIA frameworks of Flex and Silverlight. Using those rather completely-featured frameworks, you may wish to incorporate the odd component or pattern and maybe some additional UI controls, but JavaScript SPA developers have many more choices than that to make. One of the big problems with JavaScript in the browser is that it was never designed to house large applications. Modularizing your JavaScript code involves a peculiar set of work-arounds to address the fact that JavaScript is more object-based than object-oriented (though some may disagree), and it requires the use of closures to achieve some degree of modularity. Among other things, this often leads to what can best be called namespace collisions, since any variable defined outside of a function in any script file lands in the global namespace. So when you bring all these pieces together in your application, you will often find conflicts that you must resolve yourself.
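
The standard work-around for those collisions is the module pattern: wrap your code in an immediately-invoked function expression (IIFE) so that a closure keeps helpers private and only a single name is added to the global namespace. A minimal example:

```javascript
// The module pattern: an IIFE whose closure hides private state,
// exposing only a small public API under one global name.
var counterModule = (function () {
  var count = 0;                     // private: invisible outside the closure
  function increment() { return ++count; }
  return { increment: increment };   // public API
})();
```

Code in other script files can call counterModule.increment(), but cannot touch (or accidentally clobber) the private count variable.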

Despite these challenges, the SPA is emerging as the preferred methodology for developing rich and responsive applications in a browser (in no small part because Flex and Silverlight aren't supported on increasingly popular mobile devices). User-interface control vendors are now beginning to offer commercial JavaScript control libraries that have no particular server requirement at all - they consist only of client-side JavaScript code. Developer productivity is still a challenge in this environment as IDEs still have a difficult time with the dynamic nature of JavaScript. But advances are being made and the SPA space is absolutely electric with increasing developer interest and daily news on technique improvements. Indeed, it will be very interesting to see and participate in the SPA revolution over the next couple of years.


Web Development