Infrequently Noted

Alex Russell on browsers, standards, and the process of progress.

And Now For Something Entire....Oooh! Shiny!

The Google O3D team just launched and the news stories are already starting to trickle out.

Ok, so it's shiny...but so what?

First, O3D embeds V8. This means that while you might be running your O3D code in a browser with a terrible JavaScript engine, or worse, an engine with terrible GC pauses, your O3D content isn't subject to those problems. This is a Big Win (TM). Most of the web can limp by with bad GC behavior, but interactive 3D just can't. You might have seen the difference this makes by running Dean's Monster demo in Chrome and then trying it in other browsers.

Next, O3D presents a scene graph. Direct-mode proposals in the 3D-on-the-web discussion are based on the idea that JavaScript programmers will ship enormous toolsets down the wire in order to re-create the scene graph and/or to parse shape descriptions. Having direct access to the OpenGL surface description is incredibly powerful, but I suspect not sufficient in the long term to really bootstrap a world where 3D is a first-class citizen. Also, using the web as a way to break open some of the closed interchange challenges of today's 3D world isn't going to happen when everyone's description of things is entirely programmatic, so I'm excited by the direction of O3D as a force for good.

Congrats to the O3D team. It's a big day for them, and they deserve huge props for shipping concurrently on Windows and Mac.

Ending The ga.js Wait

Google Analytics is ubiquitous, not least of all because it's better at what it does than most of the alternatives. Also, it doesn't require any install or maintenance. And it's free. What's not to like?

Frankly, not much, but if I had to nitpick, I'd note that the worst part of Google Analytics is the ga.js script that does the actual tracking of content. It's not bad, in and of itself, but it tends to load slowly. There are several reasons for this:

So how to fix this? Dojo 1.3's dojox.analytics.Urchin to the rescue! Given that I was going to be including Dojo on the page to do other things anyway, I can use 1.3's new Urchin system to help amortize the cost of using Google Analytics. This blog now includes the following (more or less):

<script type="text/javascript"
        src="https://ajax.googleapis.com/ajax/libs/dojo/1.3/dojo/dojo.xd.js">
</script>
<script type="text/javascript">
  dojo.addOnLoad(function(){
    setTimeout(function(){
      dojo.require("dojox.analytics.Urchin");
      dojo.addOnLoad(function(){
        var tracker = new dojox.analytics.Urchin({
          acct: "UA-XXXXXX-X" // your tracking # here
        });
      });
    }, 100);
  });
</script>

You can see the real thing by looking at the bottom of this page, which pulls in custom.js containing this logic. Pete Higgins blogged about how the module works when he first wrote it. The strategy is to load Dojo from a CDN (since the page wanted it for other things) and wait until after the page has loaded to include ga.js. This delayed loading ensures that the page is responsive before we start doing anything related to tracking. The nested calls to dojo.addOnLoad show how we can wait for one set of modules to finish before kicking off another group; in this case, we also wait until after the page is responsive to load dojox.analytics.Urchin. That module further delays loading of ga.js, so it doesn't matter how long it takes for DNS to resolve and for Google to serve ga.js, since none of it ever blocks the user.
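The same defer-until-after-load idea can be sketched without Dojo. This is a hypothetical helper (not part of ga.js or dojox.analytics.Urchin); the document and window parameters are passed in only so the sketch stays self-contained:

```javascript
// Hypothetical framework-free sketch of the deferral pattern: wait for
// the window "load" event, then wait a beat longer, then inject the
// tracking <script> so it never blocks the page becoming responsive.
function deferScript(src, delayMs, doc, win) {
  win.addEventListener("load", function () {
    win.setTimeout(function () {
      var s = doc.createElement("script");
      s.type = "text/javascript";
      s.src = src;
      doc.body.appendChild(s); // browser fetches ga.js only now
    }, delayMs);
  });
}

// Usage (in a browser):
// deferScript("http://www.google-analytics.com/ga.js", 100, document, window);
```

The point is the ordering, not the helper itself: nothing tracking-related happens until the user already has a working page.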

Looking at the "before" and "after" shots in Firebug gives you some idea of how this technique can really make a difference:

[Screenshot: onload waits for ga.js]

[Screenshot: using dojox.analytics.Urchin, we can get a responsive page and *then* track usage]

We get page tracking and the user gets an interactive page faster. Rad.

Omaha Goes Open Source

Google Updater, aka "Omaha", has gone Open Source!

This is the auto-update system that's key to keeping Chrome secure by always ensuring that the version you're running is the freshest it can be. It's huge for the Omaha team to be out in the open, particularly given how many inaccurate articles have been penned about the update system. Now you, dear user and/or journalist, can know exactly what the update system is doing all the time. It's all right there in the code.

Dojo 1.3 Is Out!

Dojo 1.3 is here! (download site)

If you're already using Dojo, this should be a no-brainer upgrade. It's out-and-out better. As a quick example, dojo.create("tagname", { /* properties */ }) is now the preferred way to build DOM nodes quickly. Its simple API will be natural to anyone who has used dojo.attr(). Even better, Pete's exciting PlugD version of dojo.js has been updated to 1.3 as well.
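The shape of the API is easy to picture with a stripped-down re-implementation. This is a hypothetical sketch, not Dojo's actual source: the real dojo.create(tag, attrs, refNode, pos) also takes a reference node and position, and routes attribute setting through dojo.attr() to handle browser edge cases, which this sketch skips:

```javascript
// Hypothetical, simplified sketch of the dojo.create() idea: build a
// node and mix in its properties in one call. The document object is a
// parameter only so the sketch is self-contained.
function create(tag, props, doc) {
  var node = doc.createElement(tag);
  for (var name in props) {
    if (props.hasOwnProperty(name)) {
      node[name] = props[name]; // real dojo.create uses dojo.attr() here
    }
  }
  return node;
}

// Usage (in a browser):
// var div = create("div", { className: "note", innerHTML: "hi" }, document);
```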

1.3's Core features the new "Acme" CSS selector engine which provides a big boost in speed for many operations in the fast-path. I blogged before about the work we did to make Acme fast, and rest assured it is (in aggregate, across all use cases) quicker than any other selector system you can get your hands on today. But selector performance isn't where it's really at, and I've been saying that for a long time.
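The "fast-path" idea can be illustrated with a toy dispatcher, loosely in the spirit of what a selector engine like Acme does; this is not Acme's real code, and the function name is made up for illustration:

```javascript
// Illustrative fast-path dispatch: route trivial selectors to cheap
// native DOM calls before falling back to a full selector parser.
// (Hypothetical sketch; a real engine handles far more cases.)
function fastQuery(selector, doc) {
  if (/^#[\w-]+$/.test(selector)) {       // "#id" -> getElementById
    var n = doc.getElementById(selector.slice(1));
    return n ? [n] : [];
  }
  if (/^[\w-]+$/.test(selector)) {        // "tag" -> getElementsByTagName
    return doc.getElementsByTagName(selector);
  }
  // A real engine would parse and evaluate the complex selector here.
  throw new Error("complex selector: full engine needed");
}
```

Most real-world queries hit one of these trivial shapes, which is why the fast path matters so much in aggregate.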

Luckily, Pete Higgins decided to prove it and has been working on a new set of benchmarks with the help of other toolkit vendors (to ensure fairness) called "TaskSpeed". Dojo 1.3 wins by a wide margin. Across all the reported browsers so far, Dojo is at least 2 times faster than other toolkits on common DOM operations. We've worked very hard over the years to make sure that Dojo's APIs don't encourage you to do things that will hurt you later, and TaskSpeed finally shows how much this philosophy pays off:

[Chart: TaskSpeed results across toolkits and browsers]

The numbers above are from TaskSpeed, a new toolkit benchmark developed by Pete Higgins with tests contributed by other toolkit authors to ensure fairness. Shorter is better.

DOM manipulation is a big bottleneck in today's apps, usually just behind network I/O, and these tests demonstrate how Dojo's approach to keeping things fast pays off not just on micro-benchmarks like CSS selector speed, performance improvements to single toolkit functions, or even file size, but on aggregate performance where it really matters. Dojo's modern, compact syntax for these common operations doesn't slow it down, either. For instance, if you go check out the TaskSpeed reporting page, you'll see that where browsers are slowest (IE6/7/8, etc.), Dojo's focus on performance pays off most. Why use a toolkit that's going to hurt you when it really counts, particularly when Dojo is so easy to get started with?

Dojo's Core has been designed from the ground up with APIs that encourage you to do things that are fast and keep you from doing things that are slow unless you really know what you're doing. In some cases, we've made hard size-on-the-wire tradeoffs in order to keep actual app performance speedy. That hard engineering doesn't show up in micro-benchmarks, single-test release-over-release improvements, or the "my toolkit is smaller" comparisons that some would prefer web developers focus on. It's easy to win rigged games, after all. It's only when you see APIs composed together in real-world ways, across browsers, that you can start to see the real impact of a toolkit's design philosophy. Dojo is designed to help you make things that are awesome for users, and that means they need to be FAST.

Other toolkits have released performance numbers of late, and most of them have either been reported badly or been run without much rigor, so it's exciting to see everyone finally pitching in to build end-to-end tests that show how library design decisions interact with the real-world realities of browsers. The TaskSpeed tests have been designed to be both even-handed and reliable (no times below timer resolution, etc.). The reporting page is also designed to make the results understandable and put them in context. A lot of care has been taken to keep this benchmark honest. JavaScript developers have suffered at the hands of chart junk for far too long.
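The "no times below timer resolution" idea is worth a sketch. This is illustrative, not TaskSpeed's actual harness: repeat a test until enough wall-clock time has elapsed that the measurement means something, then report the average:

```javascript
// Hypothetical sketch of a resolution-safe timing loop: millisecond
// timers can't measure a single fast DOM operation, so run the test
// repeatedly until at least minMs has elapsed and average the result.
function timePerCall(testFn, minMs) {
  var iterations = 0;
  var start = Date.now();
  var elapsed = 0;
  do {
    testFn();
    iterations++;
    elapsed = Date.now() - start;
  } while (elapsed < minMs);
  return elapsed / iterations; // average milliseconds per call
}
```

A benchmark that reports a raw "0ms" or "1ms" for a single run is reporting timer noise, not toolkit speed; amortizing over many iterations is what keeps the comparison honest.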

I can't do 1.3 justice in a single blog post, so I recommend that you check out these resources and then just dive in:

Big thanks to the folks who tried out the betas and RCs and helped make 1.3 solid.

RMS: Crazy Is As Crazy Rants

I suppose we had it too good. JavaScript hackers of the world lived in relative licensing bliss. Organizations like the Dojo Foundation built and preserved large swaths of high-quality code for anyone to build on, and even the outlier toolkits eventually came in from the cold. Open projects were progressing toward ever more transparent and community-driven development. Politics, of course, existed, but BSD-licensed code was the norm and Foundations helped guarantee the rights of users.

Alas, no, we've been doing it all wrong. Excuse me while I go rinse the taste of situational ethics and lost plots out of my mouth.
