I keep getting distracted from writing this long thing by responding to the discussion created by the last short-ish thing, but I wanted to explicitly call out one aspect, namely that standards are a form of insurance.
More correctly -- and apologies if this sounds like a Planet Money episode -- vendors sell derivatives contracts (insurance) against the proprietary nature of technologies; that is, a hedge against the winner-take-all dynamics of network effects and the potential for monopoly rent extraction. Adopters of a technology often refuse to buy it at all without the availability of a derivative to cover their risk of lock-in.
The nice bit about these contracts is that their terms (the price) are free to the public -- ITU and other old-skool orgs are exceptions -- meaning anyone who wants to implement can assess what it'll cost, and if they can provide a solution at a lower end-user price, they can compete under the same terms. The public nature of the derivative contract can have the effect of lowering the price of the good itself in markets that clear and have many participants.
Standards organizations are correctly understood as middle-men or market-makers for these contracts.
Update: It may seem strange to some that derivatives contracts (insurance) can drive prices down, but it happens in many industries. E.g., satellite insurance has done wonders for the commercial launch vehicle industry. And who would fund the construction of new cargo ships if you couldn't convince enough people to ship their precious goods? Insurance matters.
Karl Dubost asked what a plan would look like for a W3C split along the lines I proposed in my last post. It's a fair question, so let me very quickly sketch out straw-men while noting that I would support many alternative plans as well. The shape of the details might matter, but not as much as movement in the right direction, and I have faith that the W3C members and staff would do a good job of executing on any such plan. Here are some general forms it could take, both of which seem workable to me:
Spin-Off With Cross-licensing
Taking many of the W3C's activities to a different organization might be traumatic for some members due to IPR concerns. If the members agree that this is the most important consideration, a pure split could be effected, leaving smaller, parallel organizations with identical legal arrangements and identical IPR policies in place. Member organizations would, in a transition period, decide to move their membership to the spin-off, stay with the W3C, or join both (perhaps at a discount?). A contract between the new organizations would allow perpetual cross-licensing of applicable patent rights.
The risk to this plan is that it may not be clear that one new organization would be appropriate -- remember, we're talking about many strange bedfellows under the current W3C umbrella -- so perhaps this may only work for whatever center of gravity exists in the spun-off concerns. Smaller efforts (Java-centric APIs, etc.) may want to find other homes.
Merger
One or more of the sub-efforts on the block may wish to find a new home in an existing standards organization. For these efforts there is likely to be a larger mismatch between processes, IPR policies, etc. than a spin-off would entail, but the benefits to both receivers and members are clear: a viable home for the long term is hard to build, but coming home to "your people" isn't nearly as difficult. XML standards, for instance, have a long history at OASIS, and most members who care primarily about XML at the W3C are likely active there already. Transfer of IPR in this case is likely to be more fraught, so contracts licensing from the W3C to these new organizations would also need to include clauses requiring the receiving orgs to exercise processes similar to the ones currently in place for the standards whose ownership they receive. The contract would also need to stipulate that RF terms never be endangered for new versions of divested specs. I haven't thought hard about transfer of membership under this scenario, but pro-rated and discounted membership terms at all merging organizations -- but only for divested activities -- might work.
Whatever path a breakup takes, the W3C should listen to each of the non-web communities it's spinning off and work to ensure that their needs are met in the process. There are lots of details I don't have time to think through right now, but none of the ones I can identify seem like deal-breakers. The biggest missing piece is the political will to make something happen in the first place.
This is a quick thought as I'm working on something much longer (and hopefully more interesting) to be published in the next week or so. Also, I just want to re-iterate that what's said here are my own thoughts and not those of Google unless expressly stated otherwise.
I've been arguing -- mostly over beers -- for the last year or so that the W3C needs to find a way to re-focus on the needs of the constituencies that give it credibility; namely web developers and browser vendors. This tends to fall on frustrated W3C staffer ears: individuals might agree, but the organization feels the effects of its failures not as a question about how to re-focus around a credible mission but as a short-run funding shortfall. I'm excited that the new Community Groups have the potential to help fix some of this by allowing a freer flow of ideas and a more representative expression of demand, but there's the lingering pay-for-play issue that I think is liable to drag the organization back to the status quo as soon as the excitement and newness of Community Groups wear off.
Now, let me say clearly up front that I don't think that standards bodies are charities, nor should they be. People join (in the case of the W3C, "people" == "companies", an equivalence that works as US legal fiction but never in the real world) because they're expressing their economic interests in agreement on thorny, market-focused issues. The W3C, as an organ, has grown from an IETF alternative into a critical mediator in the flow of ideas and technology in the web ecosystem. When it fails, we're starved of progress because nothing else can easily grow in its place. Bless Hixie and the WHATWG for trying, but the W3C sits on fertile, spec- and patent-rich land. It's tough growing in the rocky crags, particularly without a patent policy.
So, in the spirit of reform, not revolution, I have a proposal, complete with Swiftian modesty:
At this year's TPAC, it should be agreed that the W3C will divest itself of any and all Semantic Web, RDF, XML, Web Services, and Java related activities. SVG can be saved, but only if it re-charters to drop all XML dependencies in the next version.
Where should these specs and member organizations go to get their itches scratched? I'm not sure it matters, although it seems likely that OASIS wouldn't mind being the go-to place for XML. And there's always ECMA, or IETF, or ISO, or...you get the idea.
Why do such a drastic thing?
Well, it's not that drastic, really. The SemWeb folks use the term "web" in a hopeful, forward-looking way that nobody else does. Their technology is showing promise inside firewalls, but their work has near zero relationship to the actual, messy web we swim in every day. The result is that their narrow constituency isn't the one that animates whatever legitimacy the W3C might enjoy, and therefore their influence in the W3C is out of all proportion to their organizational and web-wide value. Sir Tim's belief in their goal is noble, but to use the suffix "web" on it today is simply wishful thinking. And things that don't have anything to do with the web probably don't belong at the W3C. Besides, the W3C imprimatur doesn't help these specs and groups as much as their advocates might think -- standards bodies aren't venues for creating the future after all, only for cleaning it up in response to the messy, explosive process of market-driven evolution. Anyway, being endlessly frustrated that the entire web hasn't changed to accommodate some new model even though it has the phrase "web" in the name can't be fun. Technologies need to find their success where they really do fit, and distributed extensibility and semantics might be valuable...but it hasn't been to the web yet. It needs to go.
As for XML, well, it won the enterprise and lost the web. That's still success, but it's not the web.
Rinse, repeat for RDF, Web Services, and Java.
What would that leave us with? CSS, DOM & JS APIs, HTML, a11y, i18n, and all the other stuff that has legs out on the public web. More to the point, it would have the beneficial effect of re-focusing the organization around getting HTML5 done, getting DOM APIs done that don't sacrifice JavaScript on the altar of IDL "language neutrality", etc.
Organizationally, it would leave the W3C in a much leaner, more focused place. It's much easier to build a focused constituency and membership, and the lower cost structure would only help the organization weather hard times. I have high hopes that it can be brutally effective again...but to get there, it needs to focus, and focus is the process of ignoring things. The time has come for the W3C to grab the mantle of the web, shake off its self-doubt, and move to a place where doing good isn't measured by numbers of specs and activities, but by impact for web developers.
There's a lot of "don't turn it into Java!" in the comments on my last post, and I feel it needs a response -- and not because I think there's anything to like about Java's class or type systems (perhaps that they exist at all? I'm torn on even that).
Let's look at some possible ES.next code and walk through how it relates to today's equivalent JS:
module widgets {
  // ...
  export class DropDownButton extends Widget {
    constructor(attributes) {
      super(attributes);
      this.buildUI();
    }

    buildUI() {
      this.domNode.onclick = (e) -> {
        // ...
      };
    }
  }
}
Ignoring the semantic improvements of Harmony modules over the module patterns we're using today, we can see a pretty straight-up de-sugaring of that to today's JS, with an emphasis on the many and varied uses of the word function:
var widgets = (function(global) {
  // ...
  function DropDownButton(attributes) {
    Widget.call(this, attributes);
    this.buildUI();
  }

  DropDownButton.prototype = Object.create(Widget.prototype, {
    constructor: { value: DropDownButton },
    buildUI: {
      value: function() {
        this.domNode.onclick = function(e) {
          // ...
        };
      }
    }
  });

  return { DropDownButton: DropDownButton };
})(this);
What you should take away from this isn't that we're turning JS into Java, it's that we're making it easier to read, AND THAT'S ALL. Once more for effect: what class means in ES6 is function. Or at least one of the things you currently do with function.
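You can verify this in any ES6 engine. A minimal sketch (the Widget base class here is a stand-in for illustration, not a real library):

```javascript
// Minimal sketch: `class` declares a constructor function plus a
// prototype chain -- the same parts we wire up by hand today.
class Widget {
  constructor(attributes) {
    this.attributes = attributes;
  }
}

class DropDownButton extends Widget {}

// The `class` keyword produced... a function.
console.log(typeof DropDownButton); // "function"

// And the inheritance is ordinary prototype delegation:
console.log(Object.getPrototypeOf(DropDownButton.prototype) === Widget.prototype); // true
```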
Better yet, we're blessing a single pattern which you can use with old-style classes just as easily as you do today. I'll leave it as an exercise for you to name all the different class patterns and systems that do nearly the same thing in slightly incompatible ways today. It's a very long list.
If you like JS, and you like functions + prototypes (I do!), you have nothing to fear from the sugar we're adding in ES6.
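The sugar also composes with the constructor-function classes we already write; a sketch (names invented for illustration):

```javascript
// OldWidget is the kind of constructor-function "class" we write today.
function OldWidget(name) {
  this.name = name;
}
OldWidget.prototype.describe = function() {
  return 'widget: ' + this.name;
};

// New-style syntax can extend it directly -- no rewrite required,
// because `class` builds the same prototype chain underneath.
class FancyWidget extends OldWidget {
  describe() {
    return 'fancy ' + super.describe();
  }
}

console.log(new FancyWidget('menu').describe()); // "fancy widget: menu"
```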
Update: A lunchtime conversation here at DojoConf today reminded me of perhaps the largest thing that this new system doesn't do: ES.next classes aren't tied to a Java-like type system. Many of the conjoined fears of Java come from the very real pain created by a strict, nominal type system, and nobody on the committee is proposing a mistake like that -- least of all me.
There's very little public information yet about Dart (née Dash), and as I'm not on Lars' team I can't comment about it. More details will be forthcoming at the GOTO session next month. I'll also be at GOTO, speaking on JavaScript and the state of the web platform.
Making the rounds is an accidentally leaked early draft of notes from a meeting last year that discusses both Dart and JavaScript. I work on many web platform-related things at Google, including serving as a representative to TC39, the body that standardizes the JavaScript language. I wasn't at the meetings, having previously committed to presenting at FFJS, but my views were represented by others and my name is on the document. As I said, though, it was a draft and doesn't reflect either the reality of what has happened in the meantime or even the decisions that were taken as a result. And it certainly doesn't reflect my personal views.
So what's the deal with Google and JavaScript?
Simply stated, Google is absolutely committed to making JavaScript better, and we're pushing hard to make it happen.
Erik Arvidsson, Mark Miller, Waldemar Horwat, Andreas Rossberg, Nebojša Ćirić, Mark Davis, Jungshik Shin and I attend TC39 meetings, work on implementations, and try to push JS forward in good faith. And boy, does it need a push.
Erik and I have specifically been working to focus the TC39 agenda away from syntax-free, semantics-only APIs (Object.defineProperty, anyone?) which might be good for tools, compilers, and frameworks but which are unwieldy for day-to-day use.
Through Traceur and other efforts we've been socializing the idea that the one thing the committee can exclusively do -- and should do more of -- is to carve out syntax for commonly exercised semantics. Seemingly small things like the class keyword as sugar for the constructor-function pattern, or a shorter syntax for functions, are big improvements, if only because it's TC39 that's making them. Syntax can end battles over common patterns and help you say what you mean, and JS is overdue for some of this. Larger but more subtle things, like agreement to use pragmas as a way to enable the process of progress in a compatible web, are even more exciting to me. Even proposals that haven't made it through, like Scoped Object Extensions and deferred functions, have been fought for by us because we desperately want JavaScript to get better. We're big fans of much of the work coming out of Mozilla for the same reason -- in particular Dave Herman's excellent modules proposal. Together, these are going to do much to help give modern JS some "backbone" in the near term. I'm proud of what we've accomplished in TC39 over the last year, and while I hoped for more, the result is an ES.next that looks like it'll embody many of the things I feel are currently missing. The day-to-day practice of writing JS is going to change dramatically for the better when ES.next arrives.
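To make the "semantics-only" complaint concrete, here's a sketch (a hypothetical Counter example, not committee code) of what Object.defineProperty asks of everyday code versus what class sugar gives you for free:

```javascript
// ES5: semantics without syntax -- powerful, but a mouthful.
// We define a non-enumerable method the long way.
var counter = {};
Object.defineProperty(counter, 'increment', {
  value: function() { this.count = (this.count || 0) + 1; return this.count; },
  writable: true,
  enumerable: false,
  configurable: true
});

// ES6: class methods are non-enumerable by spec, from syntax alone.
class Counter {
  increment() { this.count = (this.count || 0) + 1; return this.count; }
}

console.log(counter.increment());       // 1
console.log(new Counter().increment()); // 1
```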
But it's not the end -- not by a long shot. Classes will give us a humane, interoperable inheritance syntax, but they leave composition unaddressed by syntax. I'm hopeful that we bless traits in future versions, removing the use of inheritance in most cases. Similarly, I think we can find a way to repair "this"-binding foot-guns with a softly-bound "this". Repairing the shared-prototypes issue, either through DOM or through something like Scoped Object Extensions, can and should be done. And once we have all of this, the stage will be set for a flexible, advanced type system that does not need to be all-or-nothing and does not need to be hobbled by the ghost of C++/Java's inflexible nominal-only types. That's the dream, and we're not shying away from it.
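The "this"-binding foot-gun is easy to demonstrate. A sketch of the manual repair we write today (a softly-bound "this" would make the explicit bind unnecessary):

```javascript
// Detaching a method from its object loses the receiver.
var widget = {
  label: 'ok',
  getLabel: function() { return this.label; }
};

var detached = widget.getLabel;

// Called bare, `this` is no longer `widget`, so the lookup goes wrong
// (undefined in sloppy mode, a TypeError in strict mode):
var result;
try { result = detached(); } catch (e) { result = undefined; }
console.log(result === 'ok'); // false

// The manual fix we all write today:
var bound = widget.getLabel.bind(widget);
console.log(bound()); // "ok"
```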
It's hard to square this sort of wild enthusiasm for "raw" JavaScript with what's in the leaked memo, and I can only beg for some amount of understanding. As committed and enthusiastic as I am about the prospects for JavaScript, others are just as enthused about Dart. Google is big, can do many things at once, and often isn't of one mind. What we do agree on is that we're trying to make things better the best we know how. Anyone who watches Google long enough should anticipate that we often have different ideas about what that means. For my part, then, consider me and my team to be committed JS partisans for as long as we think we can make a difference.
Reality Check
There are risks, of course. TC39 is long on seasoned language-design skill and short on webdev experience, meaning that many things Erik and I take for granted as pressing problems need to be explained, sometimes to an incredulous audience. The flip-side risk is that naïve solutions may have better alternatives that seasoned language hands can quickly spot, and that simple answers may carry non-obvious risks or preclude movement in other important areas later. It's good, then, that the committee is working well and is taking appeals to developer productivity seriously.
Whatever you might think about programming languages for the browser, let me assure you of one thing: your problem isn't the language. Not really, anyway. We've made good progress in the last year repairing some of the seams between JS and DOM, and Cameron McCormack has helped us drive a new version of WebIDL that correctly explains DOM as a reasonable prototype chain. But it's only the beginning. The DOM is in terrible shape, and not due to implementation differences. The malign neglect of IDL-addled designs, the ghosts of dead-end XML experiments, and endless cruft will plague any language that sidles up to it. Until we get a real Component Model, a better CSS OM, and some sort of a pragma for DOM that allows us to fix DOM's abhorrent JS bindings, we'll continue to be hostage to C++ APIs inartfully wired up to an incredibly dynamic language. And that's to say nothing of the pressing need for better CSS, animations, and a built-in data-binding/templating system. When the platform doesn't provide these things, we get them from JavaScript today. But that's not where the solutions always belong.
Let me put it another way: when you find yourself thinking "man, JavaScript sucks," remember that it's only painful in large quantities. And why do you need so much of it? 'Cause the DOM, CSS, and HTML standards are letting you down. Any language wired up to the browser today is subject to the same fate, and to the insane reality that these things are specified under different roofs in processes that aren't subject to the popular will of web developers. Python doesn't have its DOM APIs decided by the W3C; it borrows the idiomatic ElementTree API from within its own community. WebIDL is an artifact of a different time that has a tenuous relationship to idiomatic JavaScript, the CSS-OM barely exists, and DOM apologists are doing more harm than any of JavaScript's warts ever have. We can fix these things, of course, and here at Google we're trying -- in good faith -- to work in standards to make them better. But the bottom line is that the language isn't the problem. I repeat: the language isn't the problem, the platform is.
The only thing that's going to replace the web as universal platform is the next version of the web. Those of us working on Chrome believe that to the core and feel a deep urge to make things better faster. We might not always agree on the "how", but we all believe that we can't do it alone.