PhaultLines

Watching the Web evolve at Fluent 2014

I attended O’Reilly’s Fluent conference in San Francisco earlier this month. The annual event covers various aspects of the Web platform, with a particular focus on frontend development. The talks I attended ranged widely, from JavaScript performance improvements to unconventional uses for the CSS border-radius property.

Fluent gave me an opportunity to reflect on the manner in which the Web is evolving and the opportunities that will arise as it continues to mature. As a platform, the Web benefits from the collective investment of many different stakeholders. An incredibly diverse assortment of companies are committing resources to help advance new standards that extend the capabilities of the Web, improve the performance of HTML rendering engines and JavaScript runtimes, and bring Web technology to new devices and platforms.

Brendan Eich, the creator of JavaScript, gave a keynote presentation that highlighted the language’s future. He started by discussing the high-level syntactic improvements introduced in ES6, the next version of the language. In the second half of his presentation, he showed how asm.js and SIMD make JavaScript increasingly suitable for low-level programming, paving the way for new kinds of performance-sensitive content to come to the Web platform.

The most compelling demo in Eich’s presentation was Unreal Engine 4, an extremely impressive new generation of Epic’s powerful game engine. Brendan showed sample UE4 content running in Firefox, with incredible atmospheric lighting effects and other stunning visuals. The performance delta between Web and native is shrinking, making it possible for incredibly rich, hardware-accelerated 3D experiences to come to the browser.

Better performance

In a particularly interesting session at the conference, an Intel representative talked about the company’s multifaceted collaboration with browser vendors. They are working on increasing parallelization so that rendering engines can take better advantage of modern multicore CPUs. They are also working to help HTML rendering engines take advantage of hardware acceleration more pervasively so that even more work can be offloaded to the GPU.

Improvements to rendering engine implementations can deliver fairly immediate and impressive results. Intel’s Mohammad Haghighat talked about a research project in which parallelization resulted in a 42% performance boost in Firefox’s layout engine.

A slide from Haghighat’s presentation

There are also a number of exciting projects in the standards ecosystem that will give developers the ability to unlock more performance in their own Web applications. SIMD support, which is proposed for ES7, will help developers speed up many kinds of multimedia processing tasks. According to Intel’s benchmarks, SIMD delivered performance gains of 4–10x for certain kinds of workloads.
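To make the “multimedia processing” claim a bit more concrete, here is a minimal, hypothetical example of the kind of data-parallel inner loop that SIMD instructions are designed to accelerate. The scaleBrightness function below is plain, scalar JavaScript of my own invention, not code from Intel’s benchmarks; under the proposed SIMD API, an engine could process several channels per operation in a loop like this instead of one value at a time.

```javascript
// Scalar brightness adjustment over RGBA pixel data, e.g. the Uint8ClampedArray
// returned by a canvas context's getImageData(). Each channel is multiplied
// one at a time; this is the sort of loop SIMD lets engines vectorize.
function scaleBrightness(pixels, factor) {
  for (var i = 0; i < pixels.length; i += 4) {
    pixels[i]     = pixels[i]     * factor;  // red
    pixels[i + 1] = pixels[i + 1] * factor;  // green
    pixels[i + 2] = pixels[i + 2] * factor;  // blue
    // pixels[i + 3] is alpha, left untouched
  }
  return pixels;
}

// Example with a small typed array standing in for image data
var data = new Uint8ClampedArray([100, 150, 200, 255]);
console.log(scaleBrightness(data, 1.2));  // values are clamped to 0-255
```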

WebCL, which exposes the OpenCL APIs to JavaScript, is also very promising. It will help developers leverage GPU acceleration for certain kinds of general-purpose computing in Web applications. I attended a session where a Samsung engineer presented the latest WebCL developments and showed a few impressive demos. Samsung’s WebCL project has evolved considerably since I first wrote about it at Ars Technica in 2011.

Better JavaScript

JavaScript has many idiosyncrasies, but the heroic efforts of TC39 are rapidly bringing the language into the modern era. With arrow functions, destructuring assignment, array comprehensions, and proper object-oriented classes, the next version of JavaScript is going to look a whole lot like CoffeeScript with curly braces.
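For readers who haven’t been following the ES6 drafts, here is a quick sketch of some of that syntax. The Point class is a made-up example, and I’ve left out array comprehensions, whose exact form was still being settled in the drafts at the time.

```javascript
// ES6 class syntax replaces the usual prototype boilerplate
class Point {
  constructor(x, y) {
    this.x = x;
    this.y = y;
  }

  // An arrow function captures "this" lexically, so the surrounding
  // method's "this" is still the Point inside the callback
  scaledBy(factors) {
    return factors.map(f => new Point(this.x * f, this.y * f));
  }
}

// Destructuring assignment pulls values out of objects and arrays directly
let { x, y } = new Point(3, 4);
let [first, ...rest] = [10, 20, 30];

console.log(x, y);                                      // 3 4
console.log(first, rest);                               // 10 [20, 30]
console.log(new Point(1, 2).scaledBy([2, 3]).length);   // 2
```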

Those features will greatly improve the everyday JavaScript development experience, but there’s even more on the way. Additional features, such as support for operator overloading, are already on the roadmap for ES7. It seems as though the process of improving JavaScript has gained momentum, potentially putting the language on track to receive regular updates.

Asynchronous programming is another area where JavaScript is making good progress. Promises, a high-level abstraction for asynchronous tasks, have gained widespread adoption among JavaScript library implementors. Similar to C#’s Task, Promises make it possible to deal with asynchronous operations in a consistent and composable way, eliminating the need for deeply nested callbacks.
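As a rough sketch of what that composability buys you, here is a small example using the Promise API that browsers are now implementing. The fetchUser and fetchPosts functions are hypothetical stand-ins for any asynchronous, Promise-returning operations.

```javascript
// Hypothetical asynchronous operations; any Promise-returning API works here
function fetchUser(id) {
  return Promise.resolve({ id: id, name: "alice" });
}
function fetchPosts(user) {
  return Promise.resolve([user.name + "'s first post"]);
}

// With nested callbacks, each step buries the next one a level deeper and
// error handling has to be repeated at every level. With Promises, the same
// sequence reads as a flat chain with a single failure handler:
fetchUser(42)
  .then(function (user) { return fetchPosts(user); })
  .then(function (posts) { console.log(posts); })
  .catch(function (err) { console.error("something failed:", err); });
```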

My colleague Kris Kowal, author of the popular Q library, gave a presentation about Promises at Fluent. He demonstrated various usage scenarios and explained how Promises can be pipelined to simplify complex asynchronous workflows. He also demonstrated Q-Connection, a framework he built on top of Q that enables asynchronous communication between remote objects—useful for IPC and other similar needs.
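Here is a rough sketch of the pipelining idea using Q. The object is local in this example, but the same trick of queuing up property reads and method calls on a value that hasn’t resolved yet is, as I understand it, what Q-Connection extends to remote objects; the user object and its greet method are invented for illustration.

```javascript
var Q = require("q");  // npm install q

// A promise for an object we don't have yet; with Q-Connection this could
// just as well be a reference to an object in another process or on a server.
var userPromise = Q.fcall(function () {
  return {
    name: "alice",
    greet: function (msg) { return msg + ", " + this.name; }
  };
});

// Pipelining: operations are queued against the eventual value rather than
// waiting for it to resolve first.
userPromise.get("name").then(function (name) {
  console.log("name:", name);           // "name: alice"
});

userPromise.invoke("greet", "hello").then(function (greeting) {
  console.log(greeting);                // "hello, alice"
}).done();
```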

Kris Kowal speaking at Fluent. Everybody loves the wizard hat.

Promises have historically been implemented via libraries, but the API recently became an official part of the DOM standard. Several browser vendors are working on native implementations based on the standard. But it gets even better: TC39 has conducted some preliminary discussion about adding something like C#’s excellent async/await syntax to JavaScript, which would further simplify asynchronous code by making it possible to consume Promises in a cleaner and more intuitive way.
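To show why that would be such a nice improvement, here is a sketch comparing plain .then consumption with an async/await-style version, modeled loosely on the C# feature. The fetchScore function is hypothetical, and the exact JavaScript syntax had not been settled at the time.

```javascript
// Hypothetical Promise-returning operation
function fetchScore(player) {
  return Promise.resolve({ player: player, score: 9001 });
}

// Consuming the Promise with .then callbacks
function showScore(player) {
  return fetchScore(player).then(function (result) {
    console.log(result.player + ": " + result.score);
  });
}

// With an async/await-style syntax, the same logic reads like ordinary
// synchronous code; awaiting a rejected Promise surfaces as a thrown
// exception, so try/catch works as usual.
async function showScoreAsync(player) {
  var result = await fetchScore(player);
  console.log(result.player + ": " + result.score);
}
```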

The Web as an application platform

Although there are still significant gaps between what you can accomplish today with Web technologies and what you can do with native platforms, the Web is advancing faster than ever before. The performance challenges, in particular, no longer seem insurmountable.