It’s an early build and there’s no demo yet, but you can go see the massive output file it produced, which shows it really does compile. The goals of the project are:
Develop frameworks for image/webpage capturing and automated web testing (similar to PhantomJS).
Develop a framework for prototyping CSS filters, HTML elements and attributes.
Those are interesting goals. I have no idea how realistic or practical they are, but it’s a great showcase of the power of Emscripten to compile just about anything these days.
A lot isn’t working yet, but it’s interesting to see the vaporware of two years ago become reality today. Go check it out on GitHub, and try your hand at getting various WebKit features working.
glsl-transition is a project from Gaëtan Renaudeau which provides a very flexible and extensible way to use WebGL shaders to create iPhoto-like slideshow transitions. It has a promise-based API, supports easing functions, and supports any GLSL fragment shader you can write.
It’s obviously meant to be an extensible library, making it fairly easy to write your own transition effects. You just have to implement a WebGL fragment shader with a couple of special uniforms that the library fills in for you: the starting image, the ending image, the progress in time, and the resolution. The library then renders the transition using your shader as needed, controlled by a timing function.
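To make that concrete, here is a sketch of what a minimal crossfade transition shader might look like. The uniform names (`from`, `to`, `progress`, `resolution`) follow the library’s convention as I understand it from its documentation; treat this as an illustrative sketch rather than a verbatim example from the project:

```glsl
#ifdef GL_ES
precision highp float;
#endif

// Filled in by the library on every frame:
uniform sampler2D from;   // the starting image
uniform sampler2D to;     // the ending image
uniform float progress;   // transition progress, 0.0 to 1.0
uniform vec2 resolution;  // canvas size in pixels

void main() {
  vec2 p = gl_FragCoord.xy / resolution.xy;
  // Linearly blend the two images based on progress.
  gl_FragColor = mix(texture2D(from, p), texture2D(to, p), progress);
}
```

Swapping the `mix` call for any other per-pixel blend is all it takes to create a new transition effect.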
glsl-transition supports not just Bézier easing functions, but any function that takes a linear time parameter t and returns an eased value. If you want a Bézier curve, there is a companion library that returns compatible easing functions and includes some commonly used built-in easings. If you’re interested in how this works, the author wrote a really good article about implementing Bézier curve easing functions. Apparently it was good enough that even Apple is using the code from that article on their Mac Pro website.
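Since any function from linear time to an eased value qualifies, writing your own easing is trivial. A quick sketch (the function names are mine, not the library’s):

```javascript
// Any function mapping linear time t in [0, 1] to an eased value
// can serve as an easing function. Two common examples:
function linear(t) {
  return t;
}

function easeInOutQuad(t) {
  // Accelerate through the first half, decelerate through the second.
  return t < 0.5 ? 2 * t * t : 1 - 2 * (1 - t) * (1 - t);
}

console.log(linear(0.25));       // 0.25
console.log(easeInOutQuad(0.25)); // 0.125
console.log(easeInOutQuad(0.5));  // 0.5
console.log(easeInOutQuad(1));    // 1
```

Both functions hit 0 at t = 0 and 1 at t = 1, which is the only real contract an easing function has to honor.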
I’m glad to see WebGL taking off. We still have a little way to go on mobile, but WebGL support is now fairly widespread on modern desktop browsers with even Microsoft supporting it in IE11. Only Safari is left with it turned off by default.
You can check out the demo, which runs in all modern browsers, though best in Firefox from what I’ve seen, thanks to its asm.js optimizations. I got about 23 FPS at around 40% CPU usage. Of course, Firefox and Chrome already support Ogg video natively, so the primary targets of the project are Safari 6+ and IE 10+. Ogv.js is an Emscripten compile of libogg, libtheora, and libvorbis. It’s actually a fork of a project I started to bring Ogg Vorbis audio to the Aurora.js suite of audio codecs, though at this point I’d be hard pressed to find any of my code in the project.
The video decoding process starts with a streaming HTTP implementation, which allows videos to start playing before the entire file has been downloaded. Unfortunately, there isn’t a good cross-browser streaming XHR API, so the project uses a combination of Microsoft’s MSStreamReader API for IE 10+ (hopefully on its way to standardization?), Firefox’s proprietary moz-chunked-arraybuffer response type, and binary strings (ouch!) for Chrome and Safari. This greatly improves perceived performance, since decoding can happen while the file is being downloaded.
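The binary-string fallback is the painful part: each chunk arrives as a JavaScript string with one character per byte, which then has to be unpacked into a typed array before the decoder can touch it. A hedged sketch of that conversion (the helper name is mine):

```javascript
// Hypothetical helper: convert a binary-string XHR chunk (one character
// per byte, as delivered when the response is forced to a
// "x-user-defined" charset) into a Uint8Array the decoder can consume.
function binaryStringToBytes(str) {
  var bytes = new Uint8Array(str.length);
  for (var i = 0; i < str.length; i++) {
    // Mask to the low byte, since the charset trick can map bytes
    // into a private-use Unicode range.
    bytes[i] = str.charCodeAt(i) & 0xff;
  }
  return bytes;
}

// "OggS", the capture pattern that starts every Ogg page:
var chunk = String.fromCharCode(0x4f, 0x67, 0x67, 0x53);
console.log(binaryStringToBytes(chunk)); // bytes 79, 103, 103, 83
```

Compared with receiving an ArrayBuffer directly, this costs an extra pass over every chunk, which is exactly why the binary-string path earns the “ouch”.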
Once the video stream has been received, it is fed into the C wrapper around libogg, libtheora, and libvorbis for decoding. Then the audio goes to the Web Audio API (which means no IE support at this point), and the video is rendered in a canvas element. At this point, the project is just using a simple 2d canvas for rendering pixels, but they say in the readme that a WebGL implementation could be used in the future. I’ve seen huge performance increases (up to 20%) by moving the colorspace conversion (the last step of decoding) to the graphics hardware, which can do it in parallel, so I’m looking forward to seeing this.
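The reason colorspace conversion moves so well to the GPU is that every pixel is independent. As a rough sketch of what the CPU-side work looks like per pixel (using the standard BT.601 full-range approximation; this is illustrative, not ogv.js’s actual code):

```javascript
// Sketch of per-pixel colorspace conversion (ITU-R BT.601, full-range
// approximation), the last step of decoding. Every pixel is independent,
// which is why a fragment shader can do this in parallel on the GPU.
function clamp(v) {
  return v < 0 ? 0 : v > 255 ? 255 : Math.round(v);
}

function yCbCrToRgb(y, cb, cr) {
  return [
    clamp(y + 1.402 * (cr - 128)),                      // R
    clamp(y - 0.344 * (cb - 128) - 0.714 * (cr - 128)), // G
    clamp(y + 1.772 * (cb - 128)),                      // B
  ];
}

console.log(yCbCrToRgb(128, 128, 128)); // [128, 128, 128] — mid gray
console.log(yCbCrToRgb(255, 128, 128)); // [255, 255, 255] — white
```

Doing this in JavaScript for every pixel of every frame adds up fast, so handing it to a shader is an easy win.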
Parallel decoding in web workers is also being considered, since decoding both the video and audio on the main thread can make for some stutters on slower machines (read: mobile). Support for seeking is also on the roadmap.
You can check out ogv.js on GitHub, and a demo as well.
P.S. I hope you like the new site design, it should be much more readable on mobile thanks to the responsive layout. Let me know what you think!
TraceGL works by instrumenting all of your code so it knows when calls took place, as well as the boolean logic that determined which code path was taken. Then it visualizes all of this, using WebGL for performance: a high-level overview called the “mini map” in the top left, a log of function calls in the top right, the call stack in the bottom left, and the code for the current function in the bottom right.
As your code runs, TraceGL visualizes all of this data in real time. The mini map is useful to see the ebbs and flows of the code, i.e. where the stack gets deeper and shallower again. In this way, you can see where events are being processed, like mouse or keyboard events in the browser, or HTTP requests in a Node.js application, and then get to a section of the potentially very long call stack very quickly. TraceGL even works over asynchronous events, unlike most step debuggers, which means that these operations are still shown as part of a single call stack under their originating calls, rather than as separate events.
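The core idea of call instrumentation can be sketched in a few lines. TraceGL actually rewrites your source to inject probes, which is far more thorough than this; the toy wrapper below only illustrates the kind of data such a tracer collects for each call:

```javascript
// Toy sketch of call instrumentation: wrap a function so every call
// records its name, arguments, and return value in a trace log.
// (TraceGL rewrites the source itself rather than wrapping functions,
// so it also captures branch decisions; this is just the flavor of it.)
var trace = [];

function instrument(name, fn) {
  return function () {
    trace.push({ name: name, args: Array.prototype.slice.call(arguments) });
    var result = fn.apply(this, arguments);
    trace.push({ name: name, returned: result });
    return result;
  };
}

var add = instrument('add', function (a, b) { return a + b; });
add(2, 3);
console.log(trace); // one entry record and one return record for 'add'
```

From a stream of records like these, a visualizer can reconstruct the call stack’s depth over time, which is exactly what the mini map shows.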
Here is a video showing TraceGL in use:
TraceGL can instrument both browser-based and Node.js applications, and integrates with various editors so that double-clicking a line can open your favorite editor. An interesting aspect of the UI is that it is written entirely using WebGL, apparently for performance reasons. Of course, all of the text rendering (most of the UI) must have been done in a 2D canvas and then uploaded to WebGL as a texture, since WebGL has no native text rendering capabilities, but clever rendering tricks like only re-rendering what has changed can make things fast. And once the textures are on the GPU, moving them around, scaling them, etc. using shaders is very fast.
I think we’re probably going to see more and more WebGL user interfaces soon. We’ve seen a lot of 3D stuff written on top of WebGL, and it is certainly good for that, but I’m betting that normal 2D user interfaces on the web will start being written with it too, thanks to its great performance characteristics. HTML and CSS are great for documents, and for applications up to a point, but for web apps to compete with native on performance, hardware-accelerated UIs on top of WebGL will be important.
Of course, building user interfaces using WebGL means that any text rendering that is done won’t be selectable, copyable, or accessible to screen readers without lots of additional work, so I can see frameworks being developed to facilitate this. I’ve already been working on and off on something similar to Apple’s Core Animation framework on top of WebGL (not public yet), and other interesting 2D frameworks like Pixi.js have been released recently. Especially with WebGL’s likely support in Internet Explorer 11, I think the age of WebGL user interfaces is upon us, and it’s exciting!
You can check out TraceGL on their website. It costs $15, but not all good tools are free, and it’s nice to support good developers, so give it a shot and let me know what you think in the comments!
John’s article talks about some of the use cases for asm.js, some of the common misconceptions about it, and finally includes a question and answer section with Mozilla’s compiler engineer David Herman, who is one of the authors of the asm.js specification. It’s definitely a good read, so check it out!
I think asm.js will be really important over the coming months and years, and I’m excited to see other browser vendors already getting on board. I got even more excited when I saw Mozilla and Epic Games’ demo last week, which showed the Unreal Engine running in the browser at very good performance thanks to Emscripten and asm.js.