06 2 / 2013
IDBWrapper is trying to do just that, making simple stuff like setting, getting, and deleting one or all records really easy, and making more complex stuff like queries simpler. It also deals with several cross-browser inconsistencies (ironic, after what I just wrote about WebSQL, right?). IDBWrapper has been around since sometime in 2011, but it just hit a 1.0 release, so I thought I'd mention it here.
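Basic usage looks something like this; this is a minimal sketch based on my reading of the docs, so double-check the option and method names there:

```js
// Open (or create) a store; the callback fires once IndexedDB is ready.
var customers = new IDBStore({
  storeName: 'customers',
  keyPath: 'id',
  autoIncrement: true,
  onStoreReady: function () {
    // Write a record; the success callback receives the generated key.
    customers.put({ name: 'Ada', plan: 'pro' }, function (id) {
      // Read it back by key.
      customers.get(id, function (customer) {
        console.log(customer.name); // 'Ada'
      });
    }, function (err) { console.error(err); });
  }
});
```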
You can find the code for IDBWrapper on Github, along with up-to-date documentation and two older blog posts that describe the API in a more tutorial-like fashion. Although IndexedDB is kinda ugly by itself, libraries can help, just like they have with many other ugly web APIs in the past, like the DOM. Ugly or not, IndexedDB will be important in bringing offline web apps to the masses.
04 2 / 2013
You can check out a live demo and see for yourself that it is indeed quite smooth. In his own tests, he says you can draw around 500,000 data points per second with 10,000 per frame without a hitch. This is pretty impressive and shows the importance of WebGL to the web platform (*cough* Microsoft *cough*). We’re only just getting started.
One possibility for this library's usage is some sort of realtime heatmap of mouse movements on a website or app. Visitors' mouse movements would be recorded and sent via WebSockets to an admin app, where they could be rendered on a heatmap in real time. Good performance is critical when you have lots of users on a site all at once.
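A rough sketch of both halves of that idea follows; the WebSocket endpoints are hypothetical, and the heatmap calls (createWebGLHeatmap, addPoint, update, display) reflect the library's API as I read it, so treat the exact names as assumptions:

```js
// Visitor side: sample mouse positions and stream them to the server.
var socket = new WebSocket('wss://example.com/moves'); // hypothetical endpoint
document.addEventListener('mousemove', function (e) {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify({ x: e.pageX, y: e.pageY }));
  }
});

// Admin side: buffer incoming positions and draw them once per frame.
var canvas = document.querySelector('canvas');
var heatmap = createWebGLHeatmap({ canvas: canvas });
var adminSocket = new WebSocket('wss://example.com/moves-feed'); // hypothetical
var pending = [];
adminSocket.onmessage = function (e) {
  pending.push(JSON.parse(e.data));
};
requestAnimationFrame(function frame() {
  pending.forEach(function (p) {
    heatmap.addPoint(p.x, p.y, 30 /* size */, 0.2 /* intensity */);
  });
  pending.length = 0;
  heatmap.update();
  heatmap.display();
  requestAnimationFrame(frame);
});
```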
01 2 / 2013
There have been a number of interesting cutting-edge standards proposals out there recently that I thought were worth sharing. Several of these come from Adobe, which seems to be really investing in the web platform lately. They're the ones behind the CSS Regions and Exclusions features that will offer really beautiful magazine-like text layout (e.g. flowing around non-rectangular shapes) in the browser.
Adobe has also been working on some other improvements to the web platform, including several to the canvas element, such as Photoshop-like blending modes. The blending modes have been added to the canvas context's globalCompositeOperation property, which had previously only been used for Porter-Duff compositing modes (e.g. source-over). Adobe has also added blending modes to CSS for HTML content, with separate blending modes for foreground, background, and shadows. You can check out the compositing and blending specification for more info.
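On the canvas side, using the new modes doesn't require any new API surface; you just assign a blend mode to the same property that holds the compositing modes (in browsers that have implemented the proposal):

```js
var canvas = document.querySelector('canvas');
var ctx = canvas.getContext('2d');

// The default Porter-Duff compositing mode: new pixels paint over old ones.
ctx.globalCompositeOperation = 'source-over';
ctx.fillStyle = 'orange';
ctx.fillRect(10, 10, 100, 100);

// One of the new Photoshop-style blend modes from Adobe's proposal.
ctx.globalCompositeOperation = 'multiply';
ctx.fillStyle = 'steelblue';
ctx.fillRect(60, 60, 100, 100); // the overlap is blended, not just covered
```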
As for winding rules, this feature was actually first implemented by Mozilla for the PDF.js project. Previously, there was no way to specify a winding rule for canvas like you could for SVG content. Winding rules change how shapes are drawn when they are self-intersecting, as you can see in the above image. So Mozilla added support for a fillRule property on the canvas context, and it was subsequently adopted by the standard. I believe WebKit has also implemented the feature. However, Adobe has a new proposal to change it from a property to an additional argument to the fill, clip, and path hit-testing methods instead. They explain why in this blog post.
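Here's roughly what the two shapes of the API look like side by side; the argument form is the proposal, so it may still change:

```js
var ctx = document.querySelector('canvas').getContext('2d');

// A self-intersecting path (five-pointed star) where the winding rule matters.
ctx.beginPath();
for (var i = 0; i < 5; i++) {
  var a = (i * 4 * Math.PI) / 5 - Math.PI / 2;
  ctx.lineTo(100 + 80 * Math.cos(a), 100 + 80 * Math.sin(a));
}
ctx.closePath();

// The property form described above (Firefox shipped it vendor-prefixed
// as mozFillRule, if I remember right).
ctx.fillRule = 'evenodd'; // leaves the star's center unfilled
ctx.fill();

// Adobe's proposed argument form:
ctx.fill('evenodd');
```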
SVG OpenType Fonts
Another really interesting recent standards proposal is Mozilla's addition to the OpenType font specification for SVG glyphs. OpenType is pretty much the industry-standard font format at this point, and it has great support for complex language shaping and advanced typographic features. Currently it supports glyph definitions in two formats: TrueType and PostScript outlines. Mozilla is proposing to add a third glyph type: SVG. This means that you could define each glyph in the font as an SVG document. And since SVG can contain color and even animated content, you could have animated color glyphs in fonts.
I'm not sure how I feel about animated glyphs, but color would be really useful, especially with the rise of Emoji characters over the last few years. Usually Emoji fonts are just bitmaps, which means they aren't really scalable to any size. Since SVG is vector-based, that problem can be solved. If added, this proposal would probably also remove the need for the SVG Fonts spec, which has been around for a while without much browser support since it doesn't handle advanced typography very well. It will be interesting to see where this goes. Mozilla has already implemented the proposal in Firefox (though it is disabled by default), and I expect it will be utilized in other places as well.
Finally, Adobe has a proposal to allow balanced text layout in browsers for better readability. Typically, browsers use a simple greedy algorithm for wrapping text, continually adding words until a line is too long and then wrapping to the next line. This works fairly well for left-aligned text, but for centered text it doesn't look good to have a long line followed by a really short one. Adobe is proposing a CSS text-wrap property with a balance value to control the algorithm the browser uses to wrap text. They even have a jQuery plugin that implements the feature in current browsers.
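To make the difference concrete, here's the greedy algorithm in a simplified form; this is my own illustration, measuring width in characters instead of real pixel metrics:

```js
// Greedy word wrap: keep adding words until the line overflows,
// then start a new line.
function greedyWrap(text, maxChars) {
  var lines = [];
  var line = '';
  text.split(/\s+/).forEach(function (word) {
    var candidate = line ? line + ' ' + word : word;
    if (candidate.length <= maxChars) {
      line = candidate;
    } else {
      if (line) lines.push(line);
      line = word;
    }
  });
  if (line) lines.push(line);
  return lines;
}

// Centered text wrapped this way can end with one awkward short line:
// ["A headline that wraps to a long", "line"]
console.log(greedyWrap('A headline that wraps to a long line', 31));
```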
31 1 / 2013
Ross McKegney, CTO of Verold, a Toronto-based startup working on bringing 3D experiences to the web, submitted their browser-based collaborative 3D editor, and it looks pretty awesome. It allows 3D artists to import 3D object files (specifically OBJ, Collada, or FBX files) into the web interface and then collaboratively edit scenes and finally share or embed them.
The editor works in real time, much like Google Docs but for much more advanced 3D resources instead of just documents. You can edit materials, lighting, and other effects all in the browser, and all collaboratively in real time, so your changes immediately show up for everyone else editing the project at the same time. This is really powerful, and I think we're going to see lots more collaborative apps in the future, especially with the rise of Node.js.
You can try out the editor yourself, and as you can see it feels very smooth. You can look at some of the public projects available from the main screen to see what is possible. If you create your own project, you first upload a model file exported from another tool, and then adjust the materials, textures, lighting, fog, and several other parameters. Finally, you can create an embed code for your project and put it on a website. The user just clicks the "Go 3D" button, and the static image turns into a scene they can manipulate.
One example given in their introductory blog post is a car company's website, where the user could get a better feel for the car by manipulating it in 3D rather than just looking at pictures or relying on a plugin like Flash. You can try out the car demo on that blog post, and check out a video of the editor in action.
30 1 / 2013
Howler.js allows you to define an array of URLs for different versions of your sound, and it will only load the one the current browser supports, either into the Web Audio API or into a normal HTML5 audio element. The audio element already allows you to define multiple codecs, but the Web Audio API does not. For that, you have to implement your own loading of audio files over XHR, and detecting which codecs the browser supports is key so you aren't loading all of them and wasting bandwidth. Howler.js implements all of this for you, adds some caching behavior, and provides a nice API for it.
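In code that looks roughly like this; the option names follow the project's examples as I remember them, so verify against the docs:

```js
// Howler checks which format the browser can decode and downloads
// only that file, whether it ends up in Web Audio or an audio element.
var music = new Howl({
  urls: ['audio/theme.mp3', 'audio/theme.ogg'],
  loop: true,
  volume: 0.5,
  onload: function () {
    music.play();
  }
});
```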
Another pretty interesting feature of Howler.js is the ability to define "audio sprites". You may have heard of image sprites as a way of combining multiple small images into a single one to save HTTP requests. Audio sprites are a similar idea for short sound effects: they combine multiple sounds into a single longer file, and only play back the time regions you request. Howler.js has a nice API for this, allowing you to define named time regions within an audio file for each effect and then play back those named regions instead of the whole file. You can even play back multiple regions simultaneously, even though it is technically one file. This is a pretty cool feature that should make sound effects, especially for games, much easier.
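Defining and playing sprites looks something like this; the [offset, duration] millisecond format is how I understand the API, and the region names are made up:

```js
// One audio file, several named regions: [offset in ms, duration in ms].
var sfx = new Howl({
  urls: ['audio/sfx.mp3', 'audio/sfx.ogg'],
  sprite: {
    laser:     [0,    500],
    explosion: [600,  1800],
    powerup:   [2500, 750]
  }
});

// Play regions by name; they can overlap even though it's one file.
sfx.play('laser');
sfx.play('explosion');
```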
As I’ve said many times, we’re just getting started with audio on the web. Howler.js looks like a nice way to load and play back audio cross browser. You can check out the documentation and demos here, and the code on Github.
28 1 / 2013
Teoria has several object types: notes, chords, intervals, and scales. Each of these objects can be created in various ways through a simple chained interface, then operated on to get information about it or to transform it. For example, you can create a note from a string representing the note name (such as A), or from a frequency, piano key, or MIDI note number, and then convert that note to a different one of those representations. Similar things are possible with scales, chords, and intervals. There is a whole host of methods and operations available, so check out the Github page for a full list.
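A few examples of the kind of round-tripping described above; the method names come from the project's README as I remember them, so verify against the docs:

```js
// Create a note by name, then ask for its other representations.
var a4 = teoria.note('a4');
console.log(a4.fq());   // ~440, the frequency in Hz
console.log(a4.midi()); // 69, the MIDI note number
console.log(a4.key());  // 49, the piano key number

// Or go the other way, from a frequency back to a note.
var result = teoria.note.fromFrequency(440);
console.log(result.note.name()); // 'a'

// Chords hang off notes through the same chained interface.
var chord = teoria.note('a4').chord('sus2');
console.log(chord.name); // 'Asus2'
```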
There is a fun demo of the library as well, which shows an interesting application where the user can enter a chord, and see a visualization of the sound waves it will produce. You can drag the waveform to move it around and hover over the individual lines to see what the individual notes and frequencies are. The visualization itself is all SVG based. It’s an interesting showcase for what information Teoria can extract and produce and it’s pretty fun to play with, so definitely go check it out!
After seeing the MIDI.js sequencer, the VexFlow music notation language, and now the Teoria music theory library, it seems many of the pieces for building a high-quality music composition application like Finale or Sibelius in the browser are here, so I wouldn't be surprised to see one in the coming months. Other interesting applications that could be built using these tools and others such as the Web Audio API and the Web MIDI API include smart visualizations that know more about the music being played, music production applications like Logic or Pro Tools, and many more. We're only just getting started in the audio space on the web, and it's great to see more building blocks like Teoria appear on the scene!
25 1 / 2013
MathBox.js starts with a library of WebGL shaders that must be loaded. It makes use of another library called ShaderGraph.js, which takes the source code for multiple shaders, partially parses them, and combines them together into a single output shader. This enables you to have shader components that can be reused in various configurations.
Once everything is loaded, it’s your job as a user of the library to define what you want to be displayed. As mentioned above, you do this using a variety of library methods to define the viewport, camera, and objects that appear in a scene such as axes, grids, vectors, curves, and surfaces. You can even enable user interaction to rotate the camera using your mouse.
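Scene definitions end up being declarative chains of those primitives. Something in this spirit, pieced together from the demos, so treat the exact option names as approximations:

```js
// A minimal MathBox scene: cartesian viewport, axes, a grid, and a sine curve.
var mathbox = mathBox({
  cameraControls: true // let the viewer orbit the camera with the mouse
}).start();

mathbox
  .viewport({ type: 'cartesian', range: [[-3, 3], [-1.5, 1.5]] })
  .camera({ orbit: 3.5 })
  .axis({ id: 'x-axis', axis: 0, ticks: 7 })
  .axis({ id: 'y-axis', axis: 1, ticks: 5 })
  .grid({ axis: [0, 1] })
  .curve({
    n: 128,
    domain: [-3, 3],
    // Return a point for each sampled x (the shape the demos appear to use).
    expression: function (x) { return [x, Math.sin(x)]; }
  });
```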
There are a variety of demos available for you to check out, and more linked from the Github page. Make sure to view them in a WebGL-capable browser (these days, anything but IE, and Safari unless you've enabled it). As you can see, the rendering and animation are super smooth, and if you view source, the library seems incredibly simple to use given the output you get.
There is an excellent article about the inspiration and process behind MathBox, including a video of a talk from the Full Frontal conference, that you should check out, as well as the code on Github. It's exciting to see WebGL taking shape and enabling functionality on the web previously reserved for native applications. If you ever need to render high-quality 3D math visualizations on the web, MathBox.js seems to be the place to go!
23 1 / 2013
Magic Xylophone is a getUserMedia and Web Audio API demo that lets you control a virtual xylophone by moving your hands. It draws an image of the xylophone keys on top of the live video feed from your webcam, performs some motion detection on the video feed, and detects when your image intersects with any of the keys. Then it generates audio using the Web Audio API for any of the keys you press, including multiple keys at once. It works pretty well, so you should check out the demo! Make sure you’re using Chrome though, as I couldn’t get it working in any other browsers.
The technique being employed for the motion detection is called blend mode difference. Basically, it takes the image from the live video feed and subtracts each pixel's values from the corresponding values in the previous frame, which generates the black and white image you see in the screenshot above, describing the user's motion in the picture. If there had been no motion at all, the image would be entirely black.
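In code, that's just a per-pixel loop over two frames' image data. A simplified grayscale version; the threshold is my own addition to suppress camera noise:

```js
// prev, curr, out are the .data arrays of equal-sized ImageData objects.
function blendDifference(prev, curr, out) {
  for (var i = 0; i < curr.length; i += 4) {
    // Average RGB down to grayscale, then take the absolute difference.
    var currGray = (curr[i] + curr[i + 1] + curr[i + 2]) / 3;
    var prevGray = (prev[i] + prev[i + 1] + prev[i + 2]) / 3;
    var diff = Math.abs(currGray - prevGray);
    // Threshold so sensor noise doesn't register as motion.
    var v = diff > 21 ? 255 : 0;
    out[i] = out[i + 1] = out[i + 2] = v;
    out[i + 3] = 255; // fully opaque
  }
}
```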
To check whether the notes on the xylophone are being played, all the demo needs to do is check for white in the areas that define the xylophone keys. If there is, there is motion in those areas and a note should be played. The notes are played using the Web Audio API, by loading a pre-recorded MP3 for each possible note and then playing them at the right times.
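A sketch of both halves; the region rectangles, thresholds, and helper names here are made up for illustration:

```js
// Does a key's rectangle contain enough white (motion) to count as a hit?
// ctx is the 2d context holding the blended difference image.
function regionActive(ctx, region) {
  var data = ctx.getImageData(region.x, region.y, region.w, region.h).data;
  var white = 0;
  for (var i = 0; i < data.length; i += 4) {
    if (data[i] > 200) white++;
  }
  // Trigger when over 10% of the region is moving (arbitrary threshold).
  return white / (data.length / 4) > 0.1;
}

// Play one of the pre-decoded note buffers through the Web Audio API.
function playNote(audioCtx, buffer) {
  var source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  source.start(0); // older WebKit builds called this noteOn(0)
}
```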
22 1 / 2013
Mozilla has been working hard on Firefox OS, their mobile operating system that uses web technologies as the primary application platform. Today they announced that developer preview devices will be available soon. Built by Geeksphone, these devices actually look pretty good specs-wise: the Peak model includes a Snapdragon processor, an 8 MP camera, and a 4.3” screen, with slightly lower specs for the Keon model.
What is interesting about Firefox OS is that every app is a web app. The homescreen is basically a really advanced list of bookmarks. This means that right out of the gate, right now in fact, before it has even shipped, there are already millions of apps available for Firefox OS based smartphones. Of course, once it is released, developers will be able to take advantage of some of the more advanced APIs that Firefox OS exposes to installed applications: things like access to contacts, the camera, notifications, in-app purchases, etc.
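Installing one of those apps is itself just a web API call. Per Mozilla's Apps API docs it looks about like this (the manifest URL here is hypothetical):

```js
// Ask the platform to install an app described by its manifest.
var request = navigator.mozApps.install('https://example.com/manifest.webapp');
request.onsuccess = function () {
  console.log('Installed:', request.result.manifest.name);
};
request.onerror = function () {
  console.log('Install failed:', request.error.name);
};
```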
If you’re interested in developing apps for Firefox OS yourself, you should check out their getting started guide, install the Firefox OS simulator on your desktop, install it on your own hardware, or purchase one of the new developer preview devices released today. Also be sure to check out the long list of awesome WebAPIs Mozilla has been adding to the web platform to make this possible.
Congrats to everyone at Mozilla who made this happen. I'll be interested to see where this goes and to see if the web can make a comeback on mobile!
21 1 / 2013
As for other HTML5 features, they're using the FileSystem API (only available in Chrome ATM) for local file caching during uploads and downloads, as well as the drag and drop API, the FileReader and FileWriter APIs, and more. Check out the source code and the site for yourself over at mega.co.nz.
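For a sense of what that FileSystem API caching looks like, here's a minimal sketch of writing a downloaded chunk into the sandboxed filesystem (Mega's real code is of course far more involved):

```js
var chunkData = new Uint8Array(1024); // stand-in for bytes pulled down over XHR

// webkitRequestFileSystem is Chrome-prefixed, hence Chrome-only.
window.webkitRequestFileSystem(window.TEMPORARY, 50 * 1024 * 1024, function (fs) {
  fs.root.getFile('chunk-0.bin', { create: true }, function (entry) {
    entry.createWriter(function (writer) {
      writer.onwriteend = function () {
        console.log('chunk cached at', entry.toURL());
      };
      writer.write(new Blob([chunkData]));
    });
  });
}, function (err) {
  console.error('filesystem error', err);
});
```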