Saturday, February 11, 2012

On CSS vendor prefixes and applying pressure where it makes the most sense

If you are a web developer, chances are pretty good that you have read quite a lot of tweets and blog posts on vendor prefixes lately. It all started when Daniel Glazman issued an urgent call for action. Various people have chipped in, including Bruce Lawson and the Web Standards Project.

This is an expanded version of a comment I wrote on Daniel Glazman's blog.

My background

Please bear with me as I explain my background. It is important for the thoughts I have at the end.

I am an educator. I teach web development for a living and have done so for more than a decade. As a consultant for Skolverket, Sweden's national agency for education, I have in fact succeeded in making standards, accessibility and best practices mandatory requirements for all teaching in the country at the secondary level (approx. senior high school in the USA, A-levels in the UK).

I participate in the Web Standards Project Education Task Force, am co-author of InterACT with Web Standards and have written material for the InterACT curriculum project. I also participated in OWEA and am now active in the Web Education Community Group at the W3C. (Of course nothing on this blog is the official position of any such entity.)

I thoroughly believe in education!

Nevertheless I do not see how education alone can solve the problem at hand. Too much misinformation, too much marketing, too much Apple- and Google-centric journalism. Education can only alleviate the problem, never solve it.

There is a solution available that will make the problem disappear

There is in fact a solution available. One that would do more good than anything else.

Webkit must drop support for prefixed properties once a standards-compliant solution has been implemented, after a reasonable grace period – say, six months.

This is after all what all other browser vendors are doing right now.

This is how prefixes are supposed to work. This is the right thing! Anything else is a violation of the principle.

And yes, this will break sites – which is the exact point of this solution!

This is the only way to create the necessary urgency that will reach into web development sweatshops in India and China, as well as into the corporate management level at the larger companies.

There are educated web developers who simply do not have the time or the mandate to fix problems like these, because upper management does not care. They think about their own product here and now, not about what's good for the web as a whole 10 or 15 years into the future.
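As a rough illustration of what is at stake: the sites that break are the ones using only the prefixed form of a property that has long had an unprefixed equivalent. A toy audit of a stylesheet can be done with nothing but standard shell tools – the CSS sample below is made up for illustration:

```shell
# Toy audit: which -webkit- prefixed properties in a stylesheet lack an
# unprefixed fallback? (Made-up sample CSS, purely for illustration.)
css='
.box {
  -webkit-border-radius: 4px;
  -webkit-transition: all .2s;
  transition: all .2s;
}'
# Collect the property names used with a -webkit- prefix.
prefixed=$(printf '%s\n' "$css" | grep -oE -e '-webkit-[a-z-]+' \
           | sed 's/^-webkit-//' | sort -u)
for prop in $prefixed; do
  if printf '%s\n' "$css" | grep -qE "^[[:space:]]*$prop[[:space:]]*:"; then
    echo "$prop: unprefixed fallback present"
  else
    echo "$prop: prefixed only - breaks if -webkit- support is dropped"
  fi
done
```

In the sample, `border-radius` exists only in prefixed form, while `transition` has a fallback – so the audit flags exactly the declaration that a prefix drop would break.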

Let's apply pressure on Apple

This is primarily not about blame, but applying pressure where it is most useful.

Let's create an uproar in the web development world strong enough to generate bad will for Apple, forcing upper management to do the right thing!

Apple (and to a lesser extent Google) is sitting on the solution. Not Microsoft, not Mozilla, not Opera.

If they are getting away with acting badly, it is because we are letting them: letting love for the products they make get in the way of the higher goals, letting them behave in ways that were perhaps OK when the iPhone was launched and Webkit had 0 % market share on mobile, but not demanding that they change their ways to match the present situation.

Apple initially supported SOPA and PIPA – which speaks volumes about the default mindset within the company – but backpedaled. So even if they have claimed that they will never remove support for prefixes, let's create enough pressure that Apple will benefit from backpedaling again!

Right now the pressure is applied to Mozilla, Opera and Microsoft to do the right thing, even if it will hurt their market share – and by extension their ability to fight for a free and open web in the long run. We are asking those companies to sacrifice themselves for the common good and for higher ideals.

But Apple is either loved too much for anyone to act against them – or seen as a lost cause, an unfixable problem, the IT equivalent of North Korea.

As long as the web development community is stuck in that love/hate relationship with Apple, we can never act sensibly.

Who can afford it?

Apple has just had another record quarter and may even surpass IBM and HP in a few years.

  • Still they won't hire evangelists, as Opera, Google and Mozilla do, to teach best practices.
  • Still they won't hire full-time spec writers to get their proprietary extensions through the standardization process.

The takeaway from every Apple keynote I've heard in recent years is "concentrate on the iPhone and iPad".

This message needs to change!

The other main stakeholder in Webkit is Google. And their business is not shabby either. In fact it is Android and Chrome that have made Webkit the most popular browsing engine in the world.

But Google embraces open more than Apple. Google may very well be quite easily persuaded to stop supporting prefixed properties for mature technologies, in my opinion.

It is Apple that primarily needs to be pressured.

What's at stake?

Since Mozilla not only develops their browser openly, but has an open process for just about everything, I stumbled upon their internal discussion about supporting -webkit- prefixes a few months ago. First of all, it is quite clear that nobody within that organization likes the idea of supporting -webkit- prefixes. Even those who support the idea do so with the utmost distress.

But except for some Gnu/Linux distros, Gecko is not the default rendering engine anywhere. Opera is in a similar position. For the majority of users there is but one incentive to switch: a better user experience.

That is the bottom line.

Not ideology, but my experience as a user today, is what makes people switch.

However, since Mozilla and Opera are to a high degree value-driven organizations, it is easy to think that it is "their job" to keep the web open. But if the right decisions come at the cost of minuscule market shares, they won't have much power to wield anyway.

And please remember that Opera and Mozilla did go down the path of ideology concerning video codecs. Would that have been possible today when their market share is falling?

It's time for Apple to pay back

Apple has never been a true champion of the open web or indeed of anything open – despite the fact that open technologies saved Apple from ruin. That, and two important gifts from Microsoft: cash in return for stock that gave no voting power, and a commitment not to stop developing Office for Mac OS.

Back in the 90's Apple struggled to develop a modern and technologically sound operating system. Copland failed. Taligent failed. And so did a bunch of other efforts.

The number one option was to buy BeOS and that deal almost came through. (Imagine the world if it had! Steve Jobs' return would probably never have happened...)

However, what Apple could not develop in house they found in open source: BSD Unix became the foundation of Mac OS X, and KHTML became the rendering engine in Safari.

But even this might not have been. The KHTML developers came very close to throwing in the towel and switching to Gecko.

Once again, what if they had? What if they had not persisted – thanks to ideology, not business considerations?

Projects driven by ideology provided Apple with the core technologies from which they have built Mac OS X, iOS and the Safari browser.

And Apple also benefited from Firefox. It was Firefox that wrestled the web away from Microsoft. It was Firefox embracing standards that provided the opportunity to develop a browser like Safari. Remove Firefox from the equation and there would never have been a superior browsing experience on the iPhone. That experience was made possible by developers embracing standards, thus eliminating the Microsoft monoculture.

So why should Apple help the world? Because the world first helped Apple!

Once again, we should apply pressure where pressure is due.

Unconditional love and unconditional hate bestowed upon Apple will both just make the company worse. And as for the latter (hate), did you notice that Apple actually backpedaled concerning SOPA and PIPA? Given enough pressure the company actually can abandon bad ideas.

If the web development community keeps praising Apple as if the company does nothing wrong – or at least nothing wrong that can't be excused – or if we treat Apple like a lost cause, a closed brainwashed sect incapable of change, then by all means, put all pressure on Mozilla, Opera and Microsoft.

But please acknowledge that in so doing we are forfeiting our best chance to keep the web open. Remember, this is not so much an issue of who is to blame, as it is an issue of how it could be fixed.

Sunday, November 20, 2011

Still using Firefox because privacy matters

I am still using Firefox as my number one browser, occasionally switching to Opera, and occasionally trying out Chrome (on my laptop). Why am I so stubborn, some Chrome fans will ask? And why am I not using the default Android browser on my slate? And why did I not end up buying an iPad?

It all boils down to two points: The browser ecosystem and the browser ethos.

The web is more important than apps

Sometimes I hear people say it's all about the apps. I disagree. The web is the only truly open and life-changing platform. I did not choose Android over iOS because of usability or design concerns. I'd say it is still slightly – but only very slightly – behind. My new phone is a Nokia N9. Neither Android nor iOS, but Meego! And yes, it has a few hundred apps, including a Dropbox client and a Google Reader client.

(By the way, the user interface on the N9 is just fantastic, it beats both the iPhone and Android!)

However, both the Android tablet and the Meego phone let me run Firefox. And that is nowadays a sine qua non for me when choosing a device.

Every major browser has an ecosystem nowadays

Of course all browsers can surf the web, and in that regard the web is their ecosystem. But beyond that they come with additional features.

  • Internet Explorer's ecosystem is that of corporate intranets, Windows and the occasional web site that still requires ActiveX. I know of few people who enter this ecosystem by choice, except gamers, but it is easy to get support. And all your games are playable on hardware that is far cheaper than anything from Apple. And Windows really does have the best graphics drivers and infrastructure for sound.
  • Chrome's ecosystem is of course Google products: G-mail, G-cal, G-reader, G-docs and G+. You're at least considering getting an Android phone and think that Chrome OS has a lot more potential than it has shown so far. You are obviously not so concerned with the privacy of your data.
  • Safari's ecosystem is of course the Apple products. If you are into Apple, you're using a Mac, and every program in sight that starts with an i. You had an iPod, now have an iPhone and you will get an iPad – not because you need it, but because it's made by Apple. In short, you enjoy the shiny prison of being locked into Apple. Censorship and authoritarian control of the platform be damned – the bars are golden!
  • Opera thrives on mobile and on devices. They have managed to make web surfers out of millions of people who never could afford a modern smart phone. In some ways, though, they struggle, since the entire web universe basically is their ecosystem – except for sites made by incompetent developers that shut Opera out for no good reason. There are things like Opera Link and Opera Unite, but they have not got the buzz they so rightfully deserve. The fact that Opera pioneered many things now common to all browsers, like tabs, and has always been a true champion of standards, even before Mozilla appeared, makes them worthy of tons of respect.

So, what makes Firefox's ecosystem so special? It boils down to one thing. Firefox is everything!

  • It is not the oldest champion of web standards among browser vendors, as that honor goes to Opera, but it is a good runner-up, and it was Firefox that de facto wrestled the web away from Internet Explorer, providing room for all other browsers to exist as well. Firefox paved the way for both Safari and Chrome!
  • It is not the true speed king, as that title goes to Chrome or perhaps Opera, which are faster on most – but not all – benchmarks as far as I can tell. But it is fast enough. And it championed the cause of being lean – once upon a time! Even today it consumes far less system resources than most other browsers.
  • It is, however, the true champion of add-ons. And it has the best add-ons for my needs.
  • Firefox has the best sync. First of all it is client-side encrypted, meaning no data mining opportunities for Google or anyone else. But what makes it stand out is the syncing of tabs. It's a time saver and a life improvement factor! I use it all the time, moving between computers, my phone and my tablet.

The right focus for the future

The economy of the future in IT is data driven. In a world of ubiquitous computing, stewardship of our data in the cloud is the main prize for all the big players to fight over. Through pads, tabs and boards data will be accessible in lots of places. But whoever is in control of the cloud is the winner of the future.

But will cloud stored data be accessible to all? Or only a select few that use the right brand?

And will it be inaccessible to anybody except those that I chose to share my data with? Or will the host continually mine my data and in the worst case scenario share it without my consent?

In this regard, Apple's guiding principles fail totally. The company shows close to zero interest in interoperability, and since margins shrink when selling cheap devices, this becomes a global problem. By building digital walls around the data in the richest countries, we are once again failing all non-western countries.

By the way, the Arab spring was not brought to the world by Apple, but by Nokia…

Google might not have failed yet. Interoperability is high on their agenda and so far their data mining seems to be mostly anonymized. But when one single entity sits on too much data and controls the complete surrounding ecosystem, we are providing someone with enormous temptations. Sooner or later that temptation will become too strong!

Therefore we need truly anonymized cloud technologies. We need user-owned and controlled data. We need stuff like BrowserID to replace the id solutions provided by Google, Facebook, Twitter, Microsoft or Apple.

In short, we need what Mozilla and Opera are championing. And for me personally that means that if you see me using a browser for anything but testing, that will still be Firefox for the foreseeable future, unless I change to Opera!

Monday, June 20, 2011

Will my next laptop be a Thinkpad?

It’s time to move on from my trusty old Thinkpad z61p. It has been a dear friend, but it’s simply not anywhere near state of the art anymore. Still, having lasted over 4 years and many thousands of hours of heavy-duty use, it deserves high praise indeed.

So what do I want? I had hoped for an upgrade of the Thinkpad W701, using Sandy Bridge and Thunderbolt – but instead the series seems canceled. Where is the follow-up to the W701? Why, oh why have you forsaken me, Lenovo? (I actually don’t know. Lenovo’s web site is basically useless – it takes a ton of work to find the information I actually want.)

I want a workstation, movable but not ultra portable. More power. And it may weigh a few kgs.

Stuff the computer must have

  • A trackpoint – a high quality, usable Trackpoint. Does any other line of laptops except Thinkpads have those?
  • A really good keyboard. Once again, can I trust any other manufacturer than Lenovo?
  • Preferably 17 inch, high resolution (1920px wide), high contrast – matte. Not glossy! (Maybe I must settle for 15 inch, but it would be very disappointing.)
  • A high speed SSD, as big as possible.
  • It should run Linux with no hiccups. My distro of choice is Fedora. For testing purposes I may shrink and keep the Windows partition, but it will see very little usage. (If buying an OS-less laptop were an option, I’d choose it.)
  • Firefox and Chrome should be able to run WebGL on Linux, thus the graphics drivers must be on their whitelist.
  • DisplayPort or HDMI.
  • Service should be available in Sweden, so lesser known brands are out of the question.

All of the above are top priorities. And I am prepared to spend quite a lot of money on this purchase.

Then there are a few things that go without saying:

  • High end modern CPU, like the 2920XM or one step down from that one.
  • At least 8 GB of RAM, DDR3 or better.
  • Gigabit Ethernet.
  • IEEE 802.11a/b/g/n.
  • Bluetooth 3.0.
  • At least one USB 3.0 port.
  • Large and high performance SSD.

And there is also a list of nice-to-have features on my wish list:

  • Spill proof keyboard.
  • Back lit keyboard.
  • Thunderbolt.
  • 2nd drive for storing large files that I do not use very often: video, images, sound. (Yes, the W701 had 2 drives...)
  • A Firewire interface.

I prefer a 2nd drive to optical media. If there is only room for one drive, chances are high I will ditch the optical drive and use that space for an extra hard drive.

So, what are my choices? Right now the list looks like this:

  1. Thinkpad W520, but it has only got a 15 inch screen…
  2. HP EliteBook 8760w, but will the Trackpoint be good enough… (and I can't find a model with the 2920XM processor; OTOH I can get it with a better graphics chip, up to the Quadro 5010M, but that GPU consumes a whopping 100 W and is probably a bit too much, even for me.)
  3. Fujitsu Celsius H910. The specs look nice. I can have up to 2 x 250 GB SSDs, but the keyboard and Trackpoint are not comparable to a Thinkpad's.

Given my set of requirements, what other options do I have?

And the key question: Is there any hope for a 17 inch Thinkpad W7xx to be released within the next 3-4 months?

(And no, a Mac is not and never has been an option. And yes, I have plenty of experience with Macs and know perfectly well how they compare to Thinkpads.)

Wednesday, May 11, 2011

Firefox Aurora and Nightly on Linux

A script to automate downloading and installing Firefox Aurora and Nightly on Linux.

ffupdate.sh


#!/bin/bash
FTPDIR="http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly"
AURORA="firefox-5.0a2.en-US.linux-i686.tar.bz2"
NIGHTLY="firefox-6.0a1.en-US.linux-i686.tar.bz2"
DLDIR="Hämtningar"   # my Downloads directory
#
cd ~/${DLDIR} || exit 1
rm -rf firefox*
test -d /opt/aurora || sudo mkdir /opt/aurora
wget ${FTPDIR}/latest-mozilla-aurora/${AURORA}
# Braces, not parentheses: an exit inside ( ... ) only leaves the
# subshell, and the script would happily wipe /opt/aurora anyway.
test -f ${AURORA} && tar xjf ${AURORA} || \
  { echo "Aurora download fail" >&2; exit 1; }
sudo rm -rf /opt/aurora/*
sudo mv firefox/* /opt/aurora/
rm -rf firefox*
test -d /opt/minefield || sudo mkdir /opt/minefield
wget ${FTPDIR}/latest-trunk/${NIGHTLY}
test -f ${NIGHTLY} && tar xjf ${NIGHTLY} || \
  { echo "Nightly download fail" >&2; exit 1; }
sudo rm -rf /opt/minefield/*
sudo mv firefox/* /opt/minefield/
rm -rf firefox*

To be improved and adjusted when new versions appear, but it does the job for now.
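One obvious improvement would be to stop hardcoding the version number and instead scrape the current tarball name from the directory listing. A sketch – the listing below is canned, and the URL layout and file naming are assumed to stay as in the script above; a real run would fetch the listing with wget -qO- on the latest-mozilla-aurora directory URL:

```shell
# Sketch: pick the current en-US linux-i686 tarball name out of an HTML
# directory listing, so the version no longer needs hardcoding.
latest_tarball () {
  printf '%s\n' "$1" \
    | grep -oE 'firefox-[0-9][^"<> ]*\.en-US\.linux-i686\.tar\.bz2' \
    | head -n 1
}
# Canned listing for illustration only.
listing='<a href="firefox-5.0a2.en-US.linux-i686.tar.bz2">firefox-5.0a2.en-US.linux-i686.tar.bz2</a>'
latest_tarball "$listing"
```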

Excerpt from .bashrc


alias ff36="/usr/bin/firefox -no-remote -P ff36 &"
alias ff4="/opt/firefox4/firefox -no-remote -P ff4 &"
alias ffclean="/opt/firefox4/firefox -no-remote -P clean &"
alias ffbeta="/opt/ffbeta/firefox -no-remote -P beta &"
alias ffaurora="/opt/aurora/firefox -no-remote -P aurora &"
alias minefield="/opt/minefield/firefox -no-remote -P minefield &"

Friday, November 26, 2010

Tidy5 aka the future of HTML Tidy

UPDATE 2011-11-19: The most immediate of my concerns have been addressed by Björn Höhrmann, who has added basic support for HTML5 in Tidy to a forked version available on Github.

I have been a long-time fan of Tidy, a tool to clean up and perform some basic checks on HTML code. However, the tool is not really being updated any more, and since I have moved to using HTML5 and ARIA on all my new projects, it has lost much of its usefulness.

I also see no momentum picking up and thus think we should consider folding Tidy into html5lib. By that I mean using html5lib to get Tidy-like functionality.

Today I wrote a mail that I cross posted to the discussion list for Tidy and the help list for WHATWG. This blog post is essentially a longer version of that email.

Tidy must go HTML5

Here is the deal with HTML5. Pretty soon every browser will have an HTML5 parser. Except for IE, browsers do not have multiple parsers.

This means that tokenization and DOM tree building will follow the rules defined in HTML5 – as opposed to not really following any rules at all, since HTML 4 never defined them.

Simply put, there is no opting out of HTML5. An HTML 4 or XHTML 1.x doctype is nothing more than a contract between developers. Technically, all it does is put the browser in standards compliance mode.

Thus, I do not see any future in a tool that does not rely on the HTML5 parsing algorithm. Tidy cannot grow from its current code base, but needs to have at its core the same html5lib that is in the HTML5 validator, which basically is the same as the one being used in Firefox 4.

Additionally, Tidy suffers from:

  • Implementing WCAG 1.0 checks in a world that has moved on to WCAG 2.0.
  • Not recognizing ARIA, an extremely valuable technology on the script-heavy pages of today.
  • Not recognizing SVG and MathML.

I know one can set up rules to enable Tidy to recognize more elements and attributes, but for full HTML5 + ARIA + SVG + MathML (and perhaps RDFa), that is simply not doable without superhuman efforts.

The merge

A basic Tidy5 implementation could look like this:

  1. Parse the tag soup into a DOM.
  2. Serialize HTML from that DOM.
  3. Compare the start and the end result.
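In shell terms the compare step could look like the sketch below, with a trivial normalizer standing in for the real HTML5 parser and serializer – this is purely an illustration of the idea, not how Tidy5 would actually do it:

```shell
# Illustration of step 3 only: "serialize" (here: crudely normalize) both
# the input and the cleaned-up result, then compare. A real Tidy5 would
# parse with the HTML5 algorithm instead of this toy sed/tr pipeline.
normalize () {
  sed -e 's/^[[:space:]]*//' -e '/^$/d' | tr '[:upper:]' '[:lower:]'
}
input='<P>Hello
<p>World'
result='<p>Hello
<p>World'
if [ "$(printf '%s\n' "$input" | normalize)" = "$(printf '%s\n' "$result" | normalize)" ]; then
  echo "no differences worth reporting"
else
  echo "report a diff to the author"
fi
```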

Perhaps error reporting can be done during the parsing process. Henri Sivonen could probably answer whether that is possible.

However, there is also talk about having a lint like tool for HTML, that goes beyond what the validator does. So in addition to the above, there can be settings for stuff like:

  • Implicit close of elements. Tolerate, require or drop all closing tags?
  • Implicit elements – tolerate, require or drop (maybe require body but drop tbody...)?
  • Shortened attributes – tolerate, require or drop?
  • HTML 4 style type attributes on <script> and <style> – tolerate, require or drop?
  • Explicit closing of void elements – tolerate, require or drop?
  • Full XHTML syntax (convert both ways)
  • Indentation. Preferably with an option to keep block elements with very short text content from being broken up into three rows, as Tidy does today.

Besides purification and linting, such a tool/library can be used for:

  • Security. This will require the possibility of white and/or blacklisting elements and attributes. And preferably also attribute values.
  • HTML post-processing. This would let authors see indented, explicit code, while such "waste" is removed before gzipping. This would be akin to JS minification and could be performed on the fly from within PHP, Python, Java, Ruby, C#, server side JS or whatever. It can also be done manually before uploading from the development environment to production – or it could be integrated into the uploading tool!
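As a crude illustration of that post-processing step, here is how stripping author-only indentation before gzipping could start out. Note that this sketch is not HTML-aware – whitespace is significant inside elements like pre and textarea, so a real tool would have to parse the markup first:

```shell
# Crude sketch of pre-gzip whitespace stripping. NOT HTML-aware: it would
# mangle <pre> and <textarea> content, so a real tool must parse first.
html='<ul>
    <li>One</li>
    <li>Two</li>
</ul>'
minified=$(printf '%s\n' "$html" | sed 's/^[[:space:]]*//' | tr -d '\n')
echo "$minified"   # <ul><li>One</li><li>Two</li></ul>
```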

Checking templates

The main feature that Tidy has today is the ability to handle templates, by preserving/ignoring PHP or other server side code. To what extent the HTML5 parser can be modified to handle that feature, I do not know.

From a maintenance and bug fixing point of view, I see huge wins in having a common base for Tidy, the HTML5 validator and HTML parsing in Gecko.

In fact, a very radical idea for Firefox (or any other browser using html5lib) would be to actually integrate these tidy-inspired features directly in their development tools, re-using the existing parser! A Firebug extension that lets me validate as well as tidy up my code directly within the browser would be super awesome!

But the actual possibility thereof is beyond my technical knowledge to evaluate, so I need to hear from people who know this stuff better than I do.

Integration with accessibility checking

Although automatic testing cannot substitute for manual tests, it can give a developer a ballpark idea of the accessibility of a page and help fix the most obvious mistakes.

The fact that Tidy today does integrate WCAG 1.0 is better than nothing, and any implementation of Tidy5 should strive to integrate WCAG 2.0 in a similar fashion. That really is a no-brainer. Having to use only one tool and getting all errors in the same buffer (for programmers) or the same console (for manual checks) is certainly convenient.

OK, that was my two cents. What do you think?

Monday, October 18, 2010

How to know if you are watching a bad JavaScript tutorial for beginners

Answer: You are watching it. There are no good tutorials for beginners anywhere!

People who actually know enough to teach JavaScript properly don't have the time to teach beginners, since they get paid to work for clients – and when they do teach, they do so at conferences or seminars in front of their peers, not to newbies. That means that what's available on-line for beginners is stuff taught by people who do not know enough to actually teach.

For me as a professional teacher that means that I have to devote considerable time to warning my students about bad habits, thus wasting valuable lesson minutes, which turn into hours. And there is always someone who missed a class and tried to compensate by Googling or YouTubing – something teachers normally would encourage – but who will turn in bad work that I have to give a lower grade. Thus, the worst case scenario is that students are actually punished for taking an initiative.

Bad pedagogy

One of the things I've learned developing for the web for 15 years, and teaching it for the last 9, is that it is way more effective to learn things the right way from the start, and then be warned about some bad or outdated habits to avoid. Learning bad stuff and then un-learning it will harm students' motivation and confidence (how can I trust anything I read?) and slow down the learning process.

Also, to a newbie it is not immediately apparent why a particular way of doing things is bad, so they tend to stay with their bad practice, even though they may hear that they shouldn't. "But it works!" Of course I answer something like "Yes, but it's fragile, not scalable, hurts performance and violates good style", but it will take a very long while until I can actually show them that this is the case, and a demonstration is worth more than a thousand words.

And habits, once formed, have a tendency to become automatic, instinctive. They are hard to un-learn.

Don't believe me? Look at the praise in the comments for this super crappy JavaScript tutorial on YouTube. People are saying things like "now I know how to do it", when they should be saying that now they know how not to do it.

What's out there?

My students belong to the YouTube generation. They prefer videos to words, demonstrations to articles, images to text. Thus, nowadays the first place I look for material to use is sites like YouTube or Vimeo, but every single beginners' tutorial I've found so far is crap. They all exhibit most of the following features:

document.write()
All by itself, document.write() is bad and should be avoided. This is way more important than keeping the "hello world" tradition alive. If you really must do that, at least use alert() since it does not alter the DOM and thus does not give a wrong impression…
document.write() is giving the impression that JavaScript has a linear execution model and that document.write() works like echo in PHP, or cout.
The faster students get into an event-loop/event-driven programming mindset, the better!
Inline event handlers
As PPK says, don't use them.
Hiding JavaScript from old browsers using HTML-comments
Yup, in the year 2010 some wannabe teachers think you should still be nice to Netscape 1.2 and early versions of Mosaic! Monkey see, monkey do.
Setting the language attribute on script tags
Now that we are learning that even the type attribute is not needed any longer, I still see people use the language attribute.
It is OK to omit semicolons at the end of a statement
This is telling students that it is OK to rely on automatic semicolon insertion.

I am teaching this stuff for a living and you really do not need any of the features listed above as an intermediate step to learning. When I teach my students, I only mention these in passing, saying that:

  • If you use them, I will fail you on the course, or at least give you a low grade. (Yes, I will!)
  • If you receive a tip from someone on a forum, or by Googling, suggesting that they be used, do not listen to that person any more. Learn to treat these as a sure sign of bad advice.

Which brings us right back to our problem. Where is the good advice?

There are some good things to read, but as I've said, I want videos as well.

What I do as a teacher today

I recommend watching the way too advanced videos, but only the first couple of minutes, meant by the speakers as a recap only. I tell my students to watch the first 20 minutes of this video with Robert Nyman and the first 18 minutes of this one with Douglas Crockford.

That is not something they do instead of listening to me or reading, but this semester I have occasionally been called away on other business and also missed a few days due to illness (nothing serious).

If not for these, I might as well let my Swedish students look at this one. At least they won't be taught something bad!

(No, I do not speak Hebrew and I have no clue if that is a good video or not.)

And I enforce the following rules:

  • You shall validate your HTML early and often.
  • You shall validate your CSS early and often.
  • You shall learn how to use JSLint and use it early and often.

Linting is not an advanced topic. It is an essential tool to encourage good habits for newbies. Besides, validation and linting are not something you do at the end of your development process just to get a stamp of approval. They are tools to help you along the way.

In a similar fashion, I introduce PHP_CodeSniffer very early in my teaching of PHP. In fact, with every year of teaching, the introduction of these tools has come earlier and earlier. My web design students this year were introduced to the HTML validator in the very first lesson that I devoted to HTML.

Cursing darkness or lighting candles?

This blog post is basically me ranting again. However, I have put in some considerable effort (if I may say so myself) to make the JavaScript teaching world a better place, through my work on the InterACT curriculum for DOM scripting for the Web Standards Project.

If you are somewhat knowledgeable about unobtrusive JavaScript, and the good parts of the language, please take a look at the competency table and provide a video explaining one or two bullet points from it. You do not have to be Douglas Crockford, but should at least have heard a couple of his speeches and have read his book.

In so doing you will not get the reputation of being a ninja or guru. You won't get a chance to show how clever you are, but guess what? Rookies eventually grow up to become intermediate programmers, and they may very well evolve to become rather advanced one day. They will go to conferences and hear you speak on the advanced topics – one day. If you did provide them with something of real value when they took their first steps, they might just value that experience and return to hear some more in the future.

Bonus teaching tip

Use the JavaScript shell while talking about basic things like types, variables, operators, expressions, statements and blocks.

Since the students are not in the browser environment, they are less likely to transfer bad habits to real work. The fact that they are in a shell makes it very clear that what they are seeing is not definitive. Even so, I will nevertheless point that out to them, again and again. One must never take stuff like this for granted, and as a teacher one always has to repeat oneself…

Oh yes, when teaching PHP I use the PHP shell as well.

The better version of this post

Chris Williams talks about how JavaScript education must be revolutionized (and other things) in this talk from JSConf.eu 2010. (transcript)

Saturday, September 4, 2010

Why H.264 is disqualified from being a web standard

In short: H.264 can never become a standard for web video as long as the patents are not released according to W3C patent policies.

So far the MPEG-LA consortium has shown no interest whatsoever in releasing their patents in a W3C-compatible way. Thus the question is answered: H.264 is not even a candidate for becoming a web standard. It can't win that race, since it's not even in the running!

The W3C patent policy

The goal of this policy is to assure that Recommendations produced under this policy can be implemented on a Royalty-Free (RF) basis.

You see, it's not a case of Mozilla and Opera being obnoxious. In fact they are only fighting for the same thing as the W3C is: an open web. Video should be open just like HTML, CSS and the DOM are open.

Yes, the W3C mandates that all standardized web technologies be free for everyone and for all types of usage, and that, insofar as they are covered by patents, the owners of those patents legally commit to not blocking such usage.

But H.264 is implemented natively in the browser

Yes, it is. In fact there is no law against implementing anything that is not a standard or being considered for standardization. Google implements Flash, within the browser, but that does not make Flash a web standard.

H.264 is usable and, since the drivers are mature, technically still the best option for delivery on mobile platforms (more on that later). In fact, were the MPEG-LA consortium to release their patents in a W3C-friendly way, it would be an excellent web standard. I don't see that on the horizon, though.

The fact remains that H.264 is a proprietary, patented and closed technology. Some vendors have bought themselves the right to use that technology and others perhaps could, but that is not the kind of freedom web standards should be made of. I find it very ironic that people fighting for free and open web standards for markup and stylesheets, for scripting and for graphics (PNG, SVG, etc) and for net neutrality and universal access are so quick to sell out their ideals when it comes to video.

Since H.264 video is implemented natively in some browsers, we can do stuff with it that we otherwise perhaps could not. But there is still precious little we can do that could not be done in Flash. Really. At least when you look at the end result, not at how it's done.

Don't get me wrong, I like having native video in the browser, but native does not equal open.

An aside: The bigger issue

There is one ideal solution to this problem of course. The USA should change its patent system, which is flawed and broken beyond usefulness. Patents are granted for user interface ideas, algorithms and all kinds of obvious stuff.

If I were an American I'd write my congressman and ask him or her what they are doing about this. And if I weren't content with the answer I'd vote for somebody else, and I'd let everybody know I did. If the USA changed its laws, most of the world would follow.

However, since a change in US patent law is not going to happen soon, we are stuck in this mess for the foreseeable future. So what do we really do about it?

Could the MPEG-LA consortium be persuaded to change its mind?

Here is an idea: Let's have Hixie add H.264 to the HTML5 spec and release that spec in such a way as to start the W3C patent clock. That would mean that any patent holder who feels that their patent is being infringed must protest.

There could be two outcomes. The MPEG-LA could show its true colors and protest, or it might succumb to the pressure and actually change its policies. The first alternative would perhaps silence everyone who thinks H.264 is free enough; the second alternative would really make H.264 free enough!

I doubt Hixie would include H.264 in the spec in order to float a balloon like this, though. But it's a fun thought.

The real solution: Solve problems that can be solved

The one strong argument in favor of H.264 is hardware acceleration, especially on mobile platforms like phones, netbooks and pads. But bringing VP8 to a comparable state is within our grasp. The hardware acceleration problem can be solved and it is an easier problem to solve than flawed US patent laws or changing the minds of stubborn MPEG-LA patent bureaucrats.

In order to understand this we must consider two things: What exactly is hardware acceleration and what is the expected lifespan of a web standard compared to the lifespan of current chip sets?

I'll start with the former. Video codecs will probably improve over the next couple of years, regardless of them being standardized. Smart people will conjure up better ways to reduce file size while increasing quality, or at least improving one of the two without hurting the other too much.

The question thus becomes, has H.264 been implemented in the layout of the transistors of modern GPUs in such a way as to make any other algorithm, or any variation of the algorithm impossible? That is, are the calculations required to encode or decode H.264 implemented in silicon in every minute detail and will electrons flow from transistor to transistor in a sequence that exactly matches H.264 encoding or decoding?

If that's the case, we have really dug ourselves into a hole. If that's the case, we've made it impossible to improve anything at all! Since new ideas can not use the GPU, they are doomed to be bad ideas!

But since it still takes a whole lot of code to actually write an H.264 encoder or decoder, the answer is of course no. Hardware acceleration of H.264 is not a magic black box.

A GPU is just a slightly different processor, optimized for some kinds of arithmetic that a normal CPU is not. There is no magic to it. It's just a layout of transistors. In the 80s and early 90s most CPUs could not do floating point arithmetic efficiently; one had to buy a separate piece of silicon for that (the 8087, the 287 and the 387). IBM recently introduced a CPU that has a core for decimal (sic!) arithmetic and does cryptography in hardware.

It's actually not about doing some stuff in hardware as opposed to other stuff in software. Last time I looked, the CPU was a piece of hardware! It's a matter of letting the right piece of hardware perform the kinds of computation it does best: writing and compiling your programs to use the integer part of the CPU when that's appropriate, the floating point part when that's appropriate, and the GPU when that is the most effective solution.

There is no technical barrier preventing VP8 or ogg/theora, or indeed any other software, from using the GPU. In fact, Microsoft is using the GPU to speed up core JavaScript arithmetic in Chakra. That's just one example of modern programs using the power of the GPU to do calculations that are not graphics related at all. So if that's possible, what says it's impossible to move arithmetic calculations to the GPU for video that is not H.264 encoded?

Mozilla has gotten the CPU usage for decoding ogg/theora down from 100 % on the Nokia N900 to just 20 %. And the main thing preventing that number from dropping further is the fact that the sound is still decoded on the CPU. But that's an obstacle that can be overcome as well.

The lack of so-called hardware support for ogg/theora or WebM is in fact not really a hardware problem, but a software problem. The decoders (and encoders) have not yet been written in such a way as to optimally harness the arithmetic power of the GPU. I expect this to change rapidly, though.

But maybe current hardware has been made with H.264 in mind, making it impossible for VP8 to fully catch up? Well, if the web industry showed clear support for the VP8 codec, AMD, NVIDIA and Intel would soon alter their transistor layouts in the next generation of chip sets, leveling the playing field.

In a very short time we will see WebM video implementations that move enough of the calculation to the GPU to make it usable on portable devices, using today's silicon. But for the sake of argument, let's suppose that watching WebM video would drain the battery of your cell phone 10-20 percent faster than H.264. How bad is that? It is still within a reasonable limit, I say. And HTML5 still lets you provide H.264 as a progressive enhancement to any client. But what's being argued (at least in this article) is what we should consider the baseline, what can become a true standard for web video.
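That progressive-enhancement pattern is, in a real page, just the order of `<source>` elements inside a `<video>` element. As a sketch of the same logic (the `pickSource` function and its argument format are mine, not a standard API; only the `canPlayType`-style answer strings come from HTML5):

```javascript
// Progressive enhancement for video: the open codec is the baseline,
// H.264 is the extra for clients that only decode that.
// canPlay mimics HTMLMediaElement.canPlayType, which answers
// "probably", "maybe" or "" for a given MIME type.
function pickSource(canPlay) {
  var candidates = [
    { src: "video.webm", type: 'video/webm; codecs="vp8, vorbis"' },
    { src: "video.mp4",  type: 'video/mp4; codecs="avc1.42E01E"' }
  ];
  for (var i = 0; i < candidates.length; i++) {
    if (canPlay(candidates[i].type) !== "") {
      return candidates[i].src; // first supported source wins
    }
  }
  return null; // no native support: fall back to e.g. a Flash player
}
```

Listing WebM first means every client that can play the open format gets it, while H.264-only clients still get video; nobody is shut out, and the baseline stays free.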

Let me say this as emphatically as I possibly can. Even if H.264 could be considered somewhat better than VP8 from a technical point of view, that still is not a good enough reason to let go of our freedom. Anyone who values a slight short-term technological advantage over long-term freedom needs a reality check and an ethical wake-up call!

What about submarine patents?

Microsoft and Apple keep talking about submarine patents, claiming they are a hazard to everyone implementing ogg/theora or WebM, and the MPEG-LA would like everyone to believe that it will soon smack down on the VP8 codec used in WebM video. Since not everyone smells the FUD, let's argue about this for a while.

If VP8 is indeed trespassing on H.264 patents, does that mean that anyone implementing a VP8 encoder or decoder can be sued? Could Microsoft be sued? Could Apple?

The premise for such a thought is that the patents for H.264 not only stipulate algorithms but also prohibit anyone licensing those patents from making any kind of alteration, not only to the individually patented techniques, but to the exact combination of them.

This is thus a legal variation of the hardware argument. It stipulates a lock-in mechanism to H.264 that prevents any kind of experimentation or improvement. All by itself that would be a bulletproof case against H.264. Who would like to lock the web into such a solution?

But of course this is not the case. One may use individual algorithms from H.264 together with new or altered algorithms. Anything else would be plain stupid!

And since Apple and Microsoft are licensees of the MPEG-LA patent pool (as well as contributors to it, although Apple has not contributed nearly as much as Microsoft has), they are authorized to use those patents. They have bought themselves the right to write software that uses those patents! So even if we admit – for the sake of argument – that the VP8 codec does infringe on H.264, what risk does that pose to Apple or Microsoft? None whatsoever!

If Mozilla and Opera are willing to take the risk of implementing VP8 without licensing anything from the MPEG-LA, what risk is that to Apple? In what way is that a threat to Microsoft? Having bought themselves the right to use all MPEG-LA patents, their risk is absolutely zero.

Bottom line: MPEG-LA will not sue Apple if they implement the VP8 codec. Nor will they sue Microsoft.

(Of course, one option for Apple would be to let anyone submit any driver they'd like to iOS. If it were a truly open platform, we would see a WebM-enabled version of Mobile Safari tomorrow, without Apple lifting a finger, without Apple programmers having to write a single line of code!)

H.264 advocates can not both have their cake and eat it too

On one hand we hear that VP8 is so similar to H.264 that it probably infringes on the patents guarding that codec. On the other hand we hear that it is so vastly different that we can not get hardware decoding. But which one is it?

If the algorithms are so similar that there is a patent infringement going on, it goes without saying that GPU-accelerated decoding of VP8 video can not be hard to implement. In that case, the silicon has already been wired to do these exact calculations.

If, on the other hand, the algorithms are so different that decent GPU acceleration is impossible, what makes anyone think that the MPEG-LA could sue you for using them?

I wish H.264 advocates would choose which of these two dangers we are supposed to be afraid of, because they are mutually exclusive.

Another example of mutually exclusive claims: on the one hand, the MPEG-LA supposedly owns so many patents that it is virtually impossible to write a video codec that does not infringe on them; on the other, the fear that there might be some third party that participates neither in WebM or Theora video nor in the MPEG-LA, but holds patents in secret, waiting for someone to implement them. A Paul Allen, but with an actual case. A troll with infinite patience that will strike just when WebM has taken off.

But if VP8 is so akin to H.264 that it infringes on their patents, what space would that leave for this third party troll? Very little I'd say.

Once again, I am not saying that one of these propositions is true. In fact I believe them both to be untrue. But I wish that H.264 advocates would agree on one argument, when mere logic dictates that one being true by definition means the other is not.

What kind of power do Apple and Microsoft wield within the MPEG-LA?

Speaking of lawsuits, the MPEG-LA is a consortium and it must act according to the will of its members. So if Microsoft and Apple really cared about open video, I have a suggestion for them. Use your muscle within that consortium that you are part of, and convince your fellow members that truly open video is a good thing™. Convince them to release H.264 in a way that complies with the W3C patent policy. Show us that you are submitting such proposals to the board, show us that you are arguing the case. Only then will your opinion be worthy of consideration.

Until that happens, H.264 can not be a web standard. Until that happens, it can in fact not even be considered for standardization.