Wednesday, March 25, 2015

Dart officially as Yet Another Transpiled Language

This is great news for the Web: no more effort behind the Dart VM, only on the language itself as transpiled to JavaScript.
This puts Dart as just another language like CoffeeScript, TypeScript, or any other that will desugar to JS.

Why this is good for the Web

The Web is far, far, far away from being a perfect, rock-solid platform, and the idea that every browser vendor should put effort into creating even more fragmentation on the Web through multiple integrated VMs has worried me since the very first Dart announcement: thank gosh it didn't make it!
On the other hand, having Dart only as a transpiled language means there will be more effort in transpiled-language tools, including better integration and better debugging possibilities for Web developers ... and This Is Cool!

Update

While it's confirmed that Dart won't bother the Web with its VM anymore, the language and its VM will still work on the server and other places.

Toward ES.next Anyway

It wasn't just me noticing that in the last few years Dart never impressed or showed better muscles, and if it has to desugar to JS then, I might be the only one here, but how about we just learn JavaScript instead? OK, somebody probably liked Dart, as somebody likes TypeScript, CoffeeScript and others ... and this is a free world so do what you think is best for your projects, but also ... @AlwaysBetOnJS!

Tuesday, March 24, 2015

Ain't that fetch!

Update II
It turned out that Microsoft did what was the simplest way to provide Streams for developers: it simply used what was there already, exposed within the readystatechange listener. That's a great, easy, and pragmatic solution for IE10 and above. I wish other vendors were like IE ... oh, irony!

Update I
Arthur Stolyar, aka @nekrtemplar, just explained in his gist why he's also disappointed about fetch. To cite his words:
I am not a "One particular high-profile JavaScript community member was unconvinced" or a spec's expert. I am a regular JavaScript/Front-end developer and this is how I see it from my point of view.
His technical explanation is even more detailed than mine so I suggest a read.
Chrome Canary is already exposing a fresh new Fetch API that has been emphasized in this clarification post from @jaffathecake.
I promised him I would write a counter argument on why not everyone, or at least not me, applauded it.
I will try to answer point after point, providing my point of view too, but before starting it's obligatory to underline that nobody is trying to block anything: hopefully the entire thing will be rolled out complete, and ASAP, with all details in.

Events VS Promises

The first example provided in Jake's page talks about a massively improved API. It shows some basic XHR logic I'd like to reproduce in a different light:
var xhr = new XMLHttpRequest();
xhr.open('GET', url);
xhr.responseType = 'json';
xhr.onload = () => console.log(xhr.response);
xhr.onerror = () => console.log("Booo");
xhr.send();
The above 5 lines of code have been described in the original example (no arrow functions there) as some vomit to mop up, 'cause the new thing looks way better:
fetch(url)
  .then(r => r.json())
  .then(data => console.log(data))
  .catch(e => console.log("Booo"))
;
The example ends up in a rather confusing ES7 block, and I'm not sure whether this is a "back to the vomit" thing or not ...
(async() => {
  try {
    var response = await fetch(url);
    var data = await response.json();
    console.log(data);
  } catch (e) {
    console.log("Booo")
  }
})();
If we talk about lines of code, fetch wins, but if we talk about clarity of intent, can I say XHR does it best?
  1. it's a GET, not a POST, PUT, HEAD, or others, it's clearly and explicitly a GET
  2. it comes with listeners, meaning it has onprogress included for bigger data
  3. it comes with .abort() as explicit intent, and the ability to add an onabort in order to react
I will come back to the latter point, but here we have yet another competition between Events and Promises. Of course Promises, on top of events, can be used to generate simplified APIs, but comparing them is like comparing apples and oranges: Promises and Events are not mutually exclusive patterns, it's the exact contrary, they can be used together. And indeed, since events are more suitable for recurring invocations of the same callback during some streaming operation, we cannot blame Events for having an ugly API there: we simply need them.
My point here is that wrapping XHR through a Promise was already possible, and it has been done already, so this alone does not seem to be a reason to applaud this latest API. No clap clap here, I am sorry, not for this.
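Just to show what I mean, something along these lines (a rough sketch of mine, not code from any spec or library, and getJSON is just a placeholder name) was already doable on top of XHR:
function getJSON(url) {
  var xhr = new XMLHttpRequest();
  var promise = new Promise(function (resolve, reject) {
    xhr.open('GET', url);
    xhr.responseType = 'json';
    xhr.onload = function () { resolve(xhr.response); };
    xhr.onerror = xhr.onabort = function () { reject(new Error('Booo')); };
    xhr.send();
  });
  // keep the explicit intent to cancel around, something the fetch chain loses
  promise.abort = function () { xhr.abort(); };
  return promise;
}

getJSON(url)
  .then(function (data) { console.log(data); })
  .catch(function (e) { console.log(e.message); });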

It isn't done yet

When I've read this part:
What exists in Chrome today doesn't cover the full spec, and the spec doesn't cover all of the planned features. In some cases this is because they haven't been designed yet, in others it's because they're dependent on other in-progress specs.
I could not help myself thinking about a tweet of mine, posted a few days ago:
@dfkaye what I think is that first few lines of any new proposed Standards body API should be about "what real-world problem is this solving"
So I'd like to understand if new APIs just come out "because", or if there are real reasons behind pushing these out.
I am 100% sure that the WHATWG does its best, it works hard as a group, and the aim is to provide top-notch solutions for our daily needs, but honestly ... to quickly answer Jake's following disappointment:
Bit of a rant: it bothers me that as developers, we preach iterative development and release, but when we're the customers of that approach the reaction is all too often "HOW DARE YOU PRESENT ME WITH SUCH INCOMPLETE IMPERFECTION".
The way I felt about this was rather like the following (which is just a satiric parody in my mind):
  • Some guru at WHATWG: we need a better version of XHR
  • Her colleague: Totally, how should we start?
  • Some guru at WHATWG: let's think about a simplified entry-point API to start with
  • Her colleague: Totally, gotta Fetch em'all via Promises!!!
  • Some guru at WHATWG: well, actually something that .. you know ... Network, something cancelable by default ...
  • Her colleague: Totally, gotta Fetch em'all via Generators!!!
  • Some guru at WHATWG: ... yeah right ... let's stick with the first idea that breaks only the Promises principles in ES6 ... and re-iterate down the road
And again, the way I see that picture is like:


Because the thing has been planned from scratch without brakes in mind!

Having the ability to stop a download, a streamed content, or the upload of the wrong image should have been the very first thing to think about for a new Network-based API, riight?
I am simply astonished that the entry point for the new XHR replacement has had a somehow limited/broken heart since the beginning, and for the entire ES6 era, because as far as I know native Promises cannot be canceled (yet, 'cause they will eventually ... they have to!).
I wonder then if this is really the right way to roll out new APIs ... forcing us to rush solutions in order to also fix a core problem in Promise-land?
Where was the announcement that a new Network API based on Promises was coming? Have I missed it? The answer is probably yes, since it's even in the specification, but then can I say I really didn't see it coming at all?

Developers VS Standards

To keep answering Jake's disappointment: as a developer, I want to trust that standards bodies will choose the right tool for the job.
I like all the open channels, I like the iterative model, but I honestly don't have time to be the supervisor of everything, so my apologies if I haven't read every single thing that people are paid to write and publish.
As a developer, I'd also like to trust standards bodies in another way: if I've already learned something, like the XHR Level 2 API, and if that API has already received nice-to-have upgrades such as onload and onprogress, maybe just one tiny extra update to bring in onstreaming would have made such an API even better, and we could have let libraries and wrappers figure out the best approach to Promisify all the things, standardizing it once it became de-facto (obligatory mention to querySelectorAll indeed).
This is not me complaining about nicer, easier-looking APIs, this is me wondering what the rush is about, and why take so many steps at once, as in the Fetch API and everything else around it.

Not All Disappointing After All

I've hopefully explained why, at the very beginning, I could not believe my eyes, but also, thanks to Jake, I've understood there are surrounding parts that come somehow for free, like Streams, readers, no-cors, ServiceWorkers integration ( through events and addEventListener :trollface: ) and more.
I honestly wish Streams were usable directly as a raw-level API, and probably that's going to happen or has happened already. Now that I know about all these things around "just fetch", I can't wait for all of this to be fully out, and not just a little bit, 'cause as it is, it looks Promising (dehihihi) but it's not worth using yet due to limitations, especially around the current initial cancel-ability (partially in, or partially buggy, here and there).

About XHR: Yes, It Is Still Good!

If there was a time to improve it, it wouldn't be 16 years ago ... that was Microsoft, not XMLHttpRequest, so I am not sure why Jake said that in his blog post, since the first working drafts seem to be dated 2006.
However, XHR has been revolutionary for the entire Web and it eventually made people aware of how the internet works: not everything is a GET, you can PUT, POST, HEAD, and it has been used to create everything cool today about Google services, social networks, etc.
It's already easy to do what it can do, so I wouldn't say it has been simplified because of a .then.
I can also still drag and drop images to upload in Gmail, or TweetDeck, or somewhere else, entirely through XHR. I can post data, show progress bars, fetch JSON and do CORS. I can do pretty much everything but control streams ... and you know what? I think XHR deserves a bit of extra respect.
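Just to put that in perspective, this is more or less all it takes (a sketch; uploadForm and the progress element are purely illustrative names) to post a FormData payload and show upload progress through XHR:
function uploadForm(form, url, progressBar) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', url);
  xhr.upload.onprogress = function (e) {
    // some servers do not send a total, hence the guard
    if (e.lengthComputable) progressBar.value = e.loaded / e.total;
  };
  xhr.onload = function () { console.log('uploaded', xhr.status); };
  xhr.onerror = function () { console.log('Booo'); };
  xhr.send(new FormData(form));
  // the caller can still invoke xhr.abort() whenever the user changes their mind
  return xhr;
}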
I wouldn't spit on something that is almost 100% consistently adoptable these days, without libraries, when most people have just used it to GET some JSON and nothing else for the last 10 years (the reason Promise-based Ajax calls became popular: most developers use 10% of what XHR can do).
Accordingly, if there is one thing I absolutely don't like about that post, it is the following sentence:
XHR was an ugly baby and time has not been kind to it. It's 16 now. In a few years it'll be old enough to drink, and it's enough of a pain in the arse when it's sober.
Come on Jake, XHR is still amazing, easy as hell to use, and it still has a lot to offer: abortability with optional reaction through listeners, good integration with FormData, progression, multiple listeners so it can notify multiple "chains", and everything else we have known and used for the last 16 years indeed. If it was that bad, I am pretty sure it would have disappeared way before, and the Web as we know it would probably be behind ... don't you think?
Cheers

Friday, March 20, 2015

Bringing Custom Elements to IE8

Update
The video of the event that brought Custom Elements down to IE8 and explains the role of Custom Elements on the Web is online!


While preparing my talk, slides, and demos for last Wednesday's Frontend.fi event, I decided to rethink what was truly needed in order to make my Custom Elements polyfill compatible with IE8 too.

Brief history about my polyfill

When I first wrote my document.registerElement polyfill I had different targets in mind: everything on mobile that was not covered by the webcomponents.js polyfill used by Polymer and X-Tag should have worked as well.
I was working on some R&D project at Twitter at that time, and the idea of having Custom Elements for Android 2, Blackberry, Windows Phone 7, and all the other OSs ignored by the Google library, which mostly targets modern Desktop browsers and aims to cover the whole WebComponents family instead of just Custom Elements, was truly tempting.
The challenge was to find the minimum amount of hacks needed to make it work, and the result was a 2.5 KB minzipped output VS around 30 KB minzipped for the library used in the core of both Polymer and X-Tag.
To be honest, those 17+ KB via CDN aren't a big deal, but the incompatibility with not so old platforms is ... so I'd call it win/win in size and portability.

Why IE8 now

Once I had developed and tested Custom Elements for all these platforms, the shadow of IE8 kept bugging me. I was sure nobody cared, and I strongly believed it was somehow impossible, or that it would have required a huge effort for zero profit.
Truth is, in the same way I had found the minimum amount of hacks needed to make it work in all those platforms (and please bear in mind that even if Android 2 phones are disappearing, they are still selling), I could have simply ignored Mutation Events and Mutation Observers: I had this little monolithic piece of crap that was simply working everywhere, and I could hook IE8 in there with ease.
When you have a test suite to develop against, when you do TDD for real, things are way easier to implement and it took "only one extra day of work" to actually see the green light and also create a working demo that integrates the most common thing ever on the web: a map that points at your business.
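For the record, this is not the actual polyfill code, just a sketch of the kind of layered detection such a task requires in order to notice new nodes across all those platforms, IE8 included:
function onNodesAdded(root, callback) {
  if (typeof MutationObserver !== 'undefined') {
    // the modern path: one observer for the whole subtree
    new MutationObserver(function (records) {
      records.forEach(function (record) {
        callback(record.addedNodes);
      });
    }).observe(root, {childList: true, subtree: true});
  } else if (root.addEventListener) {
    // deprecated Mutation Events, still the simplest option on older WebKit
    root.addEventListener('DOMNodeInserted', function (e) {
      callback([e.target]);
    }, false);
  } else {
    // IE8 land: no observers, no standard events, just periodic checks
    var known = root.getElementsByTagName('*').length;
    setInterval(function () {
      var all = root.getElementsByTagName('*');
      if (all.length !== known) {
        known = all.length;
        callback(all); // a real polyfill would diff, this is just a sketch
      }
    }, 250);
  }
}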

The untold story of that map is that it will never show white space instead of a map, as the <google-map> custom tag would: it works even in browsers that have no JavaScript support at all! ... however, this is another story, one that I also better explained in Helsinki.

I put that "extra day of work" in quotes because I've been hearing the "cool story bro" about how much effort it would take to implement something here or there, and I believe most of the time developers didn't even try. I am an old dev with a lot of IE4+ background, so maybe it was easier for me ... but trust me, I had to find a few new and undocumented IE8-specific hacks in order to make it work and, even if it's not perfect, it just bloody works!
There are inevitably caveats for IE8, but if you are targeting this platform you are already taking extra precautions so nothing new for you.
Finally, whenever my Frontend.fi talk goes live I'll update this post so you can see the presentation that brought this little "miracle" to life.

What kind of sorcery is this?

The exact sequence of polyfills needed to make this work, a sequence absolutely unobtrusive for every browser but IE8, is represented in this file: change the order and it won't work, so stick with it in any major template you have and you'll be good to go!

Thursday, March 12, 2015

taming CSS animations via restyle

I am working on a little gamish project that requires quite a lot of CSS animations ... I ended up updating restyle in order to simplify this task as much as possible: here's what's new!

A not-so-fool CSS animations approach

First of all, we all know how tedious and boring it is to manually write a cross-platform CSS animation.
So here's just one good thing about restyle: it gives us the ability to define all the burden at once:
// restyle object with 2 described animations
var gfx = restyle({
  '@keyframes text-highlight': {
    '0%':   { color: 'inherit' },
    '100%': { color: '#FFF' }
  },
  '.highlight': {
    animation: {
      name: 'text-highlight',
      duration: '500ms',
      iterationCount: '1',
      direction: 'normal'
    }
  },
  '@keyframes grow': {
    '0%':   { transform: 'scale(1)' },
    '100%': { transform: 'scale(2)' }
  },
  '.grow': {
    animation: {
      name: 'grow',
      duration: '1s',
      iterationCount: '2',
      direction: 'alternate'
    }
  }
});

// set the animation class to the element
myElement.classList.add('highlight');

// we can now do something once done 
gfx.animate(
  // the node that is animating
  myElement,
  // the animation **name** (not the class)
  'text-highlight',
  // the callback
  function (e) {
    // drop this class
    e.currentTarget.classList.remove('highlight');
  }
);

Timer based fallback included

restyle is compatible with browsers that do not even support animations or animation events; since each object is aware of the CSS it defined, whenever a browser without animation events is used the method finds out the animation duration automatically and falls back to a timer.
We can test the functionality through another method, .getAnimationDuration(el, animationName)
myElement.classList.add('highlight');
gfx.getAnimationDuration(myElement, 'text-highlight'); // 500


myElement.className = 'grow';
gfx.getAnimationDuration(myElement, 'grow'); // 1000

Caveats

In order to understand which animation is used, given that animations are named via keyframes but actually assigned through classes (where only one animation at a time happens), the fallback parser has to figure out which class contains the animation and with which duration.
This is actually quite complex black magic, but the good part is that it happens behind the scenes. However, the duration method works only for animations already assigned, so it's not a good idea to pre-compute durations, since these depend on the class that will trigger such an animation.
The delay has no fallback (yet) so, if needed and known upfront, we could attach the callback later on.
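If, for instance, the CSS declared a 200ms delay (the value here is made up), a good-enough workaround is to wait that long before wiring the callback:
myElement.classList.add('highlight');
// wait for the known animation-delay before asking restyle to watch the end
setTimeout(function () {
  gfx.animate(myElement, 'text-highlight', function (e) {
    e.currentTarget.classList.remove('highlight');
  });
}, 200);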
Last, but not least, the returned object has a .drop() ability, so it is always possible to revoke a callback previously assigned to trigger at the end of an animation.
myElement.classList.add('highlight');
var after = gfx.animate(myElement, 'text-highlight', function(){
  console.log('will never happen');
});

// before the animation ends
after.drop();
How do we cancel animations? Well, that's provided automatically via CSS: we can simply drop or change the className and we are done.
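Reusing the classes defined above, canceling looks like this (just a tiny sketch):
myElement.classList.add('highlight');
var after = gfx.animate(myElement, 'text-highlight', function () {
  console.log('will never happen');
});

// canceling: dropping the class stops the animation via CSS ...
myElement.classList.remove('highlight');
// ... and dropping the callback makes sure nothing gets notified
after.drop();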
Hope this new helper will be useful, at least it has been playing nice with my little game.
Cheers

Friday, February 27, 2015

It's your duty to develop for the Web

Let me start by saying that all the companies mentioned in this post resolved all the problems at light speed, and either apologized or cared about the bad experience: kudos for that!
There is a very annoying trend these days where any Web related issue ends up with a question like: "have you tried using Chrome?"
The moment the user's browser of choice becomes the reason an online service doesn't work is the moment we should all realize how much we have failed at promoting web standards, and how many so-called web developers out there are doing it wrong.

You have one job!

Web developers are a very privileged category of workers that happily live in the tech bubble.
We earn more than many other hard workers, but that is not the problem: broken services, poorly developed websites, and missing cross-browser and cross-platform tests are the problem!
If you offer a website and you do not explicitly say that only one browser is supported (going back about 20 years in competence, to a time when websites were optimized for IE only), this kind of answer should never be allowed in any help desk on the Internet:
Even worse, if you offer "a front-to-back HTML5 app development environment for cross-platform apps." to develop HTML5, and you take care of a hub about Web development, an answer like this one should be flagged as the last resort you have, once everything else possible has been verified and you are asking for a full bug report.
Apparently, it's way easier to blame the user's browser, somehow also going a little bit against anti-trust rules, and washing your hands of the problem ... but we are all better than this, I am pretty sure everyone would agree here.
( also let me underline that they came back ASAP and I couldn't reproduce the problem anymore. I also managed to buy the Ubuntu phone a few hours later, so that is a good service! )

Even IE moved on, so should you!

Internet Explorer marked the history of the Web. Regardless of having been, most of the time, the only supported browser for its first 10 years and counting, it eventually got it right, embracing standards and contributing to making them, as much as it could.
A break from the past: the birth of Microsoft's new web rendering engine is just the latest effort Microsoft is putting in to be more standards-friendly and competitive than ever.
They had all the developers with them at the beginning, and regardless, they failed at following standards until IE9, released in early 2011!
The sad story here is that apparently nobody learned the lesson, and Apple is still doing this right now.

It's your duty to develop for the Web

I don't want to go too deep into the infinite amount of problems we still have on-and-off-line, but if your business has anything to do with a browser, here's a quick reminder of what that means and, if you claim to be a web developer, what you should do:
  • learn Web Standards and don't let automation overbear your skills. Everyone can use "that IDE", so ... do you want to be good at what you do, or just be an overpaid dummy wannabe, as anyone else could be?
  • do not ever blame the user's browser: fix your service, or define your targets upfront. Do some feature detection, understand if your service can work on that browser before offering it, and eventually inform the user that some functionality, or the entire service, might not work. Yes, your library should be cool enough to trigger something like an unsupportedFeaturesDetected event ASAP, or your page could use Modernizr and behave accordingly (see the sketch right after this list)
  • you are eventually justified for IE8 and lower, if these are not explicitly among your supported targets; for everything else you have free access to every single possible Desktop browser: you can either download them or simulate them. You also have to test through any sort of tool that could help: not writing tests is not an option!
  • if it's the mobile web that you are targeting, and not only one specific platform, throw your spoiled latest smartphone out of the window, because real people out there don't change phone every 6 months and don't spend a fortune each time. They also don't change contracts and the related phone so frequently since, especially in the US, this thing about buying unlocked smartphones is apparently inconceivable and everyone tests on an iPhone 6 ... yeah, 60FPS there ... now, test for the rest of the world too!
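Here is the kind of upfront check I have in mind, just a sketch where the tested features are arbitrary and unsupportedFeaturesDetected is the made-up name mentioned above:
function detectMissingFeatures(required) {
  var missing = [];
  for (var name in required) {
    if (!required[name]()) missing.push(name);
  }
  return missing;
}

var missing = detectMissingFeatures({
  classList: function () { return 'classList' in document.documentElement; },
  Promise:   function () { return typeof Promise !== 'undefined'; },
  FormData:  function () { return typeof FormData !== 'undefined'; }
});

if (missing.length) {
  // react before offering the service: warn the user, load fallbacks, or bail out
  console.warn('unsupportedFeaturesDetected: ' + missing.join(', '));
}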
I cannot stress this enough: I feel like we all failed here. I've failed at spreading good practices, others failed by even hiring people that don't care or don't know how to go cross-platform ... can we please go back to the Web for everyone that we all love?

Thank you to all colleagues and to all online activities, and next time somebody asks you "have you tried Chrome?", feel free to answer: "do you know anything about Web development?", 'cause it's about time to stop blaming customers and start looking at who's really being incompetent here.

Updates

We have come full circle now:

Monday, February 23, 2015

Meet archibold, my daily OS

If you are looking for an Open Source Operating System that is constantly updated, Arch Linux would likely be your choice. Put the latest GNOME on top of it, and you have a pretty sweet, damn good looking OS that will give you everything you need. Would you like to install this stack, but you're not 100% sure how? Meet archibold!

archibold: a zero hassle installer

The "Arch way" is to learn through the amazing documentation that the same community created and keeps updated through its wiki.
I am one of those that learned the hard way every single thing I needed, but I am also a very pragmatic and DRY person.
As an example, there are subtle, problematic, not-always-clear "little gotchas" when you want to install an Operating System, such as:
  • how to correctly partition the HD?
  • what is an EFI boot loader and how can I customize it?
  • how to create a graphical EFI compatible splash screen that actually works?
  • what's the minimum amount of packages I need?
  • how to login automatically?
  • how to configure a full Desktop environment?
  • how to install, remove, or search for new software?
  • and what if the software is not officially supported?
  • will the terminal open new tabs in the current folder?
I strongly believe everyone should be welcomed as much as possible into the Open Source Desktop community, so why not make an installer capable of bringing a delightful and easy-to-use experience to all the people that would like to upgrade to Linux?

... and a zero hassle manager

Once your PC boots into GNOME, you can always open the terminal and type archibold:
$ archibold

 __________________
|                  |
| archibold v0.3.0 |
|__________________|

 usage:

  archibold [clean|update|upgrade]
  archibold [install|remove|search] package-name


 list of included AUR packages:

  acroread                # Adobe Acrobat Reader
  broadcom-wl-dkms        # Broadcom wifi
  dropbox                 # Dropbox client
  google-chrome-dev       # Chrome dev channel
  spotify                 # music baby!
  sublime-text-nightly    # Sublime 3

These are the most common tasks I could think of, but of course all the usual functionality and software provided by default will be available too.
You have the freedom to learn with the ability to start easy, reducing the learning curve to the minimum.
There are also a few problem-solving hints, and everything else needed can be found in the Arch Linux forum or wiki: just please search before opening a thread; I can assure you that 99% of the time the problem is well known and documented.

About Compatibility

Right now everything based on Intel works out of the box, but most laptops come with Broadcom WiFi and Bluetooth that might require extra hassle to be installed. This is where archibold install broadcom-wl-dkms comes in handy: you don't need to know everything about Dynamic Kernel Module Support the first time you boot your new Desktop environment; you have a ready-to-go solution and the ability, once your WiFi works, to read up on all the related must-know things about it.

Enjoy and ...

I hope you'll be bold enough to try it out, at least on a spare laptop, an Intel NUC, a Mac Mini from 2010, a MinnowBoard Max, a Lenovo Yoga 3 Pro, a Samsung Series 9, or any other tested device I could try, and please bear in mind that archibold is suitable for every kind of user, not only nerds/gurus/developers: even non-techy people can enjoy GNOME on Arch Linux, it is that beautiful, that powerful, always updated, and finally easy to install and use for very basic tasks.
Let me know how that goes, but please read all the installation info on the archibold site.