Tuesday, September 18, 2007

noscript problems? Just fixed ;-)

While I was writing the previous post I thought of a really simple solution, shown in this post using PHP but compatible with every server-side programming language.

self brainstorming

The page head doesn't accept a noscript tag, but it accepts one or more script tags without problems.

If I'm not wrong, page download and parsing are synchronous, and that means that if I write a piece of code inside a script, the next one will find this code ready or already executed.

Since downloading is synchronous, I can use this behaviour to save this information in a session and to show a different layout using the next tag.

<script type="text/javascript" src="cssFilter.php"><!--// CSS Filter //--></script>
<link rel="stylesheet" media="all" href="myPage.php" />

The order is absolutely important for this solution, because the first file needs to do something like this:

session_start();
$_SESSION['JavaScript'] = true;
header('Content-Type: text/javascript');

while the second file, the page CSS, just needs to serve the correct CSS, based on the session JavaScript variable that will be set only on JS-enabled browsers:

session_start();
$output = file_get_contents(
    isset($_SESSION['JavaScript']) && $_SESSION['JavaScript'] === true ?
        'advanced.css' : // file names are illustrative
        'basic.css'
);
header('Content-Type: text/css');
header('Content-Length: '.strlen($output));
$_SESSION['JavaScript'] = false;
echo $output;

At this point we should put an ob_start('ob_gzhandler') at the top of the CSS file, optimizing download speed for gzip-compatible browsers.

In the end, with a bit of imagination, we could directly use the first script tag to dynamically append one or more dedicated stylesheets, returning just an empty string on the page link request if the JavaScript session var is set to true.

I hope this is a solution that satisfies the W3C rules for a problem that "didn't exist" when (X)HTML was drafted for the first time ;-)

demo page


Federico said...

Hi Andrea, cool stuff. I have a question: can't you just set a cookie on the client instead of creating a new session, which is overkill for the server?

Andrea Giammarchi said...

Federico, a session is a cookie ... but if the browser has cookies disabled, the session id (SID) is automatically injected as a query string into each request.

In this case the link href, when cookies are disabled, automatically becomes myPage.php?phpsessid=AF5233AF...

There's no way to change the link href dynamically ... that's why I used a server session.

Finally, I suppose a little session file with just one boolean value (4 chars ... b:0; or b:1;) is not a problem for a server, and a lot of sites always use sessions anyway. So I hope the W3C will remove the noscript parser error when it's present inside the head tag, but until then I can't find any different way to respect both standards and users!

Andrea Giammarchi said...

any different way to respect both standards and users
obviously, I mean using less bandwidth for everyone and making this "trick" accessible for cookies-disabled browsers too.

Mike (i.Skitz) said...

Maybe I'm missing something here, but why not simply do this:

1. <script ... src="manipulator.js"></script>

2. <link ... href="basicstyle.css"/>

The manipulator.js could manipulate the page's DOM by replacing the existing link tag with the special/fancy web 2.0 one.

If the <script> tag is processed before the <link>, then replacing is guaranteed; if the <script> isn't processed before the <link> tag, then the script can still insert a style sheet with settings that override those in basicstyle.css.

I see no need for server-side involvement or cookies here...

Andrea Giammarchi said...

"The manipulator.js could manipulate the page's DOM by replacing the existing link tag with the special/fancy web 2.0 one."

Mike, you probably didn't think about synchronous download.

If a script tag is before a link, it can't replace the next DOM element (the link) because it will not be available yet, while if a script tag is after the link, the link's content has already been downloaded ... two files for JS-compatible browsers if you replace, add or modify the used CSS dynamically.

My goal is to make the page as fast as possible to download, truly separating JS-enabled browsers from the others and letting them download only the necessary files, without replacements or other kinds of fake solutions.

arantius said...

I think this is what GMail does, I prefer it to your solution above.

Use a meta tag to refresh in (e.g.) 1 second, but a script to location.replace() before that. If the script runs, the user ends up at the full script-ified version of the site. If it doesn't, the meta will send the user to a page explaining the need for JavaScript.
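The pattern arantius describes could be sketched like this (the file names are made up for illustration; this is only a guess at the markup, not GMail's actual code):

```html
<head>
  <!-- without JS, the meta refresh fires after 1 second and sends the
       user to a "you need JavaScript" page -->
  <meta http-equiv="refresh" content="1;url=noscript.html" />
  <script type="text/javascript">
    // with JS, this runs first and cancels the pending refresh
    // by navigating to the scripted version of the site
    location.replace('app.html');
  </script>
</head>
```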

Andrea Giammarchi said...

arantius, what you say is absolutely against the best unobtrusive, accessible Web practices.

Gmail is a service, not a web site; this means Gmail doesn't require that search engines read its page content.

In your way, each visited page will use a redirect and all content will be unavailable to search engines.

It's quite clear you didn't understand why I propose this solution and how many problems it could solve for both bandwidth usage and surfers.


Mike (i.Skitz) said...

"If a script tag is before a link, it can't replace the next DOM element (the link) because it will not be available yet, while if a script tag is after the link, the link's content has already been downloaded ... two files for JS-compatible browsers if you replace, add or modify the used CSS dynamically."

Right you are, a glaring oversight on my part, good catch ;-)

I still feel a lightweight client-side solution is possible and have tested just such a solution. If I may:

1. Place the following code in a script tag in the head section of your HTML page just before your link tag that loads your legacy CSS:

document.write('<link ... href="advanced.css" \/>');
document.write("<"+"!-- ");

2. After your legacy link, place an empty comment as follows:

<link ... href="basic.css" /><!-- -->

Voila! A cross-browser, client-side solution. In case you don't immediately see why it works: the open comment written by the JavaScript comments out the legacy CSS link that follows.

This works with modern browsers, IE 4 and NN 4; pretty cross-browser, no?
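Putting Mike's two steps together (the file names are illustrative), the head would look roughly like this:

```html
<head>
  <script type="text/javascript">
    // with JS: load the advanced stylesheet, then open a comment
    // that swallows the legacy link below
    document.write('<link rel="stylesheet" type="text/css" href="advanced.css" \/>');
    document.write("<" + "!-- ");
  </script>
  <!-- without JS, the link below loads normally and the trailing
       markup is just an empty comment; with JS, the written "<!--"
       comments the link out, closed by the static " -->" -->
  <link rel="stylesheet" type="text/css" href="basic.css" /><!-- -->
</head>
```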

Okay, so there *is* one catch: if your doctype is XHTML 1.0 Strict, the comment doesn't get written as a comment but as the literal text <!--. Not sure why, but it does; maybe you can say why?

If you can figure that out, my proposal would be a perfect client-side solution don't you think?

Andrea Giammarchi said...

damn cool Mike!
I didn't test your idea but it seems really good.

Just a note: XHTML 2.0, and probably HTML 5 too, doesn't accept the usage of document.write inside the head to write HTML content.

However, your solution should be run outside the page, and with the DOM it should be easily replaced (I mean the link tag).

At this point the only question is whether document.write("<"+"!--//"); after the link inclusion, using the DOM, will be accepted by XHTML 2.0 and future (X)HTML implementations, but for now I can only say thank you!

Andrea Giammarchi said...

damn Mike ... your solution works perfectly with FF and IE but doesn't work with Opera or Safari:

I did a lot of different tests: compatibility VS W3C validation ... no luck :-(

However, the above link uses CDATA instead of <!-- and it's compatible with every kind of DTD ... but, as I said, it doesn't work correctly with two major browsers ... what a problem!

Mike (i.Skitz) said...

I like your CDATA modification. The XHTML 1.0 Strict thing bothered me so I just had to find a way around it :-)

Check these out:
XHTML 1.0 Solution
HTML 4.01 Solution
Quirks Mode Solution (no validation)

Cross-browser and W3C valid!

Collaboration, wonderful isn't it? ;-)

Andrea Giammarchi said...

Hi Mike ... cool again!

Your link hide=" trick is the problem killer.

I've just uploaded 3 pages that use the same unobtrusive way to allow only browsers compatible with the DOM createElement and getElementsByTagName methods.

xhtml 1.0 Strict
html 4.01 Strict

Can we now say we've found (you before me) the real solution for this stupid problem? :D

P.S. Yes, collaboration is wonderful ;-)

Mike (i.Skitz) said...

Cool Andrea! Glad we were able to work out a solution together; it was fun :-)

One note, your final solution does well to use DOM manipulation methods, but relying only on those breaks the IE 4 and NN 4 cross-browser support that my document.write solution gave.

Although it would be more code, maybe the final solution can use DOM feature detection to determine at runtime which technique to use, e.g.:

if (typeof document.createElement != "undefined") {
    // use DOM manipulation technique
} else {
    // use document.write technique
}

Just a thought. I figure since we now know how to support this technique for modern and legacy browsers why not build it into the final solution.
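The runtime branch Mike suggests could be sketched as a helper that receives the document, so both paths can be exercised; the function name and the stylesheet URL are made up for illustration:

```javascript
// Pick the stylesheet-injection technique at runtime based on DOM support.
// injectStylesheet and "advanced.css" are hypothetical names.
function injectStylesheet(doc, href) {
  if (typeof doc.createElement != "undefined") {
    // DOM-capable browser: build and append the link node directly
    var link = doc.createElement("link");
    link.rel = "stylesheet";
    link.type = "text/css";
    link.href = href;
    doc.getElementsByTagName("head")[0].appendChild(link);
    return "dom";
  } else {
    // legacy browser (IE 4 / NN 4): fall back to document.write
    doc.write('<link rel="stylesheet" type="text/css" href="' + href + '" />');
    return "write";
  }
}
```

In a real page you would call it as `injectStylesheet(document, "advanced.css")` from the external script in the head.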

Have a great weekend and stay imaginative :-)

P.S. Thanks for the credit!

Andrea Giammarchi said...

I agree about legacy support, Mike; however, I don't know of Web 2.0 libraries compatible with IE4 or NN4, so I suppose these old (far too old) browsers should be filtered and managed as incompatible browsers.

At the same time, the DOM usage here is quite hilarious because the last piece of code breaks every kind of standard by modifying the page flow (document.write('<link hide="')).

The perfect solution, in my opinion, should use only the DOM to be XHTML 2.0 Strict compatible, but I suppose that the new standards will not be available soon, and if we choose to use an external JS file the page will be validated automatically in every case.

Finally, if XHTML 2.0 (and probably HTML 5 too) doesn't support document.write inside the head tag, I think there's no reason to use the DOM now, while when XHTML 2.0 becomes available we could use a new handler to respect standards.

This is my definitive opinion:
today we could just use this technique inside an external JS file:

document.write('<link ... /><link hide="');

Tomorrow we'll use another one that better respects W3C standards, using handlers.
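As a guess at how the hide=" trick fits together (the demo markup isn't reproduced here, so this is only my reading of the mechanism, with an illustrative script name):

```html
<head>
  <!-- cssSwitch.js (hypothetical name) would document.write an advanced
       stylesheet link followed by an unterminated <link hide= attribute;
       the open quote swallows the start of the legacy link below, so
       JS-enabled browsers never apply basic.css, while browsers without
       JS parse the legacy link normally -->
  <script type="text/javascript" src="cssSwitch.js"></script>
  <link rel="stylesheet" type="text/css" href="basic.css" />
</head>
```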

Have a great weekend You too, best regards.

mr_coffee said...

Any ideas on how to implement this if the page you're putting it on is already PHP? I feel it should be easier... but I'm not able to make it work at all.