Web 2.0 and the One Page Web Site

I’m busy working on my first major web site using “Joomla!” 1.5. One of the things I did for the site was to install a simple component that provides an index of articles in a side panel. Simple enough: you click on an index link and it fetches the page containing that article.

The problem is that it’s quite a page. There’s a nice graphics-rich template that wraps the site, then there’s the Javascript: a full copy of mootools comes with every page. Sure, the browser has most of this stuff cached, and Joomla has the actual page content cached, but the browser still has to do a lot of work to reload the page and recompile mootools, just to change a relatively small portion of the page. The user gets to watch as everything is (re)displayed, most of it exactly the same as it was before the request.

The obvious solution would be to have the index link send an AJAX request for the relevant content and then to simply repaint the area that needs to change, but that only works in this specific case and it deprives the CMS of the ability to update other parts of the page in response to the request or to other events at the server.

I’m big on generalizing ideas like this, so why not make the entire site a single page? The first time the browser requests something from the site, send the structure of the template, and most of the Javascript needed to run the site down to the browser. Then make every internal site link send an AJAX request. The server can respond with a list of the areas of the template that need to be updated, along with either the HTML or the data needed to perform the update.

Now my index request could respond with a new menu, any updated news items, the content I requested, or even a completely new page layout. The client-side application then applies these updates, possibly issuing secondary requests. Only the data that’s changed comes back from the server, and most of the Javascript loads just once with the first request, so the page updates much faster than it did before. Best of all, the user doesn’t have to watch the template being regenerated, which is visually disturbing no matter how quick it is.
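The client-side update step could be sketched roughly like this. The response format here (an array of region/html pairs) and the `applyUpdates` helper are my own invention for illustration, not anything Joomla or mootools actually defines; a real implementation would work against the DOM rather than a plain object.

```javascript
// Sketch: apply a server response that lists only the template
// regions that changed. Here "page" is a plain object mapping
// region ids to their current HTML, standing in for the DOM
// (in a browser this would be document.getElementById(...).innerHTML).
function applyUpdates(page, updates) {
  for (const update of updates) {
    // Only touch the regions the server says have changed;
    // everything else keeps its existing content.
    page[update.region] = update.html;
  }
  return page;
}

// Example: clicking an index link returns the requested article
// and a refreshed menu, while the header region is left alone.
const page = {
  header: "<h1>My Site</h1>",
  menu: "<ul>old menu</ul>",
  content: "<p>Old article</p>",
};

applyUpdates(page, [
  { region: "content", html: "<p>Requested article</p>" },
  { region: "menu", html: "<ul>updated menu</ul>" },
]);
```

The key design point is that the server, not the client, decides which regions the response touches, so the CMS keeps its ability to update any part of the page.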

Now the page is an application in itself, and the browser is playing the role of the operating system. The user gets a platform independent, end-device sensitive interface that can be rich, intuitive, and more interactive.

It’s an idea worth implementing. Not for this particular site, but it would be nice to build this capability right into Joomla!

Microsoft Security Fix Clobbers Two Million Password Stealers

Normally I’m no fan of the “blog by repeating news” style, but in this case I have to make an exception. The headline above is from a Computerworld Security article dated June 20, 2008. Discussing a recent upgrade to Microsoft’s Malicious Software Removal Tool, this excerpt caught my attention:

One password stealer, called Taterf, was detected on 700,000 computers in the first day after the update. That’s twice as many infections as were spotted during the entire month after Microsoft began detecting the notorious Storm Worm malware last September.

“These are ridiculous numbers of infections my friends, absolutely mind-boggling,” wrote Matt McCormack, a spokesman with Microsoft’s Malware Response Center, in a Friday blog posting.

This may be mind-boggling to someone who lives deep in Microsoft culture, but to everyone else, it’s barely a surprise. The missing part of McCormack’s quote should have been “The Linux/Unix guys are right, Windows security still sucks at a deep structural level.” Good thing regular doses of Microsoft Kool-Aid prevent that.

Firefox 3 is Ready, Bug 183689 Intact. Duh.

The good news is that Firefox, one of the best web browsers available, is set to release version 3 in a few days.

The weird news is that it’s shipping with Bug 183689 fully intact. Under mysterious (or at least elusive) circumstances, Firefox fails to close a file that the user uploads to a web site. The file is locked and unusable until Firefox is restarted.

This has the ring of familiarity. Any user of Firefox 2.x who has lots of extensions installed will have noticed that it tends to get more and more sluggish over time. A quick look at the process will reveal that memory consumption continuously rises until the best thing to do is restart it.

That’s not really a problem that the Firefox developers had a lot of control over. If an extension is leaking memory, there’s not much the core can do to stop it. In fact this is one of the major improvements in Firefox 3. A new, sophisticated memory manager now finds a lot of these unreferenced data structures and cleans them up. On my system, the memory footprint for Firefox 3 is nearly 200 MiB smaller at startup, and if it grows, it doesn’t grow very fast. That alone is reason to upgrade on June 17, 2008 – Firefox Download Day.

The point of this digression is that I’ve become used to Firefox losing track of resources. But losing a file handle? Really? Can that be too hard to find? Apparently the answer is “yes”, since Bug 183689 has been open since December 2002! There are some good reasons why the browser needs to keep track of the file; for example, if you refresh the resulting page, the file is part of the POST data that needs to be re-sent.

But eliminating memory leaks is hard, and it’s easy to just rely on increasingly sophisticated garbage collection tools instead of finding the cause. Unfortunately, a garbage collector has no way of knowing that something it’s cleaning up represents an open file, so the memory leak is fixed, but the file handle leak remains.

Five-plus years is far too long for a major bug to remain open, even for an open source project. But don’t go updating that bug! Despite the fact that nobody has shown the issue to be independent of a user’s specific circumstances, any additional information will be treated as spam by Jonas Sicking, the developer assigned to the bug. Considering that the bug seems difficult to reproduce, the contradiction is obvious. One would think that more examples might lead to the discovery of a pattern, but as far as Mr. Sicking is concerned, that seems not to be the case.

Hopefully he was just feeling a little stressed with a major release coming up so quickly, and his comment will be clarified or withdrawn. If not, I’m guessing this one is going to remain open for quite some time to come.

[Note: it has since been determined that an extension, LiveHTTPHeaders, is the culprit for this bug. My “duh” is withdrawn. My disdain for Mr. Sicking’s response remains unchanged, however.]

RIP, SUV: Gas Prices Are “Getting There”

This weekend the Toronto Star announced the death of the SUV. One of the reasons this came up has to be the closing of the General Motors truck assembly line in Oshawa. It seems that as the price of gas gets above about $1.25 per litre (or $4/gallon in the U.S.), the number of people who “need” an unsafe gas guzzling SUV drops off pretty quickly. Now these same people “need” to unload their luxury land barges. There’s nothing like a flexible definition of needs.

This is a good start. There are going to be a lot fewer road trips in the family road boat this year. Some people will argue that this is a bad thing, that families should be able to get out there with their kids to see all that this vast country has to offer. These people haven’t actually seen a family in one of these vehicles. The parents are happily enjoying their time “together” while each kid is in their own isolated space with individual DVD players and noise-reducing headphones. They’d see as much of the countryside in their basements. Besides, a lot of travel options remain open. Our geography is every bit as dramatic from a train. Better yet, on a train it’s a lot easier to get your kids to come out of their multimedia shells and look at something without risking a major accident.