The Environment, the Economics of Raw Materials, and the Collapse of Civilization

The theft of perfectly functional manufactured goods for scrap value has become a serious issue over the past decade. Stories of small to medium scale theft, primarily of copper, have gone from rare to ubiquitous. The United States Federal Bureau of Investigation has declared copper theft a critical threat to infrastructure. The problem has grown because the recovered value of many easily recycled raw materials exceeds the risk of getting caught.

This can be generalized: if raw materials aren’t cheap relative to wages, civilization collapses by dismantling itself. This is a grave matter, and I find the implications profound. (more…)

Web 2.0 and the One Page Web Site

I’m busy working on my first major web site using “Joomla!” 1.5. One of the things I did for the site was to install a simple component that provides an index of articles in a side panel. Simple enough: you click on an index link and it fetches the page with that article.

The problem is that it’s quite a page. There’s a nice graphics-rich template that wraps the site, and then there’s the JavaScript: a full copy of mootools comes with every page. Sure, the browser has most of this cached, and Joomla has the actual page content cached, but the browser still has to do a lot of work to reload the page and recompile mootools, just to change a relatively small portion of it. The user gets to watch as everything is (re)displayed, most of it exactly the same as it was before the request.

The obvious solution would be to have the index link send an AJAX request for the relevant content and then simply repaint the area that needs to change. But that only works in this specific case, and it deprives the CMS of the ability to update other parts of the page in response to the request or to other events at the server.

I’m big on generalizing ideas like this, so why not make the entire site a single page? The first time the browser requests something from the site, send the structure of the template, and most of the Javascript needed to run the site down to the browser. Then make every internal site link send an AJAX request. The server can respond with a list of the areas of the template that need to be updated, along with either the HTML or the data needed to perform the update.

Now my index request could respond with a new menu, any updated news items, the content I requested, or even a completely new page layout. The client-side application then applies these updates, possibly issuing secondary requests. Only the data that’s changed comes back from the server, and most of the Javascript loads just once with the first request, so the page updates much faster than it did before. Best of all, the user doesn’t have to watch the template being regenerated, which is visually disturbing no matter how quick it is.
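As a rough sketch of what the client side of this might look like (everything here is hypothetical and illustrative — the update protocol, the `region`/`html` fields, and the `applyUpdates` function are my own inventions, not anything Joomla! provides):

```javascript
// Minimal sketch of the client-side update loop. The server is assumed
// to answer each AJAX request with a JSON array of update instructions,
// e.g. [{ region: "content", html: "<p>…</p>" }, { region: "menu", html: "…" }].
//
// Apply a list of updates to a page state (region name -> HTML string).
// Kept as a pure function so it is easy to reason about outside a
// browser; in a real page each region name would map to a DOM element.
function applyUpdates(page, updates) {
  const next = { ...page };
  for (const update of updates) {
    next[update.region] = update.html; // repaint only this region
  }
  return next;
}

// In the browser, internal links would be intercepted roughly like so
// (sketch only — assumes links carry a "data-internal" attribute and
// the server returns JSON when asked for it):
//
// document.addEventListener('click', async (event) => {
//   const link = event.target.closest('a[data-internal]');
//   if (!link) return;
//   event.preventDefault();
//   const response = await fetch(link.href, {
//     headers: { Accept: 'application/json' },
//   });
//   const updates = await response.json();
//   for (const { region, html } of updates) {
//     document.getElementById(region).innerHTML = html;
//   }
// });
```

The point of the shape is that only regions named in the response are touched; the template, the menu, and everything else keep whatever state they already have unless the server explicitly sends a replacement.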

Now the page is an application in itself, and the browser is playing the role of the operating system. The user gets a platform-independent, end-device-sensitive interface that can be rich, intuitive, and more interactive.

It’s an idea worth implementing. Not for this particular site, but it would be nice to build this capability right into Joomla!

Digital Rights Management (DRM) is a Waste of Time

I read a blog post today by Simon Phipps (DRM and the Death of a Culture) which was a well reasoned complaint about the constraints that DRM can place on use of content. Yet no matter how well reasoned, or from which position they’re argued, these arguments about DRM don’t matter. They don’t matter because DRM will never work on static content. This is so basic, so obvious, that I’m not sure why anyone ever thought it would. In fact, let’s make it more general: all copy protection technologies, past, present, and future, do not and will not prevent copying of non-interactive media. They’re a colossal waste of time, effort, and money that only serves to inconvenience legitimate users (and, as Phipps points out, kill culture).
(more…)

Splice Babies

DNA testing has given sperm banks an interesting challenge. The concept of an “anonymous donor” has gone out the window. Now a simple, affordable DNA test can verify parentage. Perhaps of more concern is that as more people contribute DNA to public databases, it’s becoming easier to identify previously unknown siblings, which leaves just a short step to the father.

With genetic manipulation becoming such an easy thing to do, how long will it be before sperm banks start offering “synthetic” fathers? A few genes from this donor, a few from that, and a few more from over here to finish the job. A baby born from the resulting DNA could theoretically have any number of fathers, none of them traceable to an individual.

Of course it might be a tricky business if there are more interrelationships between genes than previously expected. Then again, given sufficient care, the outcomes of various combinations could be tracked, selected for desirable traits, and in no time the banks would be out in the market with competing “superbaby sperm”.

Now there’s an ethical mess.