I was just thinking, while reading some posts on how to improve Apple’s new image skimming technology, about how to make websites load faster. I think that in the future websites should be downloaded as a single compressed file, or a handful of compressed files to take advantage of parallel downloads. That way you could wrap all your images up in one .zip file, all your JavaScript files in another, all your content in another, and so on. Sure, some images and other resources could still be loaded dynamically or separately. But if you can get a speed increase from the hack of embedding a bunch of 160×160 pixel images into a single 160×3200 pixel image in order to eliminate the multiple requests, why couldn’t you do the same thing by archiving all the images into a single archive file (or a handful of them) and having the browser automatically decompress, or just unarchive, the images and display them?
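For reference, that image-embedding hack is usually done with CSS sprites, and a minimal sketch of it might look something like this (the file name and offsets here are just made up for illustration):

<!-- one 160×3200 sprite image holding a stack of 160×160 tiles,     -->
<!-- so the browser makes a single request instead of one per image -->
<style>
  .icon        { width: 160px; height: 160px; background: url("sprites.png") no-repeat; }
  .icon-logo   { background-position: 0 0; }      /* first tile  */
  .icon-search { background-position: 0 -160px; } /* second tile */
</style>
<span class="icon icon-logo"></span>
<span class="icon icon-search"></span>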
There would of course need to be new technology in place for this to happen, but I think it could be a good idea. Maybe the images could be referenced differently, like <img src="image.zip?imgLogo.gif">, or with a new attribute, like <img src="imgLogo.gif" archive="imageMain.zip">, or it could happen seamlessly using headers or some other method, where the images download once as an archive and are then treated like cached images.
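To make that a bit more concrete, a page using these purely hypothetical spellings might look something like the sketch below; none of this is real browser syntax, and the file names are invented for illustration:

<!-- hypothetical: the browser fetches imageMain.zip once, unpacks it, -->
<!-- and resolves each img against the archive's contents              -->
<img src="imgLogo.gif"   archive="imageMain.zip">
<img src="imgHeader.gif" archive="imageMain.zip">

<!-- alternative hypothetical spelling: point straight into the archive -->
<img src="imageMain.zip?imgLogo.gif">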
Just a quick rambling, and as always I’d love to hear other people’s thoughts.