{"rowid": 165, "title": "Transparent PNGs in Internet Explorer 6", "contents": "Newer breeds of browser such as Firefox and Safari have offered support for PNG images with full alpha channel transparency for a few years. With the use of hacks, support has been available in Internet Explorer 5.5 and 6, but the hacks are non-ideal and have been tricky to use. With IE7 winning masses of users from earlier versions over the last year, full PNG alpha-channel transparency is becoming more of a reality for day-to-day use.\n\nHowever, there are still numbers of IE6 users out there who we can\u2019t leave out in the cold this Christmas, so in this article I\u2019m going to look what we can do to support IE6 users whilst taking full advantage of transparency for the majority of a site\u2019s visitors.\n\nSo what\u2019s alpha channel transparency?\n\nCast your minds back to the Ghost of Christmas Past, the humble GIF. Images in GIF format offer transparency, but that transparency is either on or off for any given pixel. Each pixel\u2019s either fully transparent, or a solid colour. In GIF, transparency is effectively just a special colour you can chose for a pixel.\n\nThe PNG format tackles the problem rather differently. As well as having any colour you chose, each pixel also carries a separate channel of information detailing how transparent it is. This alpha channel enables a pixel to be fully transparent, fully opaque, or critically, any step in between.\n\nThis enables designers to produce images that can have, for example, soft edges without any of the \u2018halo effect\u2019 traditionally associated with GIF transparency. If you\u2019ve ever worked on a site that has different colour schemes and therefore requires multiple versions of each graphic against a different colour, you\u2019ll immediately see the benefit. \n\nWhat\u2019s perhaps more interesting than that, however, is the extra creative freedom this gives designers in creating beautiful sites that can remain web-like in their ability to adjust, scale and reflow.\n\nThe Internet Explorer problem\n\nUp until IE7, there has been no fully native support for PNG alpha channel transparency in Internet Explorer. However, since IE5.5 there has been some support in the form of proprietary filter called the AlphaImageLoader. Internet Explorer filters can be applied directly in your CSS (for both inline and background images), or by setting the same CSS property with JavaScript. \n\nCSS:\n\nimg {\n\tfilter: progid:DXImageTransform.Microsoft.AlphaImageLoader(...);\n}\n\nJavaScript:\n\nimg.style.filter = \"progid:DXImageTransform.Microsoft.AlphaImageLoader(...)\";\n\nThat may sound like a problem solved, but all is not as it may appear. Firstly, as you may realise, there\u2019s no CSS property called filter in the W3C CSS spec. It\u2019s a proprietary extension added by Microsoft that could potentially cause other browsers to reject your entire CSS rule. \n\nSecondly, AlphaImageLoader does not magically add full PNG transparency support so that a PNG in the page will just start working. Instead, when applied to an element in the page, it draws a new rendering surface in the same space that element occupies and loads a PNG into it. If that sounds weird, it\u2019s because that\u2019s precisely what it is. 
The pitfalls\n\nSo, whilst support for PNG transparency in IE5.5 and 6 is possible, it’s not without its problems.\n\nBackground images cannot be positioned or repeated\n\nThe AlphaImageLoader does work for background images, but only for the simplest of cases. If your design requires the image to be tiled (background-repeat) or positioned (background-position) you’re out of luck. The AlphaImageLoader allows you to set a sizingMethod to either crop the image (if necessary) or to scale it to fit. Not massively useful, but something at least.\n\nDelayed loading and resource use\n\nThe AlphaImageLoader can be quite slow to load, and appears to consume more resources than a standard image when applied. Typically, you’d need to add thousands of GIFs or JPEGs to a page before you saw any noticeable impact on the browser, but with the AlphaImageLoader filter applied Internet Explorer can become sluggish after just a handful of alpha channel PNGs.\n\nThe other noticeable effect is that as more instances of the AlphaImageLoader are applied, the longer it takes to render the PNGs with their transparency. The user sees the PNG load in its original non-supported state (with black or grey areas where transparency should be) before, one by one, the filter kicks in and makes them properly transparent.\n\nBoth the sluggishness and the delayed loading only really manifest themselves as the number and size of images increase. Use just a couple of instances and it’s fine, but be careful adding more than five or six. As ever, test, test, test.\n\nLinks become unclickable, forms unfocusable\n\nThis is a big one. There’s a bug/weirdness with AlphaImageLoader that sometimes prevents interaction with links and forms when a PNG background image is used. This is sometimes reported as a z-index issue, but I don’t believe it is. Rather, it’s an artefact of that weird way the filter gets applied to the document almost outside of the normal render process.\n\nOften this can be solved by giving the links or form elements hasLayout using position: relative; where possible. However, this doesn’t always work and the non-interaction problem cannot always be solved. You may find yourself having to go back to the drawing board.\n\nSidestepping the danger zones\n\nFrankly, it’s pretty bad news if you design a site, have that design signed off by your client, build it and then find out only at the end (because you don’t know what might trigger a problem) that your search field can’t be focused in IE6. That’s an absolute nightmare, and whilst it’s not likely to happen, it’s possible that it might. It’s happened to me. So what can you do?\n\nThe best approach I’ve found to this scenario is:\n\n\n\t1. Isolate the PNG or PNGs that are causing the problem. Step through the PNGs in your page, commenting them out one by one and retesting. Typically it’ll be the nearest PNG to the problem, so try there first. Keep going until you can click your links or focus your form fields.\n\t2. This is where you really need luck on your side, because you’re going to have to fake it. This will depend on the design of the site, but some way or other create a replacement GIF or JPEG image that will give you an acceptable result. Then use conditional comments to serve that image only to users of IE older than version 7, as in the sketch below.\n\n
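The conditional comment part of that second step might look something like this (the selector and image path are hypothetical; the replacement GIF simply overrides the PNG background for old IE):\n\n<!--[if lt IE 7]>\n<style type=\"text/css\">\n\t#search { background-image: url(/img/search-bg.gif); }\n</style>\n<![endif]-->\n\n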
A hack, you say? Well, you started it, chum.\n\nApplying AlphaImageLoader\n\nBecause the filter property is invalid CSS, the safest pragmatic approach is to apply it selectively with JavaScript for only Internet Explorer versions 5.5 and 6. This helps ensure that by default you’re serving standard CSS to browsers that support both the CSS and PNG standards correctly, and then selectively patching up only the browsers that need it.\n\nSeveral years ago, Aaron Boodman wrote and released a script called sleight for doing just that. However, sleight dealt only with images in the page, and not background images applied with CSS. Building on top of Aaron’s work, I hacked sleight and came up with bgsleight for applying the filter to background images instead. That was in 2003, and over the years I’ve made a couple of improvements here and there to keep it ticking over and to resolve conflicts between sleight and bgsleight when used together. However, with alpha channel PNGs becoming much more widespread, it’s time for a new version.\n\nIntroducing SuperSleight\n\nSuperSleight adds a number of new and useful features that have come from the day-to-day needs of working with PNGs.\n\n\n\tWorks with both inline and background images, replacing the need for both sleight and bgsleight.\n\tWill automatically apply position: relative to links and form fields if they don’t already have position set. (Can be disabled.)\n\tCan be run on the entire document, or just a selected part where you know the PNGs are. This is better for performance.\n\tDetects background images set to no-repeat and sets the filter’s sizingMethod to crop rather than scale.\n\tCan be re-applied by any other JavaScript in the page – useful if new content has been loaded by an Ajax request.\n\n\nDownload SuperSleight\n\nImplementation\n\nGetting SuperSleight running on a page is quite straightforward: you just need to link the supplied JavaScript file (or the minified version if you prefer) into your document inside conditional comments so that it is delivered only to Internet Explorer 6 or older, along these lines (adjust the filename and path to match your copy):\n\n<!--[if lte IE 6]>\n<script type=\"text/javascript\" src=\"supersleight.js\"></script>\n<![endif]-->\n\nSupplied with the JavaScript is a simple transparent GIF file. The script replaces the existing PNG with this before re-layering the PNG over the top using AlphaImageLoader. You can change the name or path of the image in the top of the JavaScript file, where you’ll also find the option to turn off the adding of position: relative to links and fields if you don’t want that.\n\nThe script is kicked off with a call to supersleight.init() at the bottom. The scope of the script can be limited to just one part of the page by passing the ID of an element to supersleight.limitTo(). And that’s all there is to it.\n\nUpdate March 2008: a version of this script as a jQuery plugin is also now available.", "year": "2007", "author": "Drew McLellan", "author_slug": "drewmclellan", "published": "2007-12-01T00:00:00+00:00", "url": "https://24ways.org/2007/supersleight-transparent-png-in-ie6/", "topic": "code"} {"rowid": 166, "title": "Performance On A Shoe String", "contents": "Back in the summer, I happened to notice the official Wimbledon All England Tennis Club site had jumped to the top of Alexa’s Movers & Shakers list — a list that tracks sites that have had the biggest upturn or downturn in traffic. The lawn tennis championships were underway, and so traffic had leapt from almost nothing to crazy-busy in no time at all.
\n\nMany sites have similar peaks in traffic, especially when they’re based around scheduled events. No one cares about the site for most of the year, and then all of a sudden – wham! – things start getting warm in the data centre. Whilst the thought of chestnuts roasting on an open server has a certain appeal, it’s less attractive if you care about your site being available to visitors. Take a look at this Alexa traffic graph showing traffic patterns for superbowl.com at the beginning of each year, and wimbledon.org in the month of July.\n\nTraffic graph from Alexa.com\n\nWhilst not on the same scale or with such dramatic peaks, we have a similar pattern of traffic here at 24ways.org. Over the last three years we’ve seen a dramatic pick-up in traffic over the month of December (as would be expected) and then a much lower, though steady, load throughout the year. What we do have, however, is the luxury of knowing when the peaks will be. For a normal site, be that a blog, small scale web app, or even a small corporate site, you often just cannot predict when you might get slashdotted, end up on the front page of Digg or linked to from a similarly high-profile site. You just don’t know when the peaks will be.\n\nIf you’re a big commercial enterprise like the Super Bowl, scaling up for that traffic is simply a cost of doing business. But for most of us, we can’t afford to have massive capacity sat there unused for 90% of the year. What you have to do instead is work out how to deal with as much traffic as possible with the modest resources you have.\n\nIn this article I’m going to talk about some of the things we’ve learned about keeping 24 ways running throughout December, whilst not spending a fortune on hosting we don’t need for 11 months of each year. We’ve not always got it right, but we’ve learned a lot along the way.\n\nThe Problem\n\nTo know how to deal with high traffic, you need to have a basic idea of what happens when a request comes into a web server. 24 ways is hosted on a single small virtual dedicated server with a great little hosting company in the UK. We run Apache with PHP and MySQL all on that one server. When a request comes in, a new Apache process is started to deal with the request (or an existing process is assigned to it if there’s one available not doing anything). Each process takes a bunch of memory, so there’s a finite number of processes that you can run, and therefore a finite number of pages you can serve at once before your server runs out of memory.\n\nWith our budget based on whatever is left over after beer, we need to get the best performance we can out of the resources available. As the goal is to serve as many pages as quickly as possible, there are several approaches we can take:\n\n\n\tReducing the amount of memory needed by each Apache process\n\tReducing the amount of time each process is needed\n\tReducing the number of requests made to the server\n\n\nYahoo! have published some information on what they call Exceptional Performance, which is well worth reading, and complements many of my examples here. The Yahoo! guidelines very much look at things from a user perspective, which is always important.\n\n
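To put some rough, purely illustrative numbers on the problem (these aren’t measurements from our server): if the operating system, MySQL and everything else leave you with 256MB of free memory, and each Apache process with PHP loaded needs around 20MB, you can run only about a dozen processes – and so serve only about a dozen requests simultaneously – before the machine starts swapping and everything grinds to a halt. Halve the per-process footprint and you’ve doubled that ceiling without spending a penny.\n\n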
Server tweaking\n\nIf you’re in the position of being able to change your server configuration (our set-up gives us root access to what is effectively a virtual machine), there are some basic steps you can take to maximise the available memory and reduce the memory footprint. Without getting too boring and technical (whole books have been written on this), there are a couple of things to watch out for.\n\nFirstly, check what processes you have running that you might not need. Every megabyte of memory that you free up might equate to several thousand extra requests being served each day, so take a look at top and see what’s using up your resources. Quite often a machine configured as a web server will have some kind of mail server running by default. If your site doesn’t use mail (ours doesn’t) make sure it’s shut down and not using resources.\n\nSecondly, have a look at your Apache configuration and particularly what modules are loaded. The method for doing this varies between versions of Apache, but again, every module loaded increases the amount of memory that each Apache process requires and therefore limits the number of simultaneous requests you can deal with.\n\nThe final thing to check is that Apache isn’t configured to start more servers than you have memory for. This is usually done by setting the MaxClients directive. When that limit is reached, your site is going to stop responding to further requests. However, if all else goes well that threshold won’t be reached, and if it is, it will at least stop the weight of the traffic taking the entire server down to a point where you can’t even log in to sort it out.\n\nThose are the main tidbits I’ve found useful for this site, although it’s worth repeating that entire books have been written on this subject alone.\n\nCaching\n\nAlthough the site is generated with PHP and MySQL, the majority of pages served don’t come from the database. The process of compiling a page on-the-fly involves quite a few trips to the database for content, templates, configuration settings and so on, and so can be slow and require a lot of CPU. Unless a new article or comment is published, the site doesn’t actually change between requests and so it makes sense to generate each page once, save it to a file and then just serve all following requests from that file.\n\nWe use QuickCache (or rather a plugin based on it) for this. The plugin integrates with our publishing system (Textpattern) to make sure the cache is cleared when something on the site changes. A similar plugin called WP-Cache is available for WordPress, but of course this could be done any number of ways, and with any back-end technology.\n\nThe important principle here is to reduce the time it takes to serve a page by compiling the page once and serving that cached result to subsequent visitors. Keep away from your database if you can.\n\n
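A minimal sketch of that idea in PHP looks something like the following. This is just the shape of the technique, not the plugin we actually use; the cache path is hypothetical and build_page() is a stand-in for however your pages normally get generated:\n\n<?php\n// Serve a stored copy of the page if one exists,\n// otherwise generate the page and store it for next time.\n// (A real plugin also clears these files when content changes.)\n$cache_file = '/tmp/cache/' . md5($_SERVER['REQUEST_URI']) . '.html';\n\nif (is_file($cache_file)) {\n\treadfile($cache_file); // no database, no templating, just the file\n\texit;\n}\n\nob_start();                 // capture everything the page outputs\nbuild_page();               // stand-in for your usual page generation\nfile_put_contents($cache_file, ob_get_contents());\nob_end_flush();             // send the freshly built page to the visitor\n?>\n\n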
Outsource your feeds\n\nWe get around 36,000 requests for our feed each day. That really only works out at about 7,000 subscribers averaging five-and-a-bit requests a day, but it’s still 36,000 requests we could easily do without. Each request uses resources and particularly during December, all those requests can add up.\n\nThe simple solution here was to switch our feed over to using FeedBurner. We publish the address of the FeedBurner version of our feed here, so those 36,000 requests a day hit FeedBurner’s servers rather than ours. In addition, we get pretty graphs showing how the subscriber-base is building.\n\nOff-load big files\n\nLarger files like images or downloads pose a problem not in bandwidth, but in the time it takes them to transfer. A typical page request is very quick, a few seconds at the most, resulting in the connection being freed up promptly. Anything that keeps a connection open for a long time is going to start killing performance very quickly.\n\nThis year, we started serving most of the images for articles from a subdomain – media.24ways.org. Rather than pointing to our own server, this subdomain points to an Amazon S3 account where the files are held. It’s easy to pigeon-hole S3 as merely an online backup solution, and whilst not a fully fledged CDN, S3 works very nicely for serving larger media files. The roughly 20GB of files served this month have cost around $5 in Amazon S3 charges. That’s so affordable it may not be worth even taking the files back off S3 once December has passed.\n\nI found this article on Scalable Media Hosting with Amazon S3 to be really useful in getting started. I upload the files via a Firefox plugin (mentioned in the article) and then edit the ACL to allow public access to the files. The way S3 enables you to point DNS directly at it means that you’re not tied to always using the service, and that it can be transparent to your users.\n\n
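That DNS trick works by naming your S3 bucket after the hostname you want to use and adding a CNAME record for it. As a sketch (check Amazon’s documentation for the exact requirements; zone file syntax will vary with your DNS host):\n\n; serve media.24ways.org from an S3 bucket of the same name\nmedia.24ways.org.  IN  CNAME  media.24ways.org.s3.amazonaws.com.\n\n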
If your site uses photographs, consider uploading them to a service like Flickr and serving them directly from there. Many photo sharing sites are happy for you to link to images in this way, but do check the acceptable use policies in case you need to provide a credit or link back.\n\nOff-load small files\n\nYou’ll have noticed the pattern by now – get rid of as much traffic as possible. When an article has a lot of comments and each of those comments has an avatar along with it, a great many requests are needed to fetch all of those images. In 2006 we started using Gravatar for avatars, but their servers were slow and were holding up page loads. To get around this we started caching the images on our server, but along with that came the burden of furnishing all the image requests.\n\nEarlier this year Gravatar changed hands and is now run by the same team behind WordPress.com. Those guys clearly know what they’re doing when it comes to high performance, so this year we went back to serving avatars directly from them.\n\nIf your site uses avatars, it really makes sense to use a service like Gravatar where your users probably already have an account, and where the image requests are going to be dealt with for you.\n\nKnow what you’re paying for\n\nThe server account we use for 24 ways was opened in November 2005. When we first hit the front page of Digg in December of that year, we upgraded the server with a bit more memory, but other than that we were still running on that 2005 spec for two years. Of course, the world of technology has moved on in those years, prices have dropped and specs have improved. For the same amount we were paying for that 2005 spec server, we could have an account with twice the CPU, memory and disk space.\n\nSo in November of this year I took out a new account and transferred the site from the old server to the new. In that single step we were prepared for dealing with twice the amount of traffic, and because of a special offer at the time I didn’t even have to pay the setup cost on the new server. So it really pays to know what you’re paying for and keep an eye out for ways you can make improvements without needing to spend more money.\n\nFurther steps\n\nThere’s nearly always more that can be done. For example, there are some media files (particularly for older articles) that are not on S3. We also serve our CSS directly and it’s not minified or compressed. But by tackling the big problems first we’ve managed to reduce load on the server and at the same time make sure that the load being placed on the server can be dealt with in the most frugal way.\n\nOver the last 24 days we’ve served up articles to more than 350,000 visitors without breaking a sweat. On a busy day, that’s been nearly 20,000 visitors in just 24 hours. While in the grand scheme of things that’s not a huge amount of traffic, it can be a lot if you’re not prepared for it. However, with a little planning for the peaks you can help ensure that when the traffic arrives you’re ready to capitalise on it.\n\nOf course, people only visit 24 ways for the wealth of knowledge and experience that’s tied up in the articles here. Therefore I’d like to take the opportunity to thank all our authors this year who have given their time as a gift to the community, and to wish you all a very happy Christmas.", "year": "2007", "author": "Drew McLellan", "author_slug": "drewmclellan", "published": "2007-12-24T00:00:00+00:00", "url": "https://24ways.org/2007/performance-on-a-shoe-string/", "topic": "ux"}