An Adaptive Image Technique: Thinking out loud

I’ve been playing a lot with responsive layouts, and the inevitable bugbear of adaptive images for responsive designs. I’ve been playing with Ethan Marcotte’s Fluid Images the most, particularly via Matt Wilcox’s Adaptive Images script, and dreaming of the <picture> solution proposed by Mat Marquis. But and so, I’ve been doodling and fooling around with some ways of doing this, and have now come up with something that I like. I’m worried it’s actually terrible, but I’ve played with it enough that I’d like some feedback now.

I’ve come up with a CSS3-only solution to adaptive images. For those who just want to see the example, go ahead. You can view source for all the details. The 3-col/2-col content is purely presentational.

To use this technique, I start with an image tag, like so:

<img id="image1" src="/transparent.png" data-src="/fishing.jpg" alt="Man fishing on a log" />

That displays a 1 x 1 px transparent image. The data-src attribute is there to show the “true” source. In the CSS, I make use of the background-image property to provide media-query-attentive backgrounds & sizing.

The default declaration is:

#image1 {
    background: url('/images/responsive/fishing-small.jpg') top center no-repeat;
    width: 100%;
    max-height: 67px;
}

This is the “small device” version of the image. Using media queries, I can also load in an HD/retina version for those devices:

@media only screen and (-webkit-device-pixel-ratio: 2) {
    #image1 {
        /* assuming a retina rendition of the small image, following the naming pattern below */
        background-image: url('/images/responsive/fishing-small-retina.jpg');
    }
}

I can also provide a mid-size version, a mid-size HD/retina version and a desktop version (or any number of variations based on media queries). Here’s the mid-size retina rule, for example:

@media only screen and (-webkit-device-pixel-ratio: 2) and (min-width: 600px) {
    #image1 {
        background-image: url('/images/responsive/fishing-mid-retina.jpg');
    }
}
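
The remaining breakpoints follow the same pattern. As a rough sketch (the breakpoint widths and file names below are placeholders, not necessarily what’s on my example page), the non-retina mid-size and desktop rules might look like:

@media only screen and (min-width: 600px) {
    #image1 {
        /* placeholder file name for a non-retina mid-size rendition */
        background-image: url('/images/responsive/fishing-mid.jpg');
    }
}

@media only screen and (min-width: 1000px) {
    #image1 {
        /* placeholder breakpoint and file name for the desktop rendition */
        background-image: url('/images/responsive/fishing-large.jpg');
    }
}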

To provide some fallback for IE users, I’ve included an IE-specific style:

<!--[if lt IE 9]>
<style>
#image1 {background-image: url('/images/responsive/fishing');width:100%;}
</style>
<![endif]-->

I like to start “mobile first” and use media-queries to “grow out” from there, but I could just as easily start with the largest version and work in – in which case the IE workaround wouldn’t be necessary.

Some of my thoughts on this technique

  • I like that this technique is really easy, really light-weight and doesn’t require any JavaScript or PHP.
  • The fact that I’m using a 1×1 transparent png fills me with the howling fantods as I remember spacer gifs.
  • Reusing a single tiny image all over the place has negligible bandwidth effects, but to be fair, I am making an “unnecessary” request to fetch it (at least once, until the browser caches it).
  • The data-src attribute is there to help with a side-effect of this technique: by default, things like Pinterest and Google Images can no longer grab your images (whether that’s a good or bad thing, I leave to you). By leveraging a .htaccess rule, you could serve the data-src value as the src for the Pinterest or various bot user-agents.
  • This system could work pretty easily with automated CMS systems: using a regex to replace a src attribute with a data-src attribute and injecting the 1×1 & an id is trivial (see the sketch after this list), as is auto-generating some CSS to handle the variations of the image for each media-query break-point – but that’s definitely more work than not doing anything on the CMS side and doing all replacements in JS or PHP on the front-end.
  • I like that I can easily update/replace any one image in the set without updating the HTML source anywhere.
  • This feels “too easy” to me. All the other solutions I’ve found use some sort of scripting, be it PHP or JavaScript. The fact that there’s nothing to post to GitHub here makes me feel like I’m doing something wrong.
  • Using background-image on images means that users can’t as easily download your image – right-click on one of these and most browsers won’t offer to save the image the way they do for a regular img element.
  • I worry that this is doing something unexpected for accessibility – but mostly it should be OK, I think, as there’s still an alt attribute, and it will still work fine with a longdesc attribute.
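
As a quick illustration of the CMS-side replacement mentioned above (a sketch only – the function name, the regex and the /transparent.png path are my assumptions, not code from my example page):

// Swap each img's src into data-src, point src at the shared
// transparent placeholder, and stamp an id so the generated CSS
// can target it. Assumes double-quoted src attributes.
function adaptifyImages(html) {
    var count = 0;
    return html.replace(/<img([^>]*?)src="([^"]+)"/gi, function (match, before, src) {
        count++;
        return '<img id="image' + count + '"' + before +
            'src="/transparent.png" data-src="' + src + '"';
    });
}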

I’m hoping for feedback on this from the world at large – as I said, I’m thinking out loud about this – it seems like a workable solution, so your feedback, thoughts and critique would be very much appreciated before I go and do anything silly like use this in a client’s project.

Measuring Page Load speed with Google Analytics

UPDATE: Google now has a built-in tool for doing this. Simply add the following: _gaq.push(['_trackPageLoadTime']); and hey presto: page-load-speed tracking. See this page for more info, including an important caveat.
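
For reference, that one-liner slots into the standard ga.js async snippet something like this (the account ID is a placeholder; check Google’s page linked above for the exact placement they recommend):

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXX-X']);
_gaq.push(['_trackPageview']);
// enable Google's built-in page-load-speed tracking
_gaq.push(['_trackPageLoadTime']);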

With Google now including page load speed in their page-ranking algorithm, many of our clients have started asking us how fast their sites load. There are lots of developer tools that can help with this – I use (alternately) the Google Page Speed tool and Yahoo! YSlow during development to tweak page load times. But those don’t help our clients very much.

There’s a chart you can use on Google Webmaster Tools, but overall, I don’t find Webmaster Tools particularly end-user friendly. As its name implies, I’m not sure that it’s supposed to be. A couple of our more technical clients use it, but most don’t use it, or don’t really get it. The chart that Google Webmaster Tools spits out can be found under “Labs”, then “Site Performance”. It looks like this:

Google Webmaster Tools Page Speed Chart

Which is nice, and gives a clear delineation of what’s good and what’s not. As you can see, this site rides right on the edge of being “fast”. When a client asked if I could tell him how fast individual pages were loading, it gave me the idea of using Google Analytics’ Event Tracking feature to log how fast a page loads. What’s nice about this is that most of our clients “get” Analytics, and are used to going there to see their stats. So additional stats there work for them.

With Google’s guidelines from that chart as a reference, I set about making this happen. I decided to name the Event Category “PageLoad”, and then defined five buckets (sent as the event’s action) to group the results:

  1. Fast (less than 1500 ms) – because Google, according to Webmaster Tools, considers anything 1.5s or faster “fast”
  2. Acceptable (less than 3000 ms)
  3. Middling (less than 5000 ms)
  4. Slow (less than 10000 ms)
  5. Too Slow (more than 10000 ms)

These groupings are completely arbitrary – I could have set them at any time spans. Those just seemed reasonable to me, knowing (from our internal tools) that most of our sites have an average load time of around 2800 ms.

So then I had to track it. The code is as follows:

var pageLoadStart = new Date();

window.onload = function() {
    var pageLoadEnd = new Date();
    var pageLoadTime = pageLoadEnd.getTime() - pageLoadStart.getTime();
    var loadStatus;
    // sort the load time into one of the buckets described above
    if (pageLoadTime < 1500)
        loadStatus = 'Fast (less than 1500 ms)';
    else if (pageLoadTime < 3000)
        loadStatus = 'Acceptable (less than 3000 ms)';
    else if (pageLoadTime < 5000)
        loadStatus = 'Middling (less than 5000 ms)';
    else if (pageLoadTime < 10000)
        loadStatus = 'Slow (less than 10000 ms)';
    else
        loadStatus = 'Too Slow (more than 10000 ms)';
    var myPath = document.location.pathname;
    if (document.location.search)
        myPath += document.location.search;
    // round the time to the nearest 10 ms
    pageLoadTime = Math.round(pageLoadTime / 10) * 10;
    // send the GA event
    try {
        _gaq.push(['_trackEvent', 'PageLoad', loadStatus, myPath, pageLoadTime]);
    } catch (err) {}
};

Some Notes:

  • I have the first line (var pageLoadStart = new Date();) at the very top of my header, right after the <head> tag. The window.onload() function sits in the footer.
  • I round my values to the nearest 10 ms – but this might be too many values, too noisy. So you could round to the nearest 100 ms or even the nearest second if that worked better for you (see the snippet after these notes).
  • The document.location.pathname (plus any query string) just passes the page’s address to Google, so I can see which pages are slow, or when particular pages are slow.
  • You can only send integers to the Event Tracking widget.
  • The _gaq.push() line is where I send this to Analytics. You might have your Analytics tied to a different variable, in which case change _gaq to whatever you use. For more on how event tracking works, see the docs.
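
For example, the coarser rounding mentioned in the notes above is a one-line change:

// instead of rounding to the nearest 10 ms...
pageLoadTime = Math.round(pageLoadTime / 100) * 100;   // nearest 100 ms
// ...or even:
pageLoadTime = Math.round(pageLoadTime / 1000) * 1000; // nearest second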

What my client then sees in Google Analytics is a chart like this (note – I’m using the new version of Analytics – yours might look different):

 

PageLoad chart

As you can see (this is for the home page of the site), the results are ALL over the place. Which is troubling, but figuring out why it sometimes loads sooooo sloooowwwwlyyyyy is a problem for another day. (We’re currently running an A/B test wherein a video auto-plays or not. My best guess is that the page doesn’t finish loading until after the video starts to play, which would explain the slow loads. But maybe not.)

Regardless of the actual performance of this particular page, you can see how this is a nice little chart for clients – nothing too technical (I could make it even less technical by not including the ms numbers in the labels), but it provides them some easy insight into how their site is doing.

Caveat: this chart obviously doesn’t capture anything that happens before the timer starts – DNS and connection setup, server-side scripts taking a long time to respond, and so on. But it’s still pretty useful, I think.

Vancouver Civic Election Polling Station Lookup

I don’t know if you’ve seen the City of Vancouver’s Polling Station Lookup Tool. It’s really, really terrible. And I figured that I could do something better. So I tooled around with the Google Maps API, drew up the regions, entered the polling stations into a DB, spent a couple hours coding and hey presto! I present to you: My Maps-based Polling station lookup tool!

Because I drew the voting division boundaries myself in Google Maps, there’s a peculiar bug (it stems from the algorithm I use to determine whether a co-ordinate is within a particular polygon): if your address is right on the boundary line between two divisions, it’s possible that the tool will push you into the wrong division. I suspect most of this stems from the mismatch between the precision of the geocoded address and the precision of the boundaries – the divisions have a couple of extra decimal places of precision, so the “dot” that is your address is comparatively big on the map.
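
For context, the point-in-polygon test I’m describing is the classic ray-casting (even-odd) check, which in JavaScript looks roughly like this – a generic sketch, not my exact code:

// Returns true if the point (lat, lng) falls inside the polygon,
// where polygon is an array of [lat, lng] vertex pairs.
// Points sitting exactly on an edge can land on either side,
// which is where the boundary-line fuzziness comes from.
function pointInPolygon(lat, lng, polygon) {
    var inside = false;
    for (var i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
        var yi = polygon[i][0], xi = polygon[i][1];
        var yj = polygon[j][0], xj = polygon[j][1];
        var crosses = ((yi > lat) !== (yj > lat)) &&
            (lng < (xj - xi) * (lat - yi) / (yj - yi) + xi);
        if (crosses) inside = !inside;
    }
    return inside;
}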

So, for 98% or so of all users, this will be accurate, but I do warn you – there’s the odd chance that this will return the wrong polling station. If you think you might be on a boundary, I did include the map of the voting divisions, where you can view the divisions as I drew them on the map – and you can see that it’s not perfect….

I’d been wanting to do something with the Google Maps API for a while – this was a good project to get my feet wet with, as I’ve got some upcoming work that will use this sort of stuff more extensively. As a side note, while I prefer the Yahoo! Maps API (I tried out both), the actual maps in Google Maps are so much more pleasant that I ended up using Google Maps.

So again, here’s the link to my Polling Station Lookup Tool.
