I’m an address bar guy; I don’t use bookmarks and barely use a link toolbar. And some days, my typing could be better – if/when the new gTLDs come in, the owners of ‘facebook.comk’ and ‘google.comk’ will get a good 50% of my surfing time, and a catch-all on ‘setfiremedia.comk’ would receive all of my internal emails. I’m just as bad with .co.uk – the amount of traffic I send to ‘news.bbc.co.il’ and ‘news.bbc.co.yj’ is shocking.

For Many, Search Engines Are The Web

Most people, of course, don’t use URLs this way. This isn’t because they are more or less error-prone in their typing, but because their primary experience of web navigation is different – search engines are the front door to the web for many; they are the web for even more. This holds true even when users do know the URL – we’ve all watched, head in hands, as people type fully-qualified URLs into Google.

Organisations have caught on to this, and it interests me that we are witnessing a shift in how they direct you to their online presence from offline sources – adverts, brochures etc. It seems especially prevalent in government ads – the Royal Navy’s current TV advertising campaign, for example, simply tells the viewer to “search for ‘navy jobs’” for further details. The assumption, of course, is that they will rank #1 in whatever search engine the user chooses. The risk is that somebody finds a way to hijack the listings for that term. It’s an old joke, but anyone who’s ever searched for ‘french military victories’ knows what I’m talking about. Even ensuring you’ve got the top PPC hit won’t compensate for that level of negative PR.

What’s interesting is that, for the sake of usability, organisations are choosing to add an additional stage to the process of finding their website. By directing people to search for a term and then click on the result, they’re actually lengthening the process in order to make it easier. Risk factors aside, it’s a pretty good idea, and a pretty major development.

What Does The Future Hold For URLs?

So is natural language taking over, and are all attempts to improve URLs futile? Has the slow progress of the regulating bodies meant that we’ve had to find a better solution? Of course, some URLs are clever slogans in themselves and add value to a product, but it seems that most are arbitrary identifiers, obfuscated by acronyms, abbreviations, dots, dashes and other de-humanised elements.

What do you reckon? Are the unbelievable sums of money spent on domain names wasted? Should Google Search stop using the URL of a site as an indicator of relevance? Or will their omnibar, or Mozilla’s Ubiquity, be the final nail in the URL coffin?