How To Easily Hide Client-side Content from Google Web/Mobile Preview Bot
var ua = navigator.userAgent.toLowerCase();
if (ua.indexOf('google web preview') == -1) {
    showContent(); // whatever content you want to hide from Preview Bot
}
The above won't run showContent() for Google's Preview Bot. Why might you want to do this? Well, I like to run surveys on some sites to gather qualitative feedback. I tend to use 4QSurvey.com for this (free, easy to set up, and gives great insight). The last thing you want is Preview Bot showing your website with the lightbox questionnaire open, so it's easy to block out. This is the code I use to load 4QSurvey asynchronously:
<script type="text/javascript">
(function(){
    var UA = navigator.userAgent.toLowerCase();
    if (UA.indexOf('google web preview') == -1) {
        var survey = document.createElement('script');
        survey.src = "http://4qinvite.4q.iperceptions.com/1.aspx?sdfc=[your code]&lID=1&loc=4Q-WEB2";
        survey.type = 'text/javascript';
        survey.async = true;
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(survey, s);
    }
})();
</script>
The above will not block page loading, and it ensures Preview Bot doesn't show a site preview with the pop-up. To learn more about Preview Bot see Google's Web Preview docs. The UA used for Preview Bot is "Google Web Preview (Mozilla/5.0 (en-us) AppleWebKit/525.13 (KHTML, like Gecko; Google Web Preview) Version/3.1 Safari/525.13)".
But implement it poorly and your site could be on the receiving end of a penalty or, worse still, an outright ban.
Using Text Replacement with Flash – Dangerous?
Flash is undoubtedly a far more aesthetically pleasing medium than plain text rendered in the browser. Although recent browsers from both Apple and Microsoft have introduced anti-aliased text rendering, most Internet users are still using non-aliased viewers. And this is where Flash can appreciably improve the user experience.
Flash replacement involves substituting plain text output with Flash-based textual content which uses anti-aliased fonts. (Anti-aliasing, for anyone unfamiliar with the phrase, basically means removing jagged edges from text.)
There are a number of techniques available for Flash replacement, sIFR (Scalable Inman Flash Replacement) and SWFObject being perhaps the best known.
sIFR uses JavaScript to detect and read the text content of a given DOM element (any piece of text on a web page, for example) and send that text to a small Flash module, which returns the content in Flash format.
The process is seamless and the user gets to view your headings and selected text in a nice anti-aliased Flash font.
On the other hand, SWFObject simply replaces any text node with a pre-compiled Flash movie. The text contained within the text node is superfluous and has no direct relationship with the Flash rendered. It is this technique that I came across recently when checking a site for an enquirer.
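To make the distinction concrete, here's a minimal sketch of the classic SWFObject (1.x) embedding pattern as documented by its author; the file name, dimensions and fallback copy are illustrative only:

<div id="flashcontent">
    <!-- This plain-text fallback is what non-Flash visitors
         (and search engine spiders) receive. -->
    Our products, described in plain HTML.
</div>
<script type="text/javascript">
    // Replace the div's contents with a pre-compiled Flash movie.
    var so = new SWFObject("products.swf", "products", "400", "200", "8", "#ffffff");
    so.write("flashcontent");
</script>

The crucial point is that the fallback text need bear no relationship to what the movie displays, which is exactly where the potential for abuse creeps in.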
Yesterday I wrote about SEO Ethics and how I feel that companies who promote competing websites are far more likely to cross into what in my opinion is unethical territory.
I came across the specific situation discussed in that post from an enquiry made on this site. I was asked to perform a quick analysis of a site (which will remain nameless). The site in question made heavy use of Flash. The site also used FlashObject (a prior incarnation of SWFObject). What I found on the site was content hidden with visibility:hidden. The pages were both unusable and displayed quite different content to users with and without Flash. Here's the exact CSS class applied to the main content:
.content {
    left: 0px;
    padding-left: 2px;
    padding-right: 2px;
    position: absolute;
    top: 0px;
    visibility: hidden;
    z-index: 1;
}
When I first came across this implementation I immediately emailed the enquirer back and asked that they contact the developer (a large, well-known Irish web company). After a few days I contacted the enquirer again to find out how the developer had responded. I'm not going to quote this directly, but you'll have to trust me that the following accurately reflects the response given:
Following up on your concerns that your website has hidden text, please be assured that your website is fully accessible to the search engines.
If you turn off JavaScript in your browser, the secondary pages of your website are returned.
The search engines' spiders view the HTML code of your website.
All areas of your site that use Flash do so with “Flash Replacement Text”, which is 100% search engine friendly.
I would also like to show you how you can see all of the pages that Google has indexed. Type site:www.yoursite.com into the Google search bar and you will see that every page of your website is indexed.
I hope that this helps to reassure you that your website is search engine friendly.
I want to deal with some of the items mentioned above to clarify exactly what the Search Engines are seeing, and what the official views are on certain implementations being used.
This is indeed true. There is no bar on Search Engines accessing and crawling the pages in question. It is also true that the search engines (and in this particular case I’m referring to Google, which represents c.90% of Irish search traffic) have been known to check your CSS files to look for anything untoward.
visibility:hidden is a very strong signal of spam. That property is used to hide content within the browser view. Here are the Google guidelines on hidden text:
Hiding text or links in your content can cause your site to be perceived as untrustworthy since it presents information to search engines differently than to visitors. Text (such as excessive keywords) can be hidden in several ways, including:
- Using white text on a white background
- Including text behind an image
- Using CSS to hide text
- Setting the font size to 0
[... ]
If your site is perceived to contain hidden text and links that are deceptive in intent, your site may be removed from the Google index, and will not appear in search results pages. When evaluating your site to see if it includes hidden text or links, look for anything that’s not easily viewable by visitors of your site. Are any text or links there solely for search engines rather than visitors?
In my opinion having text hidden in the version served to Google constitutes hidden text as defined in the guidelines and opens the offending site to the possibility of penalty or ban.
This is where the distinctions blur and opinions diverge. There is a lot of discussion on this topic over on Google Groups at the moment (see here and here).
Berghausen (a Google employee) has stated:
The goal of our guidelines against hidden text and cloaking are to ensure that a user gets the same information as the Googlebot. However, our definition of webspam is dependent on the webmaster’s intent. For example, common sense tells us that not all hidden text means webspam–e.g. hidden DIV tags for drop-down menus are probably not webspam, whereas hidden DIVs stuffed full of unrelated keywords are more likely to indicate webspam.
I bring this up because, although your method is hiding text behind a very pretty Flash animation, you are still presenting the same content to both the user and the search engine, and offering it through different media.
On the face of it, it would appear that Flash replacement shouldn't be an issue. On the face of it…
Google’s Dan Crow (head of Crawl) recently attended a SEMNE group event on the subject of ‘Getting Into Google’. Apparently he was very frank on a number of issues, one of which was Flash replacement. SherwoodSEO attended the event and reported the following:
- sIFR (scalable Inman Flash Replacement) – sIFR is a JavaScript that allows web designer to customize the headlines displayed on their pages. Headline text rendered in HTML can look blocky and unrefined – sIFR paints-over that HTML with a Flash-based equivalent. This gives the headline a smooth, refined look, while still preserving the indexable text that Google needs to process the page. Dan said that sIFR was OK, as long as it was used in moderation. He said that extensive use of sIFR could contribute negative points to your website’s overall score. Yes, that’s a bit vague, but “vague” is not as bad as…
- SWFObject – SWFObject is a more elaborate JavaScript designed to swap-out an entire section of Flash with its HTML equivalent. Think of the Flash section of a webpage as being painted on a window shade. SWFObject decides if you have Flash installed (i.e. you are a web surfer) or not (i.e. you are a search engine.) If you don’t have Flash, the window shade rolls-up, and an HTML text equivalent is displayed on-screen. Dan pulled no punches on SWFObject: he characterized it as “dangerous.” He said that Google takes great pains to avoid penalizing sites that use technical tricks for legitimate reasons, but this was one trick that he could not guarantee as being immune from being penalized.
Now, when the head of Google Crawl says that a particular technique is "dangerous" and cannot be guaranteed "immune from being penalized", I sit up and take note. Dan Crow is in charge of Google's entire fleet of Googlebots. In my opinion his comments carry considerable weight.
If using SWFObject has been classified as "dangerous", what might happen when you use this implementation AND use visibility:hidden for the text replaced by the Flash? Well, in my opinion this implementation won't improve your standing with Google.
Google bans sites every day. I regularly contribute over on Google's Webmaster Help Group and see cases of banned sites every other day. Often threads are started by webmasters whose sites have performed well for months or years. Then suddenly, without any change to their site, they are dropped from the index.
My point is that indexation does not guarantee that your page hasn’t broken the guidelines. A penalty can be applied at any time. And when it does it hurts.
I spent quite some time both analysing and researching the issues at hand (time that could and should have been applied elsewhere). Given that the developer of the site also happens to be the supplier of SEM services referred to in my SEO Ethics post, I can't say with any certainty that their responses to this situation were genuine. If they were, then it displays ignorance or incompetence at best. If not, then I think their ethics must be called into question.
Offline RSS Reading – Google Gears & Greader
Google launched Gears recently. Gears enables the off-line storage of on-line data. This means that web-based apps can now take a cache of data and store it locally.
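For the curious, here's a minimal sketch of the Gears database module as per Google's launch documentation; the database and table names are my own, for illustration:

<script type="text/javascript" src="gears_init.js"></script>
<script type="text/javascript">
    if (window.google && google.gears) {
        // Create a local SQLite-backed database and cache an item in it.
        var db = google.gears.factory.create('beta.database', '1.0');
        db.open('feed-cache');
        db.execute('CREATE TABLE IF NOT EXISTS Items (title TEXT, body TEXT)');
        db.execute('INSERT INTO Items VALUES (?, ?)', ['A post title', 'Post body text']);
    }
</script>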
The main reason I installed Gears (it runs as a browser extension with a local DB server) was so that I can read some of the feed backlog that’s been building lately. After installing Gears Greader now gives me a new option up in the utility navigation:
That small arrow within a green oval is the selector for on-line / off-line reading. Clicking it begins the magic that is Gears:
Funnily, Greader doesn't give you the total number of unread posts when on-line – you simply get the 'more than 100' reference:
But in Gears off-line mode it does – I have 1,347 unread posts in my reader:
And when you go back on-line just click the toggle button to re-sync back up and (hopefully) reduce that unread count:
If you’re new to RSS and syndicated content my Really Simple Guide to RSS might be worth reading.
This is really fantastic in my view. The ability to access on-line data when off-line has always been the holy grail. While others have also managed to implement similar functionality (Dojo comes to mind), when Google cracks this the likelihood of mass take-up increases exponentially.
Gears is certainly helping me get through all those unread posts (believe me, the number was far higher), but my one quibble is that the posts don't list in chronological order. It appears as though feeds are clumped together in off-line mode. I'm sure it's just a wrinkle that will be ironed out as Google improves the product iteratively over time.
I wonder what Google’s spam-fighting supremo will report from his Gears use on the flight to SMX. [Wish I was going to that....]
Use Javascript where you should really be using plain HTML? Well here's a nice $65m example that teaches a good lesson.
$65m, And Some Of The Worst Use Of Javascript Ever
Take for instance all the great apps Google makes freely available. Gmail would still work, but the crippled Javascript-free version doesn’t cut it for me. And then there’s Reader, another of my most used apps. It doesn’t degrade quite so gracefully. Actually, like Calendar, it doesn’t degrade at all.
Complicated web apps can be forgiven for not degrading in the absence of Javascript. By their nature they rely on JS to handle heavy-lifting functions. For more elementary functions, graceful degradation can deliver similar functionality. Personally, I favour scripting to the DOM after the web page loads, replacing HTML functionality with a Javascript alternative that adds behaviour to the page. That way your site remains functional without Javascript.
But what happens when you use Javascript when you really shouldn’t?
Back in 2006 one of Ireland’s largest Internet acquisitions saw Ireland.com, the web property of The Irish Times newspaper, purchase MyHome.ie for a reported $65m. I covered the story here. MyHome.ie is (apparently) an extremely profitable property (real-estate) website.
But apparently MyHome.ie has a smaller sister, a little known site called MyHome2Let.ie. And perhaps little known for good reason. Here’s a shot of their site:
Looks innocuous enough. But the kicker is in how the primary navigation is coded:
<td id="mainmenu_td_t0" class="mainmenu mainmenu_t_on" valign="middle" align="center" onclick="window.open('http://www.myhome2let.ie/','_self');" onmouseout="mainmenu_Roll(0);" onmouseover="mainmenu_Roll(0);">HOME</td>
<td width="1" rowspan="2">
</td>
<td id="mainmenu_td_t1" class="mainmenu mainmenu_t_on" valign="middle" align="center" onclick="window.open('http://www.myhome2let.ie/search/', '_self');" onmouseout="mainmenu_Roll(1);" onmouseover="mainmenu_Roll(1);">SEARCH</td>
<td width="1" rowspan="2">
</td>
<td id="mainmenu_td_t2" class="mainmenu mainmenu_t" valign="middle" align="center" onclick="window.open('http://www.myhome2let.ie/advertise/', '_self');" onmouseout="mainmenu_Roll(2);" onmouseover="mainmenu_Roll(2);">ADVERTISE</td>
<td width="1" rowspan="2">
</td>
<td id="mainmenu_td_t3" class="mainmenu mainmenu_t" valign="middle" align="center" onclick="window.open('http://www.myhome2let.ie/services/', '_self');" onmouseout="mainmenu_Roll(3);" onmouseover="mainmenu_Roll(3);">USEFUL SERVICES</td>
<td width="1" rowspan="2">
</td>
<td id="mainmenu_td_t4" class="mainmenu mainmenu_t" valign="middle" align="center" onclick="window.open('http://www.myhome2let.ie/info/', '_self');" onmouseout="mainmenu_Roll(4);" onmouseover="mainmenu_Roll(4);">INFO & ADVICE</td>
In case you're wondering what all that code does: it does what should have been accomplished in about 10% of that mark-up. It instructs the browser, in the most convoluted way possible, how to handle the primary navigation links that you can see in the image.
But worse still, those links, the primary site navigation links, are implemented in Javascript. Turn off JS and the site no longer works. You simply can’t navigate the site.
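For contrast, here's a sketch of how the same navigation could be marked up so it keeps working without Javascript (rollover effects can then be layered on via CSS :hover or DOM scripting after the page loads):

<ul id="mainmenu">
    <li><a href="http://www.myhome2let.ie/">HOME</a></li>
    <li><a href="http://www.myhome2let.ie/search/">SEARCH</a></li>
    <li><a href="http://www.myhome2let.ie/advertise/">ADVERTISE</a></li>
    <li><a href="http://www.myhome2let.ie/services/">USEFUL SERVICES</a></li>
    <li><a href="http://www.myhome2let.ie/info/">INFO &amp; ADVICE</a></li>
</ul>

Plain anchors like these work in any browser, with or without Javascript.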
And the real cherry on this particular pie is that search engines have never handled Javascript very well. Do you think that any of the pages linked to via the Javascript code are cached by Google? I'll let my silence answer that question.
Javascript is a great language. Used well, it expands the potential of the web massively. But make sure you use HTML where it was designed to be used, and Javascript only when you need to do some heavy lifting that HTML can't handle.
You probably won't know what all the fuss is about. But believe me when I say that this is going to go into a textbook someday. And it won't be under a 'best' title.
Blind People Can’t Eat Chocolate
That’s the view you get if you visit their site in Lynx.
But as if that's not bad enough, there is an even worse issue with this site which will seriously affect Lily O'Brien's at the business level.
Being at SES London, and meeting guys who are literally making millions from SEO (some of these guys are billing €1k per hour), is teaching me that I shouldn’t be just giving away my knowledge. So in this case I’m going to keep my powder dry for the moment.
I think Pat is on to an absolute winner. And with a couple of tweaks to the website I think this will become even more of a no-brainer for customers.
But is my advice any good? You decide (and there's a free link for anyone who can figure out what I'm thinking at the end :mrgreen:)
The Tale of a Link Whore, a Mobile PC, a Site Review, and some Clever Market Disruption
Pat: Hi Richard. Would appreciate any mention. Am beginning to feel like a link wh***
http://blog.roam4free.ie/win-a-pocket-pc-with-roam4free
Me: May I ‘dismantle’ it? It might not be pretty
Pat: Lmao Please do
Of course I’m not one to turn down a free sacrificial lamb, so…
Before I go any further here’s my policy on posting and linking out on request. If something is worthy of a post or a link I’m normally quite happy to oblige. There, that was easy.
Roam4Free.ie is a great idea. I've been with O2 since they were Digiphone, and for many years was always over-charged on my roaming. And I received many a refund to prove it.
(Fortunately all O2's roaming partners 'set' the same price a year or two back – how anti-competitive can they get?)
I think Pat is on to a winner. But I also think he really has to nail the website because (and I’m assuming here) it is the primary sales channel.
Oh, and by the way, my comments here could easily apply to any website, so I hope they might be useful to other readers, not just Pat.
Well, more often than not it's the page title. As one of the top elements in any HTML page, the title is the very first on-screen element to be populated in your browser.
When I visit Roam4Free.ie I see this title:
Welcome to Roam4free.ie – The end of sky high roaming charges !
Two things strike me. First, there are no targeted keyword phrases in that title. Other than 'roaming charges', which I doubt people search on, there is no reason for people to discover Pat's service via the #1 Internet gateway – search engines.
So what phrases might I suggest to Pat?
Well, I can see that 'roaming charges' and broad matches have very little volume. But the terms 'international sim cards', 'mobile international' and 'cell international' (cell = mobile in the US), and a number of long-tail derivatives of those phrases, have fairly good volume (thousands per month).
Now just for a minute I’m going to take off my SEO cap and put on my marketing cap. Pat is doing a great job of promoting this (Read/WriteWeb just popped up in my reader). So he’s going to get traffic. Therefore he needs to balance the SEO stuff with pure marketing. And the page title can be a powerful marketing tool.
The title has to quickly establish the product’s benefits for visitors. It should also attract some Search Engine Love if possible. Here are some of my suggestions:
Reduce your International Cell & Mobile Roaming Charges by up to x% with Roam4free.ie
or maybe:
Turn your Mobile into a Free International Cell Phone with Roam4Free
Both of those titles include some relevant keywords that might help with SEO efforts. But more importantly, they both tell the visitor exactly what the product does in simple English, and include a clear call to action. I always think the best way to get your message across to the widest audience is to speak in plain, simple language (and pop a couple of nice high-volume keywords in there for good measure :mrgreen:).
I posted about the benefit of using good marketing copy in your META description tag a while ago, and I think Pat could look at editing his current Description:
Works in over 115 countries. Receive calls for FREE in over 65 countries. Up to 90% discount on standard mobile rates
I would spell it out – mention ‘international sim card’ somewhere in that copy. (Good use of upper-cased ‘FREE’ though.)
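Putting the title and description suggestions together, the relevant head elements might look something like this (the copy is purely illustrative, not a finished recommendation):

<title>Turn your Mobile into a Free International Cell Phone with Roam4Free</title>
<meta name="description" content="Roam4Free international sim card: works in over
115 countries, FREE incoming calls in over 65 countries, up to 90% off standard
mobile roaming rates." />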
One other point worth mentioning here is that different pages can effectively become honey pots for various search phrases (you should always try to target different phrases on your various pages). And while I'm on the subject, remember that people can land on any page, not just the homepage, so you should consider every page a selling opportunity.
The homepage makes good use of contrast and visual boundaries to break up the main page areas:
I think I have made it clear on numerous occasions that I am not a big Flash fan. The Flash image on the homepage (sorry, you can't see it in the image above) really doesn't reinforce the copy on the page. A static image of a sim card would have the same effect in explaining the product. And as for placing static text within the Flash file – silly, silly.
And then there's the font colour, which I feel is too close to the background colour (blue on blue in places).
Personally I think the homepage should be the seller. If you can convert from the homepage you’re on to a winner. The more pages people check out the more opportunity they have to reconsider that purchase decision.
So how would I make the homepage sell? Perhaps a three point storyboard that explains the product, how to get it, and how to use it:
Yes, I know all that info is already there, but I think it needs to be simplified and given more prominence on the page. Make it feel as simple as possible – 1, 2, 3. Get prospects into the comfort zone.
I would place all the ancillary info into the appropriate story element above, e.g.:
- Step one, 'Get a Sim Card': 'Use in over 115 countries.', 'Compatible with most mobile phones and networks across the globe.', 'No call set-up.'
- Step two, 'Add Credit': 'Easy to use. Top up from wherever, whenever.', 'Per-minute billing. Save up to 90% on standard mobile rates.'
- The final step: 'No line rental. No minimum contract. No hidden costs – just FREE incoming calls in over 65 countries, and up to 90% off standard mobile rates.'
OK, you might have to edit this last group a little. But the point is to keep the decision process as simple as possible and the purchase path short. Give customers the info needed to make the all important purchase decision without leaving that homepage. (Not sure if implementing the pricing would be possible here though?)
Take a look at the image above. Apart from there being no obvious homepage link (we read left to right and expect the homepage link to appear top right LEFT of the page), (oops, a little typo there – I’m slightly dyslexic…) I can see the most glaring Achilles heel. But before I declare my hand, some history.
The Internet has been around for a while now, and over time a number of conventions have formed and been widely accepted. The most widely accepted convention is how to link. Unfortunately Roam4Free.ie breaks that convention, badly.
Take a look at the navigation bar:
Do you see the link for signing up? Well it’s there all right. But if you are like most Internet users you glance rather than read, and you’d be forgiven if you missed the sign up link.
It’s actually there at the top of the navigation bar: ‘New User ? Buy a sim to get an account today!’.
The link is not underlined, and worse still, it uses the same color as the labels on the login form. One of the most important links on the site, ‘Sign Up’, doesn’t look like a link at all, it looks like plain text.
This page is straightforward and to the point. In fact it's a little thin on content – there's a lot of free real estate there, so I would consider increasing the font size to make reading easier. The font size should also be varied to give a visual cue as to the importance of the various text elements.
But this next bit pisses me off. When you visit a site you have a goal. You want the shortest and quickest path to achieving that goal so you can move to your next goal.
So every moment of time wasted due to poor design reduces the goodwill you have toward the site in question. On the purchase (sign-up) page I am given a link to 'More Details'. Here's what I get:
How has that improved my experience? I just wasted two clicks – one to view a useless page, and another to get back to where I came from. And a small fraction of visitors won't bother to go back.
Try this without Javascript. OK, maybe I’m a little pedantic on this one, but what about mobile browsers? After all, mobile users are the target market here, and I do think mobile Internet might catch on sooner or later…
Nice use of XMLHttpRequest though.
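The fix follows the same principle as ever: make the plain HTML work first, then enhance with script. A sketch of the pattern (the form action and field names here are hypothetical, not taken from the Roam4Free site):

<form id="coverage" action="/coverage-check" method="get">
    <input type="text" name="country" />
    <input type="submit" value="Check coverage" />
</form>
<div id="result"></div>
<script type="text/javascript">
    // Without Javascript the form submits normally; with it, we fetch in-page.
    document.getElementById('coverage').onsubmit = function () {
        var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                        : new ActiveXObject('Microsoft.XMLHTTP');
        xhr.open('GET', '/coverage-check?country=' +
                 encodeURIComponent(this.country.value), true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState == 4 && xhr.status == 200) {
                document.getElementById('result').innerHTML = xhr.responseText;
            }
        };
        xhr.send(null);
        return false; // suppress the full-page submit only when scripting works
    };
</script>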
Yes! It’s a great idea, and I hope it catches on.
My criticisms of the website might seem harsh. I only checked a few pages TBH, and I've seen far worse. It's a nice site, and with a few tweaks it could probably really pull in traffic that converts.
I see an affiliate system is also in the offing. That should push the boat out further as those clever affiliate people target some of the juicy long-tail phrases I noticed.
On a final note, the site has one thing I haven’t mentioned that I think could be a huge asset and very serious linkbait. I’m not going to say what, but I might whisper it to Pat at some stage :mrgreen:.
(If anyone can guess what it is I’ll give them any link they request – no baddies though.)
The full study is available for download from within this post.
eGovernment Accessibility Analysis
The websites of a number of Government Departments, Agencies and Political Parties were tested for accessibility and coding standards. The sites were also checked for contemporary web technologies such as RSS.
Results Overview:
Government Department websites tested: 16
Valid CSS, (X)HTML & passing WCAG 1.0 Level A: 4 (25%)
Sites passing WCAG 1.0 Level A: 12 (75%)
Sites utilising RSS: 4 (25%)

Other Public websites tested: 18
Valid CSS, (X)HTML and passing WCAG 1.0 Level A: 0 (0%)
Sites passing WCAG 1.0 Level A: 12 (67%)
Sites utilising RSS: 2 (11%)

Political Party websites tested: 7
Valid CSS, (X)HTML and passing WCAG 1.0 Level A: 0 (0%)
Sites passing WCAG 1.0 Level A: 3 (43%)
Sites utilising RSS: 3 (43%)
There is one entity that impacts daily on each of our lives. That entity is the Government.
The Irish government is the body tasked with the administration of the land of Ireland. As such the government is responsible for making the law, enforcing the law, and maintaining the welfare of the citizens. It is no surprise that the interface between citizen and government is one of the most important elements of any political system.
The first Information Society Action Plan was published in January 1999, and by November 2001 Ireland had become the top performer in an EU benchmarking report on public service on-line delivery.
In March 2002 the Irish Government published “New Connections – A strategy to realise the potential of the Information Society”. The document set forth an action plan identifying key infrastructures that required development, one of which was eGovernment.
eGovernment refers to the use of information and communication technology (ICT) as an interface between the citizens and government of a nation. Most often the term refers to the use of the Internet as a communication platform to allow the exchange of information and the execution of processes that had previously been undertaken via direct human interaction.
Introduction of eGovernment is an EU-level policy, and part of a broader EU strategy to make Europe the most dynamic and efficient economic bloc in the world. ICT is seen as the key facilitator of this strategy:
The successes and potential of eGovernment are already clearly visible with several EU countries ranking amongst the world leaders. Electronic invoicing in Denmark saves taxpayers €150 million and businesses €50 million a year. If introduced all over the EU, annual savings could add up to over €50 billion. Disabled people in Belgium can now obtain benefits over the Internet in seconds, whereas previously this took 3 or 4 weeks. Such time savings and convenience can become widespread and benefit all citizens in Europe in many public services. (Source: COM(2006) 173 final)
ICT is also seen as an enabler and facilitator of inclusive strategies as set out by the EU.
The 2002 document makes a number of references to the availability and accessibility of government websites:
The National Disability Authority is a statutory agency tasked with policy development, research and advice on standards designed to safeguard the rights of people with disabilities.
The purpose of the study is to measure the accessibility of the primary government agency websites. The websites of the main political parties were also tested as those organisations are inherently connected to the administration of a democracy through their stated goals and policies.
The following tests were conducted to ascertain a measure of web standards and accessibility:
In this study the 3 automated accessibility validators were used, in some cases supplemented by manual evaluation in the Lynx text browser. Tests were limited to the homepage of each site (in some cases an inner page was tested – e.g. where a splash page was used and the home page was therefore an inner page). All tests were conducted during the period 20-31 November 2006.
While these tests cannot be guaranteed to properly ascertain the accessibility of any webpage, they do serve to highlight a number of flaws that would ordinarily render a page inaccessible via screen-reading technology.
The Internet is evolving. Buzzwords such as web2.0 are commonplace. In my view what we are seeing is not a change but a natural progression. Today's Internet is about interaction, multiple-way dialogue, and innovative communication channels.
This study therefore includes tests for interactive techniques and alternative distribution channels.
RSS
Homepages were checked for RSS (Really Simple Syndication) feeds. RSS is fast becoming the de-facto transport for on-line information syndication (note the recent integration of RSS into the latest browsers from both Microsoft and Mozilla). In cases where a feed was not apparent on a homepage the Press section (or similar) was also checked.
It would seem both appropriate and desirable that any entity which relies on news agencies to broadcast their message would utilise RSS.
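Publishing a feed and making it discoverable is trivial: a single link element in the page head lets feed-aware browsers and readers find it automatically (the title and href below are illustrative):

<link rel="alternate" type="application/rss+xml"
      title="Press Releases" href="http://www.example-department.ie/press/rss.xml" />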
Blogs
While not appropriate for every context, blogs have been found to add transparency and openness within a political setting. Blogs also allow for meaningful dialogue between writer and audience.
Real-Time Chat
Used by the software industry for many years, real-time chat facilities allow Internet users to ‘chat’ with a support agent through a real-time messaging system.
[NOTE: Please click on the above image for a larger resolution and an alternative accessible version.]
The study tested a total of 41 websites: 27 sites passed the automated WCAG 1.0 Priority 1 (A) validation tests.
Of the Government Department websites tested 12 from a total of 16 were compliant with WCAG 1.0 Priority 1 (A).
The lack of RSS feeds on 12 department websites was a particularly odd result given the relationship of Government with the public and press, and the Government’s need to shape public perception through the news channels.
The websites of the main political parties were found to be lacking in terms of contemporary Internet technologies: Only 3 of the 7 party websites included an RSS feed and none offered multiple feeds targeting different content and audiences.
4 of the 7 party websites tested failed WCAG 1.0 Priority 1 (A), and none validated for valid CSS/HTML coding standards.
A positive feature of this survey was the number of Government websites that aspired to a higher standard of validation than the basic WAI WCAG 1.0 Priority 1 (A).
At least 4 sites displayed WCAG Priority 2 (AA) badges on their homepages. Unfortunately only 1 actually attained that level of accessibility.
At least 2 sites displayed or laid claim to WCAG Priority 3 (AAA) accessibility, the highest level of accessibility; however, none validated to this standard.
Some websites tested stated a clear aspiration to achieve high accessibility and informed visitors of the ongoing effort toward attaining that goal.
Validation is a binary test – a site either validates or it does not. In some cases failure can be remedied with minimal effort, while in others achieving compliance with both WAI WCAG 1.0 and W3C coding standards will require a substantial undertaking.
Creating a website that complies with WCAG is perhaps the easier phase of providing an accessible website. Maintaining WCAG compliance is by far the most difficult area of website accessibility, even more so given the dynamic nature of many of the sites tested.
Web standards, such as those developed by W3C and WAI, are the foundation of the ‘Inclusive Web’. Websites which comply with these standards will ensure that the broadest spectrum of visitors can access their information and benefit from the full potential the Internet has to offer.
In cases where accessibility anomalies were flagged by automated evaluation tools the site in question was manually evaluated in the Lynx text-browser.
The search facility on a number of Government sites was found to cause practical accessibility issues:
1. Department of the Taoiseach:
Here is the mark-up for the search feature:
<form id="basicSearch" action="search.asp" method="get">
<div class="searchTop"><label for="searchWord" accesskey="4" /></div>
<div class="searchMiddle"><input class="searchFormInput" type="text" name="searchWord" id="searchWord" size="16" value="Enter keyword" /></div>
<div class="searchBottom"><input type="image" value="submit" name="search_go" id="search_go" src="/images/search/button_search.gif" alt="Search" /></div>
</form>
This is Andy Harold’s opinion on the above code:
This is an attempt to resolve the need to have a label tag and to put some default text in the text field. But appears to be done purely to satisfy accessibility checkers than real life requirements, and may even upset some screen readers. I’d say this is poor practice. The label should have some text within it and there shouldn’t be a ‘value’ attribute in the text field.
Putting default text in comes from 10.4 (Priority 3): Until user agents handle empty controls correctly, include default, place-holding characters in edit boxes and text areas. But this became outdated almost as soon as it was written, because all the user agents used by people with sight difficulties can handle empty controls. So the use of label tags meets all needs.
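Applying that advice, a sketch of how the repaired mark-up might look, with visible text in the label and no default value attribute (the label text is my own suggestion):

<form id="basicSearch" action="search.asp" method="get">
    <div class="searchTop"><label for="searchWord" accesskey="4">Search:</label></div>
    <div class="searchMiddle"><input class="searchFormInput" type="text"
        name="searchWord" id="searchWord" size="16" /></div>
    <div class="searchBottom"><input type="image" value="submit" name="search_go"
        id="search_go" src="/images/search/button_search.gif" alt="Search" /></div>
</form>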
2. National Disability Authority
The mark-up powering the search facility:
<label for="query" accesskey="4">
<input name="q" id="query" title="Enter keywords to search for" value="" size="30" type="text">
<input title="Submit your search and view results" value="Search" type="submit">
</label>
Andy Harold’s opinion:
Enclosing inputs within a label is allowed by the standards so that you don't have to supply a 'for' attribute, as the label implicitly refers to the enclosed input. Having the two inputs enclosed by the label, as in your example, makes this confusing. The fact that there is no text in the label tag makes this more confusing still. So although technically you can do this – ie it passes automatic validation tests – it's not the correct use of the label element and so wouldn't be what a user agent (eg a screen reader) would be expecting, and so may cause it problems. So, on that basis I wouldn't pass it as P3 simply because it makes little sense.
Remember that the standards can’t cover every situation and so are purely there to guide you into making good decisions. In this case you could put some text in the label (and take the input elements out of it) if you really want it to be passed as P3. But if this makes the search facility too visually unappealing, just drop the label altogether. This may not make it technically ‘P3′ but more importantly it will still be accessible because of the title attribute, so it shouldn’t matter.
3. Pobail
Here is the Lynx view of the English version Pobail homepage:
And here is the underlying mark-up:
<label for="search">
<input type="text" name="qt" id="search" value="" maxlength="1991" />
</label>
<input type="submit" value="Go" class="submit" />
While the search element may pass automated validators, the form itself has little value to users of screen-reading technologies. The 'Advanced search options' link is in another div.
[NOTE: Andy Harold is the developer of Total Validator. The tool is available as either a free Firefox plug-in or a professional desktop application.]
Page URLs
Government Departments
Foreign Affairs, Dept. of
Agriculture and Food, Dept. of
Arts, Sport and Tourism, Dept. of
Communications, Marine and Natural Resources, Dept. of
Health and Children, Dept. of
Education and Science, Dept. of
Enterprise, Trade and Employment, Dept. of
Environment, Heritage and Local Government, Dept. of
Finance, Dept. of
Defence, Dept. of
Justice, Equality and Law Reform, Dept. of
Community, Rural and Gaeltacht Affairs, Dept. of
Taoiseach, Dept. of the
Transport, Dept. of
Social and Family Affairs, Dept. of
Government Informational Portals
Business Access to State Information and Services
Public Service Information for Ireland
Other Government Websites
Office of the Revenue Commissioners
Official website of the President of Ireland
The Courts Service of Ireland
The Office of Public Works
Central Statistics Office
Political Party Websites
Fianna Fail
Fine Gael
The Labour Party
The Green Party
Progressive Democrats
Socialist and Workers Party
Sinn Fein
Websites Highlighted in the Information Society Action Plan
Revenue Online Service (ROS)
FÁS e-recruitment
Land Registry
Examination results
CAO (Central Applications Office)
Driving tests
Government Contracts
Public Service Recruitment
National Sheep Identification System (NSIS)
eForms
Welfare.ie
Libraries
Infrastructure.ie
Farmer IT Training
Does Google Know Your MSN & Y! Searches?
It is important to follow the technical aspects of search engines. There is undoubtedly one person who is the authority on both today's technology and the technology the search engines are building to serve us tomorrow: Bill Slawski of SEO by the Sea.
SEObytheSEA specialises in patent watching. Yesterday I saw Bill Slawski’s post about Microsoft snooping Google search history. It’s quite interesting from a number of perspectives. But first a little background on what’s going on.
It appears that Firefox has a little-known service called search suggest. Search suggest is controlled via the browser.search.suggest.enabled parameter and basically allows third-party access to the search history of your search bar.
So whenever you use the built in search bar of Firefox the search query is added to your history so that suggestions can be made based on your prior behaviour.
Now this is where it gets interesting. Apparently Firefox allows third party search plug-ins access to your history so that they too can offer suggestions based on your previous searches. But whereas you might presume that one search engine wouldn’t, or shouldn’t, have access to searches executed on another, well, you’d be wrong.
Apparently Microsoft Live suggested some of Bill's previous Google queries. Bill then saw that his search history was being sent to Microsoft Live via the browser.search.suggest feature of Firefox. That feature transports your history via a JSON-encoded file when it is turned on.
Of course SEObytheSEA is renowned for its coverage of search engine patents. Lo and behold, haven't Microsoft a patent (published November 16) entitled 'System and method for automatic generation of suggested inline search terms'.
The default setting of browser.search.suggest.enabled is TRUE in the latest version of Firefox (2.0). (This can be changed via about:config.)
This means that if you are using the built in search bar, a search engine can see your query history regardless of whether it executed those queries. From the SEO by the SEA post:
I performed a search in Windows Live for a term that I don't believe I ever searched for before on a search engine. I then went to Google Suggest, and started typing in the first couple of letters of that word to see if it would suggest my Windows Live search term.
It did.
While most people understand that additional toolbars (e.g. Google Toolbar) commonly track your behaviour, it may not be apparent that your search history is made available via this relatively unknown feature of Firefox 2.0.
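If you would rather switch this off, the same preference mentioned above can be set permanently in the user.js file of your Firefox profile (or toggled by hand in about:config):

// user.js in your Firefox profile directory:
// stop the search bar sharing query history with suggest services
user_pref("browser.search.suggest.enabled", false);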
Of course it’s not as if the major search engines aren’t already collecting enough data on us….
[Some concerned viewers might be interested in the CustomizeGoogle plugin for Firefox.]
Golden Spider Awards – Are These Really Ireland’s Best Websites?
Total Sample: 128 sites;
Valid CSS: 33 (26%);
Valid HTML: 12 (9%);
Valid Section 508: 28 (22%);
Valid WCAG 1.0 Priority 1: 26 (20%);
Valid WCAG 1.0 Priority 2: 4 (3%);
Valid WCAG 1.0 Priority 3: 3 (2%);
Valid TV Core: 37 (29%);
Valid TV HTML: 15 (12%);
Valid TV WCAG 1.0 Priority 1: 43 (34%);
Valid WAVE Overlay: 27 (21%);
Sites with consistent mark-up: 76 (59%);
CSS, HTML, Section 508 & WCAG 1.0 Priority 1 Compliant: 6 (5%)
Special mention should be made of both www.ssiaoptions.ie and www.ulsterbank.ie. Both sites validated to WCAG 1.0 Priority 3 standard and require only limited changes in order to become fully compliant with all tests conducted.
This analysis was born from the comments left in this thread about W3C standards compliant coding.
Golden Spiders Analysis Notes
#1 Framed Site
#2 Unable to test framed site in WAVE
#3 Very minor issue possibly from CMS
#4 Site not accessible without JavaScript enabled
#5 Very minor error in CSS
#6 Very minor omission
#7 In-line CSS could not be validated
#8 In-line CSS could not be validated
#9 Parser failed to access file
#10 Parser could not access file – possibly behind a firewall
#11 No CSS used on page
#12 Flash Website with no HTML alternative
#13 Parser failed to parse file.
#14 No CSS used on page
#15 Could not validate – maybe __VIEWSTATE value?
#16 Validation errors were caused by non-critical image elements. But the ALT attribute was missing, not empty.
#17 LABEL for attribute != input ID attribute
#18 Empty LABEL tag
#19 WAVE couldn’t parse page overlay
#20 Cynthia – HTTP Transfer Error – 12007: [12007] Internet Name Not Resolved.
Golden Spiders Take #4
(NB You can click on the image for a larger version.)
The final four categories contained 29 websites of which 1 passed the benchmark for standards compliance – www.primaryscience.ie.
I will post a complete ranking overview of the final results shortly.
Golden Spiders Take #3
Please note that I have made a very slight change to the legends used. ‘DUAL-CASE’ has been changed to ‘SINGLE-CASE’ to denote the use of consistent type-case throughout the HTML document.
(NB Again you can click on the image for a larger version.)
In these four categories not one website fully passed the benchmark for standards compliance.
Quite notable, however, was the Professional Services category. Four of the eight sites short-listed passed both Section 508 and WCAG 1.0 Priority 1. A number of these sites also displayed high coding standards.
It is possible that the professional service firms have a better grasp of accessibility standards, perhaps due to the inherently human nature of many of these businesses.
Categories 9, 10 and 12 contain primarily Internet-based businesses. Thus far these categories are tending toward the least standards compliant.
Total Sample: 99 sites;
Valid CSS: 26 (26%);
Valid HTML: 10 (10%);
Valid Section 508: 20 (20%);
Valid WCAG 1.0 Priority 1: 18 (18%);
Valid WCAG 1.0 Priority 2: 4 (4%);
Valid WCAG 1.0 Priority 3: 3 (3%);
Valid TV Core: 32 (32%);
Valid TV HTML: 11 (11%);
Valid TV WCAG 1.0 Priority 1: 32 (32%);
Valid WAVE Overlay: 23 (23%);
Sites with consistent mark-up: 59 (59%);
CSS, HTML, Section 508 & WCAG 1.0 Priority 1 COMPLIANT: 3 (3%).
This sample includes the Best Web Design Agency category.
Golden Spiders Take #2
After a somewhat disillusioning start to my analysis of the Golden Spiders short-listed websites I’m quite sure that things will be getting better.
The next 4 categories include the Best Web Design Agency group and they, being experts in this field, are more likely to have a higher compliance with coding and accessibility standards.
I won't go into the methodology used; full details are included in my first report (Golden Spiders Take #1).
(NB Again you can click on the image for a larger version.)
In category 5 not one single site passed the coding and accessibility tests performed, while category 6 contained two sites with virtually 100% compliance (each had un-encoded ampersands or brackets within text elements).
Category 7 showed some promise. bluecubeinteractive.com/property_news was fully code- and accessibility-compliant (although the page in question is very, very basic), and I have to tip my hat to Ulster Bank – their site was WCAG 1.0 Priority 3 compliant and the two failed tests were nominal at best (1 hour's work and this could be one of the most code-compliant sites in the land).
It is worth stating at this time that the tests I perform are in no way meant to be full audits of these websites' accessibility levels. Such testing requires a more comprehensive array of tests, including subjective evaluation. My methodology is restricted purely to those tests that can be automated and require no subjective input.
The Web Design category short-list contains those design agencies who, as experts in their field, should have the highest knowledge of both coding and accessibility disciplines. Well done to Lightbox.ie and Tower.ie who both achieved a very high level of compliance with the coding and accessibility standards. To be fair, a number of the failed tests in this category were marginal and those pages could be easily repaired.
Unfortunately, I was particularly surprised at the results for Magico.ie and Strata3.com. Both pages analysed displayed an extremely high level of coding errors.
One final notable point was the proliferation of dual-case coding. By this I mean the coding of some tags in lower case and others in capital or upper-case. Generally, although not always, this is a sign of ‘cut-and-paste’ re-use of old code.
Within the second sample of 34 websites there was an improvement in coding and accessibility standards. Five sites fully conformed with the accessibility guidelines. The Ulster Bank site is also worthy of special mention.
Total Sample: 66 sites;
Valid CSS: 18 (27%);
Valid HTML: 8 (12%);
Valid Section 508: 15 (23%);
Valid WCAG 1.0 Priority 1: 13 (20%);
Valid WCAG 1.0 Priority 2: 3 (5%);
Valid WCAG 1.0 Priority 3: 3 (5%);
Valid TV Core: 22 (33%);
Valid TV HTML: 9 (14%);
Valid TV WCAG 1.0 Priority 1: 24 (36%);
Valid WAVE Overlay: 18 (27%);
Sites with consistent mark-up: 33 (49%);
CSS, HTML, Section 508 & WCAG 1.0 Priority 1 COMPLIANT: 5 (8%).
Further updates will appear shortly. Again, any feedback on this is greatly appreciated.
Here's the first batch of results.
Golden Spiders Take #1
The eircom Golden Spiders are widely regarded as the Oscars of the Internet industry and were established exactly this day 10 years ago – to reward excellence in design, functionality, creativity and innovation in Ireland's Internet industry.
After receiving some feedback on this issue, I thought it might be appropriate to conduct a small study into one particular area of web design that is not alluded to directly, but is extremely important – ACCESSIBILITY.
It is generally accepted that objective measures of accessibility are defined relative to global standards as set out by the World Wide Web Consortium and other bodies.
The W3C has responsibility for the Web Accessibility Initiative (WAI) and also sits on the advisory commission for the revision of U.S. Section 508 standards. The WAI maintained Web Content Accessibility Guidelines (WCAG) and U.S. Section 508 are the accepted international standards for Web accessibility measurement.
Alongside these accessibility standards, it is generally accepted that valid mark-up (the code that runs all web pages) is best practice for ensuring standardised delivery across client platforms.
Using the URLs listed at www.goldenspiders.ie I set about constructing a testing mechanism to appraise the accessibility and coding practices of the short-listed websites.
Each site underwent the following tests:
3 separate tools were used to assess the accessibility of each page short-listed for the Golden Spider Awards. This methodology conforms to the best practice as set out by the WAI. No other pages within those sites were tested for this study.
(NB Click on the image for a larger version.)
Of this initial sample of 32 websites, one site came close to fully conforming to the accessibility guidelines. In fact www.ssiaoptions.ie actually conformed to the highest level of WCAG 1.0 – Priorities 1, 2 and 3. Although the site's CSS failed to validate, it is an extremely accessible website.
Total Sample: 32 sites;
Valid CSS: 9 (28%);
Valid HTML: 2 (6%);
Valid Section 508: 4 (13%);
Valid WCAG 1.0 Priority 1: 3 (9%);
Valid WCAG 1.0 Priority 2: 2 (6%);
Valid WCAG 1.0 Priority 3: 2 (6%);
Valid TV Core: 10 (31%);
Valid TV HTML: 1 (3%);
Valid TV WCAG 1.0 Priority 1: 9 (28%);
Valid WAVE Overlay: 7 (22%);
Sites with consistent mark-up: 16 (50%);
CSS, HTML, Section 508 & WCAG 1.0 Priority 1 COMPLIANT: 0 (0%).
[I updated the above overview at 12:30pm Nov 5 to correct an error in the 'No. of sites with consistent mark-up' and also to include the figure for 'Zero visible WAVE Errors'. Second update at 10:45pm Nov 5 to correct some figures and typos in the overview.]
I would be very interested to hear any views you might have on the above. Further results will be posted later today.
Check out this innovative UI built on Yahoo's User Interface Library.
Is This The Future Of Blog Commenting?
A couple of weeks ago I came across this blog and thought the UI was pretty cool.
Now, via Ajaxian, I see that Jack Slocum has integrated a very new and very innovative commenting system:
The interface is built on the Yahoo User Interface Library and allows 'block commenting' of posts – you can literally attach your comment to any block of the main post.
The code is available for download from Jack’s site and works with WordPress blogs.
Very interesting, and it might just change the way blogs are commented on going forward.
Dublin Coastal Development a LOT Slicker Than Funda.ie
Look, I’m really, really sorry Funda, but I hope this isn’t the end product? Come on? After the great viral I know you have to have something else for us?
Well I suppose I’ll have to give them the benefit of the doubt for the moment. Their site obviously wasn’t ready in time (the help page is a bit of a laugh).
Maybe after watching that video I was expecting to get bells and whistles galore and some strong unique site features to attract home buyers.
Well, it's all very basic (especially when compared with funda.nl). I'm a wee bit surprised and somewhat disappointed, to be honest. The Dutch are well known for their JS coders, and those page refreshes onSelect of county/city are ugly. In fairness, the detailed views aren't too bad, but again a bit of spiffing up with some JS and DOM scripting would make the user experience a whole lot better.
Curiously (for me anyway), Funda.ie don't bother with the page title (unless they hope to rank for 'Detail'), and the file-name structure is going to make it extremely hard to get the site indexed. (Come on, I had to get in something about the SEO!)
I did notice that Funda are looking for both a lead and a developer so I’m sure the website will improve soon.
Oh, and I’m still waiting for the big surprise follow-through that gets us all talking about Funda Ireland and not Dublin Coastal Development. Please don’t let me down Funda.
[Edit - Jason Roe has also been blogging about Funda and the Dublin Coastal Development and has conducted one or two experiments to rank for related key phrases.]
Another Distraction
I opened my RSS reader this morning to find that Google had completely renovated the place. And I’m not talking a lick of paint here.
The interface is so impressive relative to what was there previously, and a feature that I always wished I could have is now available – I can now browse by category simply by clicking on a folder:
The list view is great for a quick look at news headings (memo to self – re-read Brian Clarke’s entry about good titles), while the expanded view gives you all the stories in one pane:
In the expanded view items are automatically marked as read when you scroll by them (sweet!). You can elect to turn this feature off in your preferences.
The number of unread items is displayed in aggregate and next to each feed's name (although I noticed that if a feed has a long name then the number is hidden – I hope Google will fix this by adding it to the tooltip). You can also select '[only list updated]' next to your subscriptions to filter out all previously read entries.
Google seem to have added a number of search facilities also under the ‘Browse »’ tab. This should help people discover new content and easily add it to their reading lists.
And then you have the ‘Goodies’ under your settings tab. The ‘Subscribe as you surf’ feature is something I like a lot. I’m now off to find or create a Google Toolbar button to add this feature (it has always annoyed me that the ‘Add To Reader’ feature of the toolbar didn’t give Reader as an option). I also noticed a mobile reader (reader.google.com/m) although I’m not 100% sure if this is new or not. I do wonder whether a reader might be a killer-app on mobile platforms (all those millions of commute man-hours every day)?
I liked the old Reader. I have a preference for browser-based readers and have tried quite a few others. In the past I overlooked the lack of filtered views in Reader but now I have just that.
The slickness of the interface is what really appeals to me though. Google really have got the user experience to a T. Their apps just keep getting better and really are an indicator of the future of on-line services.
Now I really need to get some work done – if it weren’t for all these cool distractions…
Well, the new Gucci site is built on top of these excellent Javascript libraries and you may be interested in taking a look.
Gucci places faith in Scriptaculous over Flash
Well, even if you haven't heard of either of these Javascript libraries there's a high chance you've made use of them somewhere or other in your travels around the interweb. Both libraries are used extensively in 'Web 2.0' applications such as Digg, a couple of 37signals properties, and Fluxiom (the teaser movie for this application was very cool and the application itself has the best web UI I have seen). Scriptaculous has been responsible for adding a new depth of interactivity and content dynamism that had previously been the sole domain of Macromedia's Flash.
A post over on Ajaxian mentioned that Gucci, the world-famous fashion house, had implemented a new website built on the Scriptaculous library. I think this is a huge vote of confidence in Javascript in general, and Scriptaculous and Prototype in particular.
The site itself has an extremely clean look-and-feel and I'm sure many visitors will believe they are in a Flash environment. I did find getting to grips with the navigation a little testing, and from a usability perspective I think this is a bit of a letdown. One feature I liked (but I'm sure it was a necessity) is the system of loading the glossy images on the site on demand (although this system is based on a bastardised src attribute in the mark-up, which I don't like).
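For anyone curious how such on-demand loading works, here's a rough sketch of the general pattern. The lazysrc attribute is my own invention for illustration – non-standard mark-up of exactly the kind I'm complaining about:

<img id="campaign" src="placeholder.gif" alt="Campaign image"
     lazysrc="images/campaign-large.jpg" />
<script type="text/javascript">
    // Once the page has loaded, swap the real image in for the placeholder.
    window.onload = function () {
        var img = document.getElementById('campaign');
        img.src = img.getAttribute('lazysrc');
    };
</script>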
I have used these libraries previously for a database application running on a corporate intranet and the results, from a usability perspective, were really impressive. Of course, on an intranet it was possible to control the client environment. On the Internet you have issues such as cross-browser and cross-platform compatibilities, to name but a few.
If you are interested in other sites that use Javascript to mimic Flash, then the portfolio page of Christof Wagner, built by Thomas Fuchs to showcase the Scriptaculous library, might also be worth a look.
Of course one further issue, which is probably more peculiar to Ireland than many other locations, is the additional overhead these libraries add to page-load (circa 100Kb). If you have seen the Digg homepage load on dial-up you will know what I’m talking about!
Need WebStats? Google Analytics goes free for all!
Previously invite-only, Google Analytics is now open for business to the masses:
I'm happy to tell you that we've just removed the wait to receive a Google Analytics account. Now anyone with a website can instantly create one for free, simply by visiting google.com/analytics or by clicking on the "Analytics" tab within AdWords.
I have been using Google Analytics for quite a while now and have to say I like many of the data-rich features available and the (for the most part) very well designed UI. The ability to share accounts with my clients has also come in very handy.
The one warning I would give people new to Google Analytics is that, being a Javascript-based system, it doesn't give you any information on server errors. Keeping an eye on the server logs from time to time is a wise move to spot any server-related issues.
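For reference, the Analytics tracking snippet of the day is typically pasted just before the closing body tag (substitute your own account ID for the placeholder). It only ever fires on pages that successfully load and execute it, hence the blindness to server errors:

<script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
<script type="text/javascript">
    _uacct = "UA-XXXXXX-X"; // your Google Analytics account ID
    urchinTracker();        // records the pageview
</script>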