SiteMorse ran an automated test of 55 of the leading sites. Of the 55 sites tested, only one (Local E-Government) was error-free; over half (29) had over 100 errors, and 10 had over 500, according to Business2www Limited.
Some sites, by contrast, were impressive - the Equal Opportunities Commission, Local E-Government and the Child Support Agency topped the league table, with the Department of Culture, Media and Sport leading the 'cabinet minister' sites. Compared with some of the 'online specialist' sites, the Office of the E-Envoy also performed well.
Equally, some sites have shown a dramatic improvement since our last comparative league table, published in September/October 2002 - most notably the Inland Revenue, which has significantly reduced its number of errors.
This is the second league table of government and public service sites conducted by B2W. Whilst some sites have shown a significant improvement (and some the reverse), there are clearly major problems remaining in maintaining and ensuring the quality and performance of these large and complex sites.
Business2www focuses exclusively on the comprehensive, cost-effective and efficient testing and monitoring of the functional performance of websites, using its unique automated diagnostic testing system and software, SiteMorse.
Without apology, the primary purpose of these league tables is to assist the marketing of a unique automated testing system, which enables webmasters, IT teams and external website suppliers to ensure that a website is functionally capable of delivering customer access, satisfaction and interest. In most cases, online communication provides both convenience and cost efficiency. However, if the website simply fails to perform effectively, there are inevitable negative impacts on the company, its business, its reputation, its brand and, perhaps most importantly, its market competitiveness.
In addition, websites are not cheap to build or maintain. There remains a belief that websites cannot fail, but this contrasts with the general customer and visitor experience: most websites are prone to errors and other operating problems, which slow site speed, deny access to critical elements of the site, and add to operating costs.
Internal/manual testing of a site, usually with at best a modem and a stand-alone PC, can never practically emulate the many possible user access permutations - even, say, a home versus a corporate user.
There are two likely causes of website problems. The first is that website (procurement) specifications often fail to document minimum standards or targets for functional performance. A suggested example of such standards is set out at the end of this document, offering benchmarks against which website quality can be measured [and with the FREE initial test there can be no reason not to carry this out]. The second is the failure to test sites both prior to 'go live' and following changes or updates.
SiteMorse offers fully automated, complete website diagnostic testing - currently the only automated tool that can test every possible page combination or option, reporting on the problems visitors will experience, by problem type, page and line - allowing very fast and efficient eradication.
Comparative Site Ratings are based on four key functional performance criteria:
Site Errors - problems found by SiteMorse that cause site visitors difficulties. Examples include faulty email addresses, server errors, missing pages/images, faulty page links, DNS settings and bad paths on the site.
Site Warnings - issues that do not cause your site fundamental operational problems, but which represent poor site operating code. They may affect the visual display of your site or slow it down unnecessarily. Warnings are frequently violations of W3C or IETF standards.
Server Response Time - the time, in seconds, taken by the server to respond to an access request; both the average for all pages and the performance of specific pages are reported. This is not the page or item download time.
Calculated Download Speed - the maximum speed at which the server is capable of transmitting data. As with Response Time, this is measured as an average across the site as a whole. Please note that this measure should not be confused with Download Time, which is determined substantially by the size and nature of individual page content; we measure and report that separately.
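To illustrate the distinction between the last two criteria, the sketch below shows one way to measure a server's response time separately from its transmission speed for a single page. This is only a minimal illustration using the Python standard library, not SiteMorse's actual methodology; the URL and function name are hypothetical.

```python
# Illustrative sketch only - NOT SiteMorse's actual method.
# Response time: how long the server takes to start answering a request.
# Calculated download speed: how fast the body is transmitted once it starts.
import time
import urllib.request

def measure_page(url: str) -> dict:
    """Measure response time (seconds) and transfer speed (bytes/sec) for one URL."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        first_response = time.monotonic()   # server has responded to the request
        body = resp.read()                  # now transfer the full page body
    finished = time.monotonic()
    transfer_time = max(finished - first_response, 1e-9)  # guard against division by zero
    return {
        "response_time_s": first_response - start,
        "download_speed_bps": len(body) / transfer_time,
        "size_bytes": len(body),
    }
```

A full crawler would repeat this for every page and report both the site-wide average and the worst-performing pages, as the criteria above describe.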
The full reports are available from email@example.com