I started out as a designer in technology nearly twenty years ago. To say that I have perspective would be an understatement. That said, I’d be remiss not to rebuke John Naughton for his thoughtless article, “Graphic designers are ruining the web.” It’s a highly misinformed screed, full of baseless conclusions.

While there is certainly no shortage of bloated pages online, it would be wrong to blame this on designers. That would be about as accurate as my saying that all developers are socially inept and need to bathe more often. Anyone who understands online media knows the charge is wrong; it betrays a lack of design experience.

Mr. Naughton begins his argument by discussing the simplicity of early web pages, which were heavily constrained in appearance:

In the beginning, webpages were simple pages of text marked up with some tags that would enable a browser to display them correctly.

Naughton goes on to describe frustrated designers who could not relinquish visual display control to the browser and proceeded to fight this battle:

But that meant that the browser, not the designer, controlled how a page would look to the user, and there’s nothing that infuriates designers more than having someone (or something) determine the appearance of their work.

Yes, designers were (and continue to be) frustrated by the lack of control that technological constraints impose. However, these same constraints are what ignite many creative technologists to engineer their way past such limits.

Without these early constraints, front-end engineering would not be the highly specialized skill it is today. We wouldn’t have accessible sites, not to mention CSS and AJAX to make our online experiences more usable and engaging.

So they embarked on a long, vigorous and ultimately successful campaign to exert the same kind of detailed control over the appearance of webpages as they did on their print counterparts – right down to the last pixel.

Naughton also implies that the designer acted alone in the wish to “control the appearance.” By 1996, many businesses in the US, especially in the San Francisco Bay Area, were figuring out how to get online and what to do once they got there.

Once this happened, the commercial web became about branding and experiences. Entire cross-functional teams were bringing brands online, and for very different reasons. Designers, business people, and technologists all pushed for increasing detail and control. To call this a long, vigorous campaign brought on by designers is completely inaccurate and short-sighted.

Naughton moves on to discuss the increase of visual elements, design detail and dynamically rendered sites:

And in order to make this possible, webpages ceased to be static text-objects fetched from a file store; instead, the server assembled each page on the fly, collecting its various graphic and other components from their various locations, and dispatching the whole caboodle in a stream to your browser, which then assembled them for your delectation.

While this is true, his timing and metrics are off for the argument he pursues:

All of which was nice and dandy. But there was a downside: webpages began to put on weight. Over the last decade, the size of web pages (measured in kilobytes) has more than septupled. From 2003 to 2011, the average web page grew from 93.7kB to over 679kB.

Indeed, the size of web pages has increased significantly in the past decade. However, this is not due to dynamically generated content, as Mr. Naughton implies above. I began working on dynamically driven sites in 1996. If you look at the same data, it’s clear that between 1996 and 2003 there was only a nominal increase in page size. Clearly, dynamic sites are not the factor.
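As a quick sanity check on the figures quoted above (93.7kB in 2003, over 679kB in 2011), a trivial calculation confirms the growth multiple Naughton describes:

```python
# Average page sizes from the figures quoted above (kilobytes)
size_2003_kb = 93.7
size_2011_kb = 679.0

growth = size_2011_kb / size_2003_kb
print(f"{growth:.1f}x")  # about 7.2x -- "more than septupled" does check out
```

The arithmetic isn’t in dispute; the attribution of that growth is.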

So what is driving the increase in page size? Royal Pingdom looked at this very problem, though they examined the data for one year, not seven. Based on the top 1,000 sites between November 2010 and November 2011, the results clearly indicate that while images are still the largest contributor, JavaScript is by far the fastest-growing content type.

You can see this for yourself by switching on the “view status” bar in your browser; this will tell you how many discrete items go into making up a page. I’ve just looked at a few representative samples. The BBC News front page had 115 items; the online version of the Daily Mail had a whopping 344 and ITV.com had 116. Direct.gov had 71 while YouTube and Wikipedia, in contrast, came in much slimmer at 26 and 15 respectively.

It’s irrelevant to compare the number of items that constitute a page across disparate, unrelated sites. What Mr. Naughton fails to recognize is that commercial websites are businesses. They serve multiple objectives, have multiple stakeholders, and address entirely different audiences. One cannot, for example, compare the BBC News front page to Direct.gov.uk; they serve extremely different purposes and are aimed at different people.
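For readers curious where such item counts come from, here is a minimal sketch of counting the request-generating tags in a page’s HTML, using Python’s standard-library html.parser. The tag list and sample markup are my own illustrative simplification; real browsers also fetch CSS background images, fonts, XHR resources and more, which is why status-bar counts run higher.

```python
from html.parser import HTMLParser

class ResourceCounter(HTMLParser):
    """Counts tags that typically trigger additional HTTP requests."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "iframe", "script") and attrs.get("src"):
            self.count += 1
        elif tag == "link" and attrs.get("href"):
            self.count += 1

sample = (
    '<html><head><link href="a.css"><script src="b.js"></script></head>'
    '<body><img src="c.png"><img src="d.png"><script>inline()</script></body></html>'
)
counter = ResourceCounter()
counter.feed(sample)
print(counter.count)  # 4 -- the inline script costs no extra request
```

Even this toy version makes the point: the count measures how a page is assembled, not whether its design serves its audience.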

In addition, site design and development can vary radically based on the experience of the team, business objectives, design goals, and so on. Naughton would need to give far more thought to context to pursue this line of thinking.

Whether you view this as a good thing or not depends on where you sit in the digital ecosystem. Aesthetes (and graphic design agencies) drool over the elegance of pages whose look and feel is determined down to the last pixel.

Yet again, this is a sweeping generalization. If Mr. Naughton ever cared to look into the design and development process at online firms, he might see that yes, the look and feel is controlled down to the pixel, but that this practice is agreed upon by a cross-disciplinary team.

This is because, as exhaustive testing has shown, every pixel and its placement can radically alter revenue for some tech firms. Having experienced this firsthand, I absolutely know this to be the case.
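The “exhaustive testing” I mean is large-scale A/B testing. As a sketch of how such a decision gets made (the numbers and the helper function are purely illustrative, not data from any real firm), a two-proportion z-test is a classic way to judge whether a pixel-level change actually moved conversion:

```python
from math import sqrt

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert differently from A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: nudging a button lifts conversion from 2.0% to 2.3%
z = ab_z_score(conv_a=2000, n_a=100000, conv_b=2300, n_b=100000)
print(round(z, 2))  # |z| > 1.96 would be significant at the 5% level
```

At the traffic volumes large sites see, even a fraction-of-a-percent lift clears the significance bar, which is exactly why teams argue over pixels.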

The websites and pages that I like tend to be as underdesigned as they are cognitively loaded. Take for example, the home page of Peter Norvig, who is Google’s director of research. In design terms it would make any graphic designer reach for the sickbag. And yet it’s highly functional, loads in a flash and contains tons of wonderful stuff.

There’s nothing wrong with enjoying utilitarian sites such as Peter Norvig’s link directory. Such simple link repositories are fantastic for the purposes they serve. Just take a look at Craigslist and Reddit. They’re simple, utilitarian and wildly popular.

The point is that Naughton continually fails to recognize the context and objectives of online businesses. Comparing repositories of information, like Norvig’s homepage, to branded experiences is about as absurd as comparing Craigslist to the BBC iPlayer. The business and user goals are entirely different; there is no basis for comparison. I can’t help but wonder why his editors at The Observer didn’t know better, allowing Naughton to indulge such ridiculous appraisals rather than cautioning him.

I’ve done a lot of work in London, and I know the city has a wonderful blend of creative, startup and technology communities. That made it all the more startling to see such poorly researched tech journalism, so antagonistic to the global design community, in a place where design truly matters. It’s destructive, and there’s really no need. Admittedly, I don’t actively seek out my technology news from The Observer, but I had much higher expectations, as I’m a regular reader of all things Guardian Unlimited.

As The Guardian family of publications becomes a must-visit resource on the global technology business, it ought to hold its tech coverage to the same level of fact-checking and topical expertise as its political journalism. Allowing ill-suited pundits such as John Naughton to run roughshod over a discipline they have no business covering is a sign that editorial needs to do a better job of overseeing tech coverage. Everyone failed this time.