High Performance Web Sites
Michael J. Ross writes "Every Internet user's impressions of a Web site are greatly affected by how quickly that site's pages are presented to the user, relative to their expectations — regardless of whether they have a broadband or narrowband connection. Web developers often assume that most page-loading performance problems originate on the back-end, and thus that they have little control over performance on the front-end, i.e., directly in the visitor's browser. But Steve Souders, head of site performance at Yahoo, argues otherwise in his book, High Performance Web Sites: Essential Knowledge for Front-End Engineers." Read on for the rest of Michael's review.
The typical Web developer — particularly one well-versed in database programming — might believe that the bulk of a Web page's response time is consumed in delivering the HTML document from the Web server, and in performing other back-end tasks, such as querying a database for the values presented in the page. But the author quantitatively demonstrates that — at least for what are arguably the top 10 sites — less than 20 percent of the total response time is consumed by downloading the HTML document. Consequently, more than 80 percent of the response time is spent on front-end processing — specifically, downloading all of the components other than the HTML document itself. In turn, cutting that front-end load in half would improve the total response time by more than 40 percent. At first glance, this may seem insignificant, given how few seconds or even deciseconds it takes for the typical Web page to appear using broadband. But any delays, even a fraction of a second, accumulate in reducing the satisfaction of the user. Likewise, improved site performance not only benefits the site visitor, in terms of faster page loading, but also the site owner, with reduced bandwidth costs and happier site visitors.
High Performance Web Sites
author | Steve Souders |
pages | 168 |
publisher | O'Reilly Media |
rating | 9/10 |
reviewer | Michael J. Ross |
ISBN | 0596529309 |
summary | 14 rules for faster Web pages |
Creators and maintainers of Web sites of all sizes should thus take a strong interest in the advice provided by "Chief Performance Yahoo!," in the 14 rules for improving Web site performance that he has learned in the trenches. High Performance Web Sites was published on 11 September 2007, by O'Reilly Media, under the ISBNs 0596529309 and 978-0596529307. As with all of their other titles, the publisher provides a page for the book, where visitors can purchase or register a copy of the book, or read online versions of its table of contents, index, and a sample chapter, "Rule 4: Gzip Components" (Chapter 4), as a PDF file. In addition, visitors can read or contribute reviews of the book, as well as errata — of which there are none, as of this writing. O'Reilly's site also hosts a video titled "High Performance Web Sites: 14 Rules for Faster Pages," in which the author talks about his site performance best practices.
The bulk of the book's information is contained in 14 chapters, each corresponding to one of the performance rules. Preceding this material are two chapters on the importance of front-end performance and an overview of HTTP; together these form a well-chosen springboard for launching into the performance rules. In a final chapter, "Deconstructing 10 Top Sites," the author analyzes the performance of 10 major Web sites, including his own, Yahoo, to provide real-world examples of how implementing his performance rules could make a dramatic difference in those sites' response times. These test results and his analysis are preceded by a discussion of page weight, response times, YSlow grading, and details on how he performed the testing. Naturally, by the time a reader visits those sites and checks their performance, their owners may have fixed most if not all of the problems Steve Souders points out. If they have not, they have no excuse, if only because of this book's publication.
Each chapter begins with a brief introduction to the particular performance problem addressed by that chapter's rule. Subsequent sections provide more technical detail, including the extent of the problem found on the previously mentioned 10 top Web sites. The author then explains how the rule in question solves the problem, with test results to back up the claims. For some of the rules, alternative solutions are presented, along with the pros and cons of implementing his suggestions. For instance, in his coverage of JavaScript minification, he examines the potential downsides to the practice, including increased code maintenance costs. Every chapter ends with a restatement of the rule.
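To make the minification idea concrete, here is a deliberately naive sketch in JavaScript. Real minifiers of the era, such as Douglas Crockford's JSMin, are far more careful; this toy version only strips comments and surrounding whitespace, would mangle code containing strings that look like comments, and every name in it is invented for illustration:

```javascript
// A toy "minifier": ship the same logic in fewer bytes by stripping
// comments and blank whitespace. Not safe for production use.
function naiveMinify(source) {
  return source
    .replace(/\/\*[\s\S]*?\*\//g, '') // drop block comments
    .replace(/\/\/[^\n]*/g, '')       // drop line comments
    .split('\n')
    .map((line) => line.trim())       // drop indentation
    .filter((line) => line.length > 0)
    .join('\n');
}

const code = `
// compute the total price
function total(items) {
  /* sum item prices */
  return items.reduce((sum, i) => sum + i.price, 0);
}
`;

console.log(naiveMinify(code).length < code.length); // smaller payload
```

The maintenance cost the author mentions comes from the gap this opens between the readable source you edit and the compacted file you serve.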
The book is a quick read compared to most technical books, and not just due to its relatively small size (168 pages), but also the writing style. Admittedly, this may be partly the result of O'Reilly's in-house and perhaps outsource editors — oftentimes the unsung heroes of publishing enterprises. This book is also valuable in that it offers the candid perspective of a Web performance expert, who never loses sight of the importance of the end-user experience. (My favorite phrase in the book, on page 38, is: "...the HTML page is the progress indicator.")
The ease of implementing the rules varies greatly. Most developers would have no difficulty putting into practice the admonition to make CSS and JavaScript files external, but would likely find it far more challenging, for instance, to use a content delivery network, if their budget puts it out of reach. In fact, differences in difficulty levels will be most apparent to the reader when he or she finishes Chapter 1 (on making fewer HTTP requests, which is straightforward) and begins reading Chapter 2 (content delivery networks).
In the book's final chapter, Steve Souders critiques the top 10 sites used as examples throughout the book, evaluating their performance and, specifically, how they could improve it by implementing his 14 rules. In critiquing the Web site of his employer, he apparently pulls no punches — though few are needed, because the site ranks high in performance versus the others, as does Google. Such objectivity is appreciated.
For Web developers who would like to test the performance of the Web sites for which they are responsible, the author mentions in his final chapter the five primary tools that he used for evaluating the top 10 Web sites for the book, and, presumably, used for the work that he and his team do at Yahoo. These include YSlow, a tool that he created himself. Also, in Chapter 5, he briefly mentions another of his tools, sleep.cgi, a freely available Perl script that tests how delayed components affect Web pages.
As with any book, this one is not perfect. In Chapter 1, the author could make the distinction between function and file modularization clearer; otherwise his discussion could confuse inexperienced programmers. In Chapter 10, he explores the gains to be made from minifying JavaScript code, but does not do the same for HTML files, or even explain the absence of this coverage, though he does briefly discuss minifying CSS. Lastly, the restatement of each rule at the end of its chapter could be dropped, if only in keeping with the spirit of improving performance and efficiency by reducing reader workload.
Yet these weaknesses are inconsequential and easily fixable. The author's core ideas are clearly explained; the performance improvements are demonstrated; the book's production is excellent. High Performance Web Sites is highly recommended to all Web developers seriously interested in improving their site visitors' experiences.
Michael J. Ross is a Web developer, freelance writer, and the editor of PristinePlanet.com's free newsletter.
You can purchase High Performance Web Sites from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Is it just me... (Score:2)
Re: (Score:3, Insightful)
What's suspicious about the fact that a book written by the creator of YSlow addresses the very issues that YSlow, a free open source Firefox extension, addresses? It would be pretty strange if it didn't.
If you want to be so paranoid about the intentions of an author, at least find one it's reasonable to be suspicious about in the first place.
YSlow (Score:1)
Seeing that this is about YSlow, I just thought I'd mention that it isn't much of an extension. It'll make a few notes about your web pages (things you should probably already know if you created them), then give you links to Yahoo's website for suggestions on how to fix them.
So, in the spirit of cutting out the middleman, here's all the information you'd get about speeding up your web site without having to install YSlow: developer.yahoo.com... [yahoo.com]
Re: (Score:2)
Incidentally, YSlow gives Slashdot a C.
Rule #34: Don't be the first Java site of the day (Score:4, Funny)
Rule #34: Don't be the first Java site your users visit during the day. (Unfortunately, this pretty much turned into "don't use Java applets," unless you could find a hidden way to load a throwaway applet in another frame, etc.)
All my sites load fast (Score:2, Interesting)
All my pages are static HTML. Not a web application in sight, not even PHP. Yes, it's a drag when I need to do some kind of sitewide update, like adding a navigation item.
I also have less to worry about with security: as long as my hosting service keeps their patches up to date, I know I haven't introduced any holes myself.
Also, for the most part, my pages are very light on graphics, with most of the graphics present being repeated on every page such as my site's logo, which gets cached.
Finally, all
Re:All my sites load fast (Score:5, Funny)
You forgot to link to your site... [amish.org]
It's easy to find parking space for my car (Score:5, Funny)
It's a bicycle!!1
Solution (Score:5, Interesting)
This is a great point, but here is my anecdotal experience:
Years ago, I tested static HTML vs. PHP by benchmarking a simple document (I used the GPL license). On that particular box, I was able to serve over 400 pages per second with static HTML but only about 12 pages per second with PHP. I was blown away. I went one step further and used PHP to fetch the data from Oracle (OCI8, IIRC), and that went down to 3 requests/sec. You can see that caching does help, but not a whole lot.
So, rather than whine about it, what is the solution?
AJAX, done properly, will solve the problem. Basically, instead of serving dynamic pages with PHP, JSP, ASP or whatever... just serve an AJAX client (which is served in a speedy manner with no server side processing to bog things down). This client loads in the browser and fetches a static XML document from the server and then uses the viewer's browser to generate the page - so everything thrown down by the server is static and all processing is done on the client side.
Now, to facilitate a dynamic website (e.g., a message board, journal, or whatever), you have to generate the XML file upon insert (inserts are generally a small fraction of the read load), using a trigger or code embedded in the application.
Voilà! Static performance with dynamic content, using browser-side processing.
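A minimal sketch of what this commenter proposes, with the page assembled on the client from a pre-generated data document. All names and the data shape are invented, and JSON stands in for the XML the comment describes:

```javascript
// The server ships only static files; the browser turns a static,
// pre-generated data document into markup.
function renderPosts(posts) {
  return posts
    .map((p) => `<article><h2>${p.title}</h2><p>${p.body}</p></article>`)
    .join('\n');
}

// In a browser this data would arrive via XMLHttpRequest from a
// static file regenerated on each insert; it is inlined here so the
// sketch runs anywhere.
const posts = [
  { title: 'First post', body: 'Hello.' },
  { title: 'Second post', body: 'World.' },
];

console.log(renderPosts(posts));
```

Keeping the rendering step a pure function of the data is what lets everything the server sends stay static and cacheable.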
This is unnecessary - and dangerous (Score:3, Insightful)
Using AJAX you ha
Re:Solution (Score:5, Informative)
I find that sites built with the method you describe are the asshole sites that fuck with browser history, disable the back button, try to disable the context menu, and those dumb ass tricks to get around the fact they don't know how to write proper server side code.
There's no reason you can't make a fast server-side site (with ajax too, one that works without the stupid tricks I described above). If you can't, I suggest you educate yourself, or don't use a Walmart PC for production use.
I've personally written many J2EE webapps (no EJB BS; spring & struts & jsp/velocity) that were very fast. With proper coding you can let the browser cache stuff so it doesn't constantly have to refetch crap; when you do this, all you push down to the client is the HTML to render, which browsers are really good at doing quickly.
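The browser caching mentioned here comes down to sending the right response headers on components that rarely change. A small illustrative sketch in JavaScript (Node.js); the one-year lifetime is an arbitrary choice for the example, not a recommendation from the book:

```javascript
// Far-future caching headers for components that rarely change
// (images, scripts, stylesheets), so the browser can reuse its cached
// copy instead of refetching on every page view.
function cacheHeaders(maxAgeSeconds) {
  const expires = new Date(Date.now() + maxAgeSeconds * 1000);
  return {
    'Cache-Control': `public, max-age=${maxAgeSeconds}`,
    Expires: expires.toUTCString(),
  };
}

const headers = cacheHeaders(60 * 60 * 24 * 365); // roughly one year
console.log(headers['Cache-Control']);
```

The usual companion technique is to put a version number in the component's filename, so an updated file gets a fresh URL and the long cache lifetime never serves stale content.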
Re: (Score:1)
http://highscalability.com/ [highscalability.com]
http://www.allthingsdistributed.com/ [allthingsdistributed.com]
Re: (Score:3, Insightful)
sounds good, except you may or may not know that a lot of javascript implementations are sloooow. not to mention you usually have to set the no cache headers for everything in the page so your javascript works right.
I find that sites built with the method you describe are the asshole sites that fuck with browser history, disable the back button, try to disable the con
Re: (Score:2)
I've used ajax type techniques in many projects. I know what it does well, and what it doesn't.
I'm assuming that you're talking about sites where the user never submits data. Using no server-side validation on user input is completely retarded, just ask slashdot what happened when they didn't scrub the input from their users for j
Re: (Score:2)
12 page/sec eh? You didn't put a busy wait in there did you? I've never seen per
Accessibility and search indexing (Score:3, Insightful)
Re:Solution - Not (Score:1)
This is silly.
If you have to "generate the XML file" every time the data changes, why not just write an x/html file and serve it?
Even better, why not cache the x/html file instead of generating it all the time.
Re: (Score:1)
You generate XML for each piece of content. Multiple templates can then serve the dynamic content while the templates themselves are static. You only need to generate the content once, regardless of how many templates it appears in.
That said, I still think generating the entire pages is the way to go. Fewer requests should give better performance. Not to mention the other issues (search engine friendly (or other non-JS aware), proper back/forward support, less work for the browser m
precompile your HTML (Score:3, Informative)
PHP and database are not *that* slow (Score:2)
If you want really fast and scalable, AOLserver still blows PHP away today (like it did at the time you were testing), despite PHP's (and Apache's!) improvements.
AJAX is a terrible solution in my opinion. First of all, it doesn't free you from server-side processing; all you are doing is caching what would otherwise have come from the
Re: (Score:2)
PHP by default can be slow, because it has to be parsed, tokenized and then execut
Re: (Score:2)
I'm not sure you intended to say it does, but that doesn't really solve any of the problems you list for AJAX. It might solve the cl
Re: (Score:2)
It's possible (and easy) to write PHP scripts that do 50 or more selects, a few inserts and updates, tons of string manipulation and still load in about a second over a decent connection. Besides, when every page you generate is dynamic, and the content changes every second, the only things cache-able are images, scripts and style-sheets. So you generate the HTML on the fly,
Re:All my sites load fast (Score:5, Funny)
In other words, it's a smalltime hobby site, and you're not a web developer. That's fine, and I agree that it's quite nice and reassuring to simplify like this where possible. However...
Go on out into the job market advertising your incredible "static page" skills, and what lightning fast load times you'll bring to your employer. Offer to convert their entire 20GB of online content to static XHTML 1.0 Strict to obtain the peace of mind that comes with knowing you haven't introduced any holes yourself. Hell, I'm going to go right now and submit a patch to MediaWiki that generates static versions of every article and then deletes all the PHP from the entire web root! I'm sure as soon as I tell them about the performance boost, they'll be right on board!
Re: (Score:2)
I think the grandparent thinks of it as "not complicating" rather than as "simplifying" ...
I also find it slightly amusing that you can tell from this that he's not a web developer ;-)
Actually my sites are commercial (Score:2)
Re: (Score:2, Insightful)
Also, don't the ads call some sort of script? I wouldn't call that static.
Re:All my sites load fast (Score:4, Insightful)
Umm... there are plenty of content management systems (say, Cascade [hannonhill.com]) that manage content and publish it out to HTML. Even Dreamweaver's templating system will do this. Just because you use pure HTML doesn't mean you have to lose out on sitewide management control.
They added a new requirement to the validator! (Score:1)
I've been adding it to my pages as I work on them, but I haven't worked on that page for a while.
And yeah, such a requirement for sitewide updates is the best argument against my method.
Re: (Score:2)
You should use something like PHP. Then you could have an includable header... oh wait, nevermind.
gzip content can display while it is still loading (Score:2, Interesting)
Firefox 2 and IE7 do indeed begin to display gzip-compressed pages while they are still loading over the net. The method used to verify this was to insert a local Squid cache that uses "delay pools" to limit transfer bandwidth and that records time and duration of all network transfers made. Using this method, I could see that a lengthy compressed HTML page was transferred in 14 seconds, and the content became visible in IE7 after 6 seconds and finished loading after 18 seconds.
If you have a physically
The book would be a lot more believable... (Score:5, Insightful)
Re: (Score:2)
It's because in Soviet Yahoostan "minifying" your pages embiggens YOU!
Doh! Test yer pages! (Score:4, Insightful)
Then every added feature has to be justified -- perceived added value versus cost-to-load. Sure, the artsies won't like you. But it isn't your decision or theirs. Management must decide.
For greater sophistication, you can measure your download rates by file to see how much is in users' caches, and decide whether or not these are also a cause of slowness!
Re: (Score:3, Interesting)
It's not an excuse, it's just that teams are so fluid, project work is chaotic and project management is driven by marketing considerations (read: "get it out", not "enterprise stability") that site performance is seen as a server hardware issue.
Shame r
Where is the rule "Avoid Ad-Networks"? (Score:5, Interesting)
I guess I am not alone in noticing that the ads on a page often drag the load time way down. I find it interesting that there is no rule about minimizing content dragged in from other servers you have little or no control over. A blind spot because of Yahoo's business, I guess.
Re: (Score:2)
This is especially annoying on sites where the ads are apparently forced to load before things like the text (i.e., the content I am actually looking for) render. Anandtech used to really piss me off in this respect - the ad server would take forever, and there was nothing to read until the ads loaded (haven't noticed this behavior lately).
I suppose I might be able to block the ads, but it is my feeling that as long
Re: (Score:1)
On the nose, Josef.
How much time have we all spent looking at a blank browser window with "...completed 12 of 13 items." at the bottom?
Whatever else [spacetoast.net] I might think of it, Facebook has a nice trick that appears to work as follows: the page loads with a blank graphic where the ad should be; afterward, an onLoad script fires, requesting the ad and replacing the blank graphic with it. The ads take a moment to load, but the page is usable instantly. Proper priorities.
(As a corollary, I've got a Dice ad at the to
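The deferred-ad trick described above can be sketched as follows. A tiny fake "element" object stands in for a DOM node so the logic runs outside a browser; every name here is invented for illustration:

```javascript
// Render a placeholder immediately, then swap in the real ad after the
// page's load event, so ad latency never blocks the content.
function deferAd(slot, fetchAdUrl) {
  slot.src = 'blank.gif';      // instant placeholder: page is usable now
  slot.onPageLoad = () => {    // in a browser: a window.onload handler
    slot.src = fetchAdUrl();   // ad arrives after the content renders
  };
}

const slot = { src: null, onPageLoad: null };
deferAd(slot, () => 'https://ads.example.com/banner.gif');

console.log(slot.src); // placeholder shown first
slot.onPageLoad();     // simulate the load event firing
console.log(slot.src); // now the real ad
```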
Re: (Score:2)
This is covered by rule 9 (reduce DNS lookups) and rule 1 (make fewer HTTP requests).
You shouldn't confuse what's in a book review with what's in the book itself...
The book is about speed, not performance. (Score:4, Interesting)
"Performance" is not a general-purpose synonym for "speed." "Performance" is a much more general term; it can refer to memory utilization, fault tolerance, uptime, accuracy, low error rate, user productivity, user satisfaction, throughput, and many other things. A lot of people like to say "performance" just because it's a longer word and it makes them sound smart. But this habit just makes them sound fake -- and more importantly, it encourages people to ignore all the other factors that make up the bigger picture. This book is all about speed, and the title should reflect that.
So, I beg you: resist the pull of unnecessary jargon. The next time you are about to call something "performance," stop and think; if there's a simpler or more precise word for what you really mean, use it.
Thanks for listening!
Re: (Score:1)
I see where you are coming from, but I don't like this quote. Performance was a decent term to use, since it covers a lot of ground. A fast website "performs" well for the user. All your examples are factors of a well-performing website:
memory utilization [browser uses less memory]
low error rate [user doesn't make so many mistakes, doesn't misclick something due to lag, doesn't forget what they
Re: (Score:1)
It isn't a book about making users more productive, or a book about reducing user error. This is a book about making websites fast. The other factors are only peripheral effects. Those 14 rules that Souders is pushing are all about speed, not these other factors.
Making a website fast may improve many things about the experience. But speed is not the only thing you need to make a website perform well.
The reviewer makes the same m
Odd Summary (Score:4, Insightful)
Let's correct this summary a little bit. First, it's NOVICE Web developers who would think this. Any web developer worth their salt knows the basic idea that Java, Flash, and other things like them make a PC work hard. The website sends code, but the PC has to execute that code, rather than the website pushing static or dynamic HTML and having it simply render. We bitch and moan enough here on Slashdot about Flash/Java-heavy pages; I feel this summary is misdirected, as if web developers here didn't know this.
Secondly, there's no argument, so Steve doesn't have to argue with anyone. It's a commonly accepted principle; if someone hasn't learned it yet, they simply haven't learned it yet.
Now, I welcome a book like this because #1 it's a great tool for novices to understand the principle of optimization on both the server and the PC, and #2 because it hopefully has tips that even the above average admin will learn from. But I scratch my head when the summary makes it sound like it's a new concept.
Pardon me for nitpicking.
Advertising slows everything down (Score:3, Insightful)
Putting an ad blocker of some sort, such as Mozilla Adblock Plus, on your browser is a great way to speed up any page (from the user's point of view, of course).
Third-party content. (Score:3, Insightful)
Basically anything that's not under your control can slow your site down significantly.
wish it did focus on the backend (Score:2)
Web development is especially bad at optimization. This thread demonstrates the problem:
http://forums.devnetwork.net/viewtopic.php?t=74613 [devnetwork.net]
People there are actually recommending you wait until your server fails before you look to optimize.
Re: (Score:2)
The problem is poor requirement specifications. I worked for so many companies that had all the pretty UML architecture and usecases down, but no requirement specifications.
Uptime requirements, response time, security requirements (so all around QOS), maintenance requirements, support, ANYTHING would help, but they don't. So when the application (web or otherwise) is slow, the developers don't know if its "good enou
Head of site performance at Yahoo, huh? (Score:2, Funny)
http://validator.w3.org/check?uri=http%3A%2F%2Fwww.yahoo.com%2F&charset=(detect+automatically)&doctype=Inline&group=0 [w3.org]
Double WTF:
http://validator.w3.org/check?uri=http%3A%2F%2Fwww.slashdot.org&charset=(detect+automatically)&doctype=Inline&group=0 [w3.org]
Web 2.0 performance costs (Score:5, Insightful)
Re: (Score:2)
Now, I'm on a "modern" PC (1GB RAM, 1.8Ghz CPU
Re: (Score:2)
"Variable and path" in HTML? What are you talking about?
Anyway, Yahoo's site started off relatively small in '95 or so, as most sites were then. But as I remember it, they were one of the first to unleash those bloated late-nineties "portal" sites, complete with stock ticker, 14-day forecast, and the latest celebrity gossip.
I hardly think they were terribly concerned about fast
Re: (Score:2)
Swap "Filename and path" for "Variable and path" and you will glean enlightenment.
I guess he's not used the new Yahoo Mail interface (Score:4, Interesting)
Re:I guess he's not used the new Yahoo Mail interf (Score:1)
When I switch back, I always make a comment about the performance, but haven't seen them do anything to fix it. The new version of My Yahoo! seems to have the same issues.
Still the same with web masters. (Score:3, Insightful)
Those Web designers should be called "Unemployed"
Outsource everything (Score:2)
So not only do they now outsource the web page designers, they are outsourcing the technical writers?
What's next? Outsource the audience?
ISBN redundancy (Score:3, Informative)
There's no need to list both the ISBN-10 and the ISBN-13; ISBN-13 is a superset of ISBN-10. Notice that both numbers contain the exact same nine data digits:
0596529309
9780596529307
The only difference is that the 978 "bookland" prefix has been prepended, and the check digit has been recalculated (using the EAN/UPC algorithm instead of ISBN-10's old one). You can give just the ISBN-10 or just the ISBN-13; one can be trivially calculated from the other, and any software that deals with ISBNs should do this for you. E.g., if you search for either number on Amazon, you'll end up at the exact same page.
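The conversion described here is mechanical enough to sketch in a few lines of JavaScript:

```javascript
// Convert an ISBN-10 to its ISBN-13 form: keep the nine data digits,
// prepend the "978" bookland prefix, and recompute the check digit
// with the EAN-13 algorithm (alternating weights of 1 and 3).
function isbn10to13(isbn10) {
  const digits = ('978' + isbn10.slice(0, 9)).split('').map(Number);
  const sum = digits.reduce((acc, d, i) => acc + d * (i % 2 === 0 ? 1 : 3), 0);
  const check = (10 - (sum % 10)) % 10;
  return digits.join('') + check;
}

console.log(isbn10to13('0596529309')); // → 9780596529307
```

Running it on this book's ISBN-10 reproduces the ISBN-13 given in the review, confirming the two numbers are redundant.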
My advice to speed up your website (Score:1)
It's designed from the ground up as an HTTP accelerator. It's extremely fast, in most cases way faster than Squid. However, if you rely a lot on cookies you should look somewhere else.
Ad-Networks (Score:1)
I have lost countless hours of my life waiting for pages to render while they suck down banner ads from overloaded delivery networks (e.g. Falkag).
Have read, mixed feelings (Score:4, Insightful)
I have looked at the book again now, and there seem to have been some changes. For example, there were only 13 rules when I was reviewing them before; now there are 14. As one example, ETags were originally advised against entirely (IIRC, my biggest WTF about the book: if used correctly, ETags are marvellous things and complement 'expires' very nicely), instead of the current 'only use if done correctly'. Some other things are nigh impossible to do correctly cross-browser (think the ETag + GZIP combo in IE6, AJAX caching in IE7, etc.). To be honest, I found pretty much all of this stuff to be Web Development 101. If you're not at the level where you can figure most of these things out for yourself, you probably won't be able to put them into practice anyway, and you should not be in a place where you are responsible for these things.
I might pick up this book just to read it again, see about the changes and read the full chapters, just to hear the 'other side of the story', but IMHO this book isn't worth it. In all honesty, the only thing I got out of it so far that I didn't know is the performance toll CSS expressions take (all expressions are literally re-evaluated at every mouse move), but I hardly used those anyways (only to fix IE6 bugs), and in response have written a jQuery plugin that does the required work at only the wanted times (and I've told you this now, so no need to buy the book).
My conclusion, based solely on the fairly large number of excerpts I've read, is: if you're a beginner, hold off on this book for a while. If you're past the beginner stage but your pages are strangely sluggish, this book is for you. If you've been around, you already know all this stuff.
Flash (Score:2)
Some think Flash is essential to the Web browsing experience, and that a site without Flash is not worth the bother.
Others think that a site with Flash is sure evidence of a triumph of style over content, and a guarantee that it's not worth waiting for it to load.
Since Adobe chose not to support FreeBSD, it's fairly clear that FreeBSD users all fall into the second category. You will have to do other analyses yourself.
Errata for sample chapter: gzip vs. deflate (Score:2)
I started reading the first chapter and I was surprised when I read the following paragraph:
Re: (Score:2)
Also in the sample chapter, "Table 4-2. Compression sizes using gzip and deflate" shows that gzip performs better than deflate:
Unfortunately, this table does not specify which compression settings were used for each method. Although both mod_gzip and mod_deflate should default to compr