
High Performance Web Sites

samzenpus posted more than 7 years ago | from the heavy-duty-net dept.

Book Reviews 132

Michael J. Ross writes "Every Internet user's impression of a Web site is greatly affected by how quickly that site's pages are presented to the user, relative to their expectations — regardless of whether they have a broadband or narrowband connection. Web developers often assume that most page-loading performance problems originate on the back-end, and thus the developers have little control over performance on the front-end, i.e., directly in the visitor's browser. But Steve Souders, head of site performance at Yahoo, argues otherwise in his book, High Performance Web Sites: Essential Knowledge for Frontend Engineers." Read on for the rest of Michael's review.

The typical Web developer — particularly one well-versed in database programming — might believe that the bulk of a Web page's response time is consumed in delivering the HTML document from the Web server, and in performing other back-end tasks, such as querying a database for the values presented in the page. But the author quantitatively demonstrates that — at least for what are arguably the top 10 sites — less than 20 percent of the total response time is consumed by downloading the HTML document. Consequently, more than 80 percent of the response time is spent on front-end processing — specifically, downloading all of the components other than the HTML document itself. In turn, cutting that front-end load in half would improve the total response time by more than 40 percent. At first glance, this may seem insignificant, given how few seconds or even deciseconds it takes for the typical Web page to appear over broadband. But delays, even of a fraction of a second, accumulate and erode the user's satisfaction. Likewise, improved site performance benefits not only the site visitor, in terms of faster page loading, but also the site owner, through reduced bandwidth costs and happier visitors.

Creators and maintainers of Web sites of all sizes should thus take a strong interest in the 14 rules for improving Web site performance that the "Chief Performance Yahoo!" has learned in the trenches. High Performance Web Sites was published on 11 September 2007, by O'Reilly Media, under the ISBNs 0596529309 and 978-0596529307. As with all of its other titles, O'Reilly provides a page for the book, where visitors can purchase or register a copy, or read online versions of its table of contents, index, and a sample chapter, "Rule 4: Gzip Components" (Chapter 4), as a PDF file. In addition, visitors can read or contribute reviews of the book, as well as errata — of which there are none, as of this writing. O'Reilly's site also hosts a video titled "High Performance Web Sites: 14 Rules for Faster Pages," in which the author talks about his site performance best practices.

The bulk of the book's information is contained in 14 chapters, each corresponding to one of the performance rules. Preceding this material are two chapters on the importance of front-end performance and an overview of HTTP; together they form a well-chosen springboard for launching into the performance rules. In a final chapter, "Deconstructing 10 Top Sites," the author analyzes the performance of 10 major Web sites, including his employer's, Yahoo, to provide real-world examples of how implementing his performance rules could make a dramatic difference in those sites' response times. These test results and his analysis are preceded by a discussion of page weight, response times, and YSlow grading, plus details on how he performed the testing. Naturally, by the time a reader peruses those sites and checks their performance, their owners may have fixed most if not all of the performance problems pointed out by Steve Souders. If they have not, then they have no excuse, if only because of the publication of this book.

Each chapter begins with a brief introduction to whatever particular performance problem is addressed by that chapter's rule. Subsequent sections provide more technical detail, including the extent of the problem found on the previously mentioned 10 top Web sites. The author then explains how the rule in question solves the problem, with test results to back up the claims. For some of the rules, alternative solutions are presented, as well as the pros and cons of implementing his suggestions. For instance, in his coverage of JavaScript minification, he examines the potential downsides to this practice, including increased code maintenance costs. Every chapter ends with a restatement of the rule.
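To make that minification trade-off concrete, here is an invented illustration (the function is hypothetical, not an example from the book): minification strips comments and whitespace, and aggressive minifiers also shorten local names, which is exactly why the minified copy is harder to maintain and is normally generated from a readable master copy rather than edited by hand.

    // Readable source, as a developer would maintain it:
    function addTax(subtotal, taxRate) {
        // taxRate is a fraction, e.g. 0.08 for 8%
        return subtotal + subtotal * taxRate;
    }

    // The same function after minification -- smaller to download,
    // but no longer pleasant to debug or edit:
    function addTax(a,b){return a+a*b}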

The book is a quick read compared to most technical books, and not just because of its relatively small size (168 pages), but also because of the writing style. Admittedly, this may be partly the result of O'Reilly's in-house and perhaps outsourced editors — oftentimes the unsung heroes of publishing enterprises. The book is also valuable in that it offers the candid perspective of a Web performance expert who never loses sight of the importance of the end-user experience. (My favorite phrase in the book, on page 38, is: "...the HTML page is the progress indicator.")

The ease of implementing the rules varies greatly. Most developers would have no difficulty putting into practice the admonition to make CSS and JavaScript files external, but would likely find it far more challenging, for instance, to use a content delivery network, particularly if their budget puts one out of reach. In fact, the differences in difficulty will be most apparent to readers when they finish Chapter 1 (on making fewer HTTP requests, which is straightforward) and begin reading Chapter 2 (content delivery networks).

In the book's final chapter, Steve Souders critiques the top 10 sites used as examples throughout the book, evaluating their performance and, specifically, how they could improve it by implementing his 14 rules. In critiquing the Web site of his employer, he apparently pulls no punches — though few are needed, because the site ranks high in performance versus the others, as does Google. Such objectivity is appreciated.

For Web developers who would like to test the performance of the Web sites for which they are responsible, the author mentions in his final chapter the five primary tools that he used for evaluating the top 10 Web sites for the book, and, presumably, used for the work that he and his team do at Yahoo. These include YSlow, a tool that he created himself. Also, in Chapter 5, he briefly mentions another of his tools, sleep.cgi, a freely available Perl script that tests how delayed components affect Web pages.

As with any book, this one is not perfect — nor is any work. In Chapter 1, the author could make clearer the distinction between function and file modularization, as otherwise his discussion could confuse inexperienced programmers. In Chapter 10, the author explores the gains to be made from minifying JavaScript code, but fails to do the same for HTML files, or even to explain the absence of this coverage — though he does briefly discuss minifying CSS. Lastly, the redundant restatement of the rules at the end of every chapter could be eliminated — if only in keeping with the spirit of improving performance and efficiency by reducing reader workload.

Yet these weaknesses are inconsequential and easily fixable. The author's core ideas are clearly explained; the performance improvements are demonstrated; the book's production is excellent. High Performance Web Sites is highly recommended to all Web developers seriously interested in improving their site visitors' experiences.

Michael J. Ross is a Web developer, freelance writer, and the editor of PristinePlanet.com's free newsletter.

You can purchase High Performance Web Sites from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.


Is it just me... (1)

djones101 (1021277) | more than 7 years ago | (#20930295)

Or does this sound suspiciously like an advertisement for YSlow in book form? Not only is YSlow specifically mentioned as part of the book review, but every single item covered is one of those little checks YSlow completes.

Re:Is it just me... (2, Insightful)

NickFitz (5849) | more than 7 years ago | (#20930441)

does this sound suspiciously like an advertisement for YSlow in book form?

What's suspicious about the fact that a book written by the creator of YSlow addresses the very issues that YSlow, a free open source Firefox extension, addresses? It would be pretty strange if it didn't.

If you want to be so paranoid about the intentions of an author, at least find one it's reasonable to be suspicious about in the first place.

YSlow (1)

jessiej (1019654) | more than 7 years ago | (#20930861)

Seeing that this is about YSlow, I just thought I'd mention that it isn't much of an extension. It'll make a few notes about your web pages (things you should probably already know if you created them), then give you links to Yahoo's website for suggestions on how to fix them.

So, in the spirit of cutting out the middleman, here's all the information you'd get about speeding up your web site without having to install YSlow: developer.yahoo.com... [yahoo.com]

Re:YSlow too inaccurate (0)

Anonymous Coward | more than 7 years ago | (#20931237)

Frankly I find YSlow to be nearly useless because it can be totally inaccurate. For example, on our site it gives us an F for not using a CDN - in fact we use Akamai, and our home page elements are served to the end-user from the Akamai CDN roughly 98% of the time. It also gives us an F for not gzipping content - in fact when we switched from apache-windows to apache-linux in July, gzipping was on by default, and our outgoing bandwidth from the web servers dropped by 45%.

Rule #34: Don't be the first Java site of the day (3, Funny)

xxxJonBoyxxx (565205) | more than 7 years ago | (#20930297)

Learned the hard way:

Rule #34: Don't be the first Java site your users visit during the day. (Unfortunately, this pretty much turned into "don't use Java applets" unless you could find a hidden way to load a throwaway applet in another frame, etc.)

Re:Rule #34: Don't be the first Java site of the d (1, Insightful)

Anonymous Coward | more than 7 years ago | (#20930429)

Everyone knows that Rule 34 on the internet is "If it exists, there is porn of it".

Re:Rule #34: Don't be the first Java site of the d (3, Funny)

Anonymous Coward | more than 7 years ago | (#20930805)

Of course. Grandparent started talking about Java and Rule 34, and I was preparing to avert my eyes.

Re:Rule #34: Don't be the first Java site of the d (4, Interesting)

Cyberskin (1171659) | more than 7 years ago | (#20930739)

Rule #34 is all about how slow loading the java plugin is for any browser. It's always been slow; it was supposed to improve w/6 and really it's still slow. The main problem is that NOTHING shows up until the plugin gets loaded. My solution was two-fold: write the object in javascript (which conveniently allows the rest of the html in the page to load and display, but also eliminates the IE problem of "click on this" to activate the applet) and create an animated gif loading screen div which I block when the applet div finishes loading. (I ended up loading the applet at the bottom of the page below the watermark because otherwise I couldn't catch the finished-loading event. I just made the loading screen match the page background, so the only way you could tell anything was going on was that the scrollbar on the right changed sizes.) Not exactly elegant, but it was better than the blank screen you get waiting for the plugin to load, and it provided a nice custom animated loading gif instead of the default applet loading logo.
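Roughly, the trick sketched in JavaScript (the ids, paths, and the appletReady callback are all invented for illustration; the real hook for the finished-loading event would depend on the applet):

    // Emit the applet markup from script so the browser keeps rendering
    // the rest of the page while the plugin starts up, and overlay a
    // loading indicator in the meantime.
    document.write('<div id="appletLoading">' +
        '<img src="loading.gif" width="32" height="32" alt="Loading..."></div>');
    document.write('<applet id="theApplet" code="TheApplet.class"' +
        ' width="400" height="300"></applet>');

    // Hypothetical callback, invoked by the applet itself once it has
    // finished loading (e.g. from its start() method via JSObject).
    function appletReady() {
        document.getElementById('appletLoading').style.display = 'none';
    }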

Interesting (1)

BosstonesOwn (794949) | more than 7 years ago | (#20930311)

Think I will buy a couple dozen copies of the ebook version and send them to many of the sites that get slashdotted every time they are posted.

first reply (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#20930361)

Let me be the first to reply to this.

All my sites load fast (2, Interesting)

MichaelCrawford (610140) | more than 7 years ago | (#20930421)

Why?

All my pages are static HTML. Not a web application in site, not even PHP. Yes, it's a drag when I need to do some kind of sitewide update, like adding a navigation item.

I also have less to worry about security-wise: as long as my hosting service keeps their patches up to date, I know I haven't introduced any holes myself.

Also, for the most part, my pages are very light on graphics, with most of the graphics present being repeated on every page, such as my site's logo, which gets cached.

Finally, all my pages are XHTML 1.0 Strict with CSS, with the CSS being provided by a single sitewide stylesheet. This means less HTML text to transfer compared to formatting with HTML tags.

Re:All my sites load fast (5, Funny)

NickFitz (5849) | more than 7 years ago | (#20930483)

You forgot to link to your site... [amish.org]

Re:All my sites load fast (0)

Anonymous Coward | more than 7 years ago | (#20931817)

lol, they have 500% more page visits than ever thanks to slashdot :D

Re:All my sites load fast (1)

badran (973386) | more than 7 years ago | (#20932207)

Well you have to know that they are technologically impaired, and haven't paid their phone bill since..... well anyways I guess Weird Al put it best with his song... Amish Paradise... By the way... It appears South Park is really close ;) ...

Re:All my sites load fast (1)

StikyPad (445176) | more than 7 years ago | (#20933253)

My website [tallyhouniforms.com] is not only 99% pure HTML, it can save you time and money on air travel, AND you can tell everyone you got laid [tallyhouniforms.com]. Perfect for people who never get laid.

It's easy to find parking space for my car (4, Funny)

Sciros (986030) | more than 7 years ago | (#20930591)

Why?
It's a bicycle!!1

Solution (5, Interesting)

dsginter (104154) | more than 7 years ago | (#20930597)

All my pages are static HTML. Not a web application in site, not even PHP.

This is a great point, but here is my anecdotal experience:

Years ago, I tested static HTML vs. PHP by benchmarking a simple document (I used the text of the GPL license). On that particular box, I was able to serve over 400 pages per second with static HTML but only about 12 pages per second with PHP. I was blown away. I went one step further and used PHP to fetch the data from Oracle (OCI8, IIRC) and that went down to 3 requests/sec. You can see that caching does help, but not a whole lot.

So, rather than whine about it, what is the solution?

AJAX, done properly, will solve the problem. Basically, instead of serving dynamic pages with PHP, JSP, ASP or whatever... just serve an AJAX client (which is served in a speedy manner with no server side processing to bog things down). This client loads in the browser and fetches a static XML document from the server and then uses the viewer's browser to generate the page - so everything thrown down by the server is static and all processing is done on the client side.

Now, to facilitate a dynamic website (e.g. a message board, journal, or whatever), you have to generate the XML file upon each insert (inserts are generally a small fraction of the read load), using a trigger or embedded in the code.

Voilà! Static performance with dynamic content, using browser-side processing.
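A minimal sketch of such a client (the element ids, file names, and XML layout are invented here; circa-2007 IE6 would need new ActiveXObject("Microsoft.XMLHTTP") in place of XMLHttpRequest):

    // Fetch a pre-generated static XML document and build the page in
    // the browser; the server never runs any per-request code.
    function loadBoard(url) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url, true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                var posts = xhr.responseXML.getElementsByTagName('post');
                var html = '';
                for (var i = 0; i < posts.length; i++) {
                    html += '<div class="post">' +
                            posts[i].firstChild.nodeValue + '</div>';
                }
                document.getElementById('board').innerHTML = html;
            }
        };
        xhr.send(null);
    }

    loadBoard('board-1.xml'); // the XML is rewritten server-side on each insert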

This is unnecessary - and dangerous (2, Insightful)

arete (170676) | more than 7 years ago | (#20931433)

As a solution to speed alone, the right answer (as some other posts mentioned) is a CMS/publishing solution that generates static HTML pages once, on a change. The most braindead way to do this is to put an aggressive squid/apache cache in front of your server and only refresh the cache every half-hour or on demand; nobody gets to go directly to the dynamic site, and you have a minimal investment in the conversion. But certainly just using an automated system to write out HTML files works too.

Using AJAX you have to also remember that you're giving away all your code - and that any user with GreaseMonkey can REWRITE your code to do whatever they want. So your scenario only works out if 100% of your application data for all time is supposed to be viewable (at least) by all users. (Which is not to mention a significant number of other AJAX security potholes.)

Use AJAX to save page refreshes (eg Google Maps) - and only that. For any real world app, your server needs to control your data.

And if you need help implementing this, drop me a reply ;)

Re:This is unnecessary - and dangerous (3, Informative)

Anonymous Coward | more than 7 years ago | (#20931977)

Uh... most implementations of Ajax are used in conjunction with a server-side programming language of some sort. The only performance boost is that you don't have to reload the entire page... only the part that you need to update. The obvious drawback is that if users don't have javascript enabled, you have eliminated those users... or you write a second site to handle users without javascript. It can be used to help, but be careful of the suggestions you carelessly throw out there.

Re:Solution (5, Informative)

chez69 (135760) | more than 7 years ago | (#20931605)

Sounds good, except you may or may not know that a lot of javascript implementations are sloooow. Not to mention you usually have to set the no-cache headers for everything in the page so your javascript works right.

I find that sites built with the method you describe are the asshole sites that fuck with browser history, disable the back button, try to disable the context menu, and those dumb ass tricks to get around the fact they don't know how to write proper server side code.

There's no reason you can't make a fast serverside site (with ajax too, that works without the stupid tricks I described above); if you can't, I suggest you educate yourself, or don't use a Walmart PC for production use.

I've personally written many J2EE webapps (no EJB BS, spring & struts & jsp/velocity) that were very fast. With proper coding you can let the browser cache stuff so it doesn't constantly have to refetch crap; when you do this, all you push down to the client is the HTML to render, which browsers are really good at doing quickly.

Re:Solution (2, Insightful)

dsginter (104154) | more than 7 years ago | (#20933735)

Wow - this is wonderful, constructive feedback. But allow me to make some suggestions on your wording. For example, the following statement:

Sounds good, except you may or may not know that a lot of javascript implementations are sloooow. Not to mention you usually have to set the no-cache headers for everything in the page so your javascript works right.

I find that sites built with the method you describe are the asshole sites that fuck with browser history, disable the back button, try to disable the context menu, and those dumb ass tricks to get around the fact they don't know how to write proper server side code.


Could be reworded as follows:

AJAX isn't quite mature and it is still slow on those Walmart PCs, so in lieu of the AJAX client, I suggest that you simply apply a stylesheet to the XML with XSLT to provide the best of both worlds. But mature AJAX toolkits (such as GWT) are improving, and do a speedy rendering job while adequately managing the browser history and other nuances of the UI.

...And the following...

There's no reason you can't make a fast serverside site (with ajax too, that works without the stupid tricks I described above); if you can't, I suggest you educate yourself, or don't use a Walmart PC for production use.

...could be reworded as...

I do most of my work server-side. I will disregard the evidence that was provided about the test environment from "years ago" and instead insult you as if you were my nemesis, rather than someone I have never met prior to this discussion.

Fixed that for you.

Re:Solution (1)

chez69 (135760) | more than 7 years ago | (#20934655)

Yeah, I'm an old mofo. I've worked professionally for a while. You refute what I said by claiming I don't "get it". You didn't really refute my claim that a full ajax web client is dumb.

I've used ajax type techniques in many projects. I know what it does well, and what it doesn't.

I'm assuming that you're talking about sites where the user never submits data. Using no serverside validation on user input is completely retarded; just ask slashdot what happened when they didn't scrub their users' input for javascript.

GWT and other frameworks can only take you so far; it still has to get rendered by the client, which, once again, mostly has crappy slow dom and javascript implementations. GWT doesn't do shit for a user once it's reached the browser. Once there, it's a bunch of javascript and HTML.

Not using the browser cache is retarded. If you are writing your ajax sites in a way that doesn't use no-cache headers for all the requests, you're more clueful than most. Most full ajax sites don't, and it's really lame.

Finally, you admitted you didn't have the ability to make a simple, dynamic PHP site fast. That's where my comment about learning came in. As much as I hate PHP, even the most crappy PHP code I've seen is somewhat fast.

Seriously, server-side web apps are where you write serious apps that actually do more than display a couple of photos of your dog.

Re:Solution (0)

Anonymous Coward | more than 7 years ago | (#20931625)

And let me guess, those pesky back/forward buttons don't work anymore either?

Re:Solution (1)

lena_10326 (1100441) | more than 7 years ago | (#20932327)

Years ago, I tested static HTML vs. PHP by benchmarking a simple document (I used the text of the GPL license). On that particular box, I was able to serve over 400 pages per second with static HTML but only about 12 pages per second with PHP. I was blown away. I went one step further and used PHP to fetch the data from Oracle (OCI8, IIRC) and that went down to 3 requests/sec. You can see that caching does help, but not a whole lot.
12 pages/sec, eh? You didn't put a busy-wait in there, did you? I've never seen performance that slow with PHP on a production server, even with 2 or 3 SELECTs on a database, without APC, and without DB pooling (per request). Either that was a very large number of years ago or your server was a real piece of shit.

On Apache/Linux, I've seen jumps from 75-90 req/sec to about 250-400 req/sec with a PHP CGI just by adding APC. That was on a fairly fast multi-core machine (2 or 4 cores, can't remember). It's been my experience that PHP compilation is the 2nd-heaviest performance hit (opening database connections would be 1st).

Accessibility and search indexing (2, Insightful)

shmlco (594907) | more than 7 years ago | (#20932477)

Not to mention that that particular approach is probably a huge no-no when it comes to accessibility and search indexing. I mean, do you really expect Google to run all of your scripts when it spiders your page?

There was an old woman who swallowed a fly (0)

Anonymous Coward | more than 7 years ago | (#20932555)

AJAX, done properly, will solve the problem.

Are you nuts? Ajax isn't the answer at all. Vanilla mod_php isn't especially fast, but with an opcode cache it's easily fast enough for 99% of web sites. Database queries are always expensive; cache your generated html and watch your database bottleneck disappear. My cached PHP pages are served in milliseconds, and even if a cache entry has expired there's only light load on the database, so page generation takes mere tenths of a second at most. If you don't need dynamic content, don't invoke an interpreter on each request, or offload static files to a separate server.



I don't even have script enabled; if you think Ajax is the answer then you obviously don't even understand the question.

Re:Solution - Not (1)

Iaughter (723964) | more than 7 years ago | (#20933085)

Now, to facilitate a dynamic website (e.g. a message board, journal, or whatever), you have to generate the XML file upon each insert (inserts are generally a small fraction of the read load), using a trigger or embedded in the code.
This is silly.
If you have to "generate the XML file" every time the data changes, why not just write an x/html file and serve it?
Even better, why not cache the x/html file instead of generating it all the time.

Re:Solution - Not (1)

Pootie Tang (414915) | more than 7 years ago | (#20934783)

I think the answer is this:

You generate XML for each piece of content. Multiple templates can then serve the dynamic content while the templates themselves are static. You only need to generate the content once, regardless of how many templates it appears in.

That said, I still think generating the entire page is the way to go. Fewer requests should give better performance. Not to mention the other issues: search engine friendliness (and other non-JS-aware clients), proper back/forward support, and less work for the browser, which means a faster end-user experience, especially on slower computers.

precompile your HTML (2, Informative)

victorvodka (597971) | more than 7 years ago | (#20933099)

One solution that gives you a dynamic website with the advantages of a database and server-side scripting is to precompile your site to static HTML - you update it by recompiling more HTML. It can be done fairly transparently, with all the actual precompiling happening via automatic scripts. Obviously you can't have a user-login-based site work effectively this way, but for a site of modest dynamics (such as a blog, product catalog, or even some message boards), pre-compiling to HTML can be a real benefit. You can also precompile pieces of pages, although the benefits are less because includes require a certain amount of backend processing (unless they are slurped in from the front end using DHTML or whatever).

Use Page Caching (1)

tentac1e (62936) | more than 7 years ago | (#20933235)

Better yet, use a system like Rails' page caching.

With URL rewrite rules, have your server check for a static page matching your URL (e.g. index.html -> index.html.cache). If you get a 404, pass the request to your interpreter of choice, and write out the result to a cache. After the initial request, it's just static pages.

If your site is more complex, use fragment caching, [rubyonrails.com] which sounds like the solution you've described.

PHP and database are not *that* slow (1)

daBass (56811) | more than 7 years ago | (#20934263)

The other solution might have been to use something not as dog slow as PHP. OK, that is trolling; it is a lot better these days, especially with caching compilers and the like.

If you want really fast and scalable, AOLserver still blows PHP away today (like it did at the time you were testing), despite PHP's (and Apache's!) improvements.

AJAX is a terrible solution in my opinion. First of all, it doesn't free you from server-side processing; all you are doing is caching what would otherwise have come from the database. If you do similar caching using something like memcached, or even files on disk and make sure your dynamic pages do nothing more than reading from the cache while assembling the pages, you certainly won't see a 40x speed drop compared to static pages like you saw in your early PHP tests. In fact, I doubt there will be a 2x drop. (again, with the right technology used correctly and optimized)

And of course AJAX sites don't get indexed by search engines, back/forward buttons are hard to make work properly and so is bookmarking. The initial load time of complex Ajax apps, like the one you describe, is usually quite long too.

AJAX is great for proper web-based applications, like a mail client or word processor or a business app. What you describe sounds terrible for your average website.

I like my dynamic pages and with my choice of technology and experience, I have no problem making my complex dynamic pages (without much caching and several DB queries) run at over 100/sec even on my lowly 700MHz Athlon test box. It would take quite a Slashdotting to take the production quad Xeon out...

Re:All my sites load fast (5, Funny)

Anonymous Coward | more than 7 years ago | (#20930615)

In other words, it's a smalltime hobby site, and you're not a web developer. That's fine, and I agree that it's quite nice and reassuring to simplify like this where possible. However...

Go on out into the job market advertising your incredible "static page" skills, and what lightning-fast load times you'll bring to your employer. Offer to convert their entire 20GB of online content to static XHTML 1.0 Strict to obtain the peace of mind that comes with knowing you haven't introduced any holes yourself. Hell, I'm going to go right now and submit a patch to MediaWiki that generates static versions of every article and then deletes all the PHP from the entire web root! I'm sure as soon as I tell them about the performance boost, they'll be right on board!

Re:All my sites load fast (1)

jgrahn (181062) | more than 7 years ago | (#20933071)

In other words, it's a smalltime hobby site, and you're not a web developer. That's fine, and I agree that it's quite nice and reassuring to simplify like this where possible. However...

I think the grandparent thinks of it as "not complicating" rather than as "simplifying" ...

I also find it slightly amusing that you can tell from this that he's not a web developer ;-)

Actually my sites are commercial (1)

MichaelCrawford (610140) | more than 7 years ago | (#20933399)

I make thousands per month in AdSense.

Re:Actually my sites are commercial (2, Insightful)

kyofunikushimi (769712) | more than 7 years ago | (#20933815)

And the ads don't slow things down at all?

Also, don't the ads call some sort of script? I wouldn't call that static.

Re:All my sites load fast (3, Insightful)

Dekortage (697532) | more than 7 years ago | (#20930633)

All my pages are static HTML. Not a web application in site, not even PHP. Yes, it's a drag when I need to do some kind of sitewide update, like adding a navigation item.

Umm... there are plenty of content management systems (say, Cascade [hannonhill.com]) that manage content and publish it out to HTML. Even Dreamweaver's templating system will do this. Just because you use pure HTML doesn't mean you have to lose out on sitewide management control.

Re:All my sites load fast (1)

hellsDisciple (889830) | more than 7 years ago | (#20931517)

Are there any good FOSS solutions which work on this principle? I have home-rolled a CMS which stores stuff in a MySQL database but writes it all out to static HTML files which I can upload to the server via rsync. The advantage is that should I move college, I can just rsync my site to the new web server without having to get and maintain a database on it.

Re:All my sites load fast (1)

DJ_Maiko (1044980) | more than 7 years ago | (#20930677)

I couldn't agree more w/Mr. Crawford. Unfortunately, not everyone has the luxury of only using static HTML. It's the 21st century & the majority of people think that, like with corny ass rappers sampling the same beats & being unoriginal, you're only legit if you're rockin' the bling-bling!

rockin' the bling-bling (0)

Anonymous Coward | more than 7 years ago | (#20932007)

Dear Myspace: You know you're in trouble when it takes 168 pages to document just one of the reasons why your site sucks.

Re:All my sites load fast (2, Informative)

perlwolf (903757) | more than 7 years ago | (#20930905)

Your music page [geometricvisions.com] fails the W3C XHTML check, though it says it's XHTML compliant.

They added a new requirement to the validator! (0)

MichaelCrawford (610140) | more than 7 years ago | (#20931091)

The page used to validate, but the W3C validation service recently added a requirement that the html element declare its namespace.

I've been adding it to my pages as I work on them, but I haven't worked on that page for a while.

And yeah, the need for sitewide updates like this is the best argument against my method.

Re:They added a new requirement to the validator! (1)

oni (41625) | more than 7 years ago | (#20931263)

I've been adding it to my pages as I work on them, but I haven't worked on that page for a while.

You should use something like PHP. Then you could have an includable header... oh wait, nevermind.

Re:They added a new requirement to the validator! (0)

Anonymous Coward | more than 7 years ago | (#20934651)

The page used to validate, but the W3C validation service recently added a requirement that the html element declare its namespace.

The validator is not the spec. It's meant to be a tool to help you catch silly mistakes and typos; it's not meant to be used as a substitute for knowing the language. The namespace has always been required; it's just that the validator didn't always spot that particular error in your code. It was a bug in the validator that has been fixed, not a newly-added requirement.

Re:All my sites load fast (1)

kevin_conaway (585204) | more than 7 years ago | (#20930923)

I bet you don't even own a television [theonion.com]

Re:All my sites load fast (1)

InsurgentGeek (926646) | more than 7 years ago | (#20933591)

What is it with technology advances and people? Is there an old-fart gene operating here? Every damn time someone talks about a new technology, someone has to pipe up with the "I build systems out of sand and raw electrons" argument, as if that were somehow attached to great achievement and moral superiority. Use the tools, you pretentious Luddites.

Re:All my sites load fast (1)

thc69 (98798) | more than 7 years ago | (#20933917)

What is it with technology advances and people? Is there a thoughtless-idiot gene operating here? Every damn time someone pipes up with "I build systems out of sand and raw electrons", some idiot starts babbling about using the latest and greatest... while conveniently ignoring that it's results that matter, not how pretty your tools are.

Re:All my sites load fast (0)

Anonymous Coward | more than 7 years ago | (#20934907)

it's results that matter

Precisely. You can get MichaelCrawford's results without giving up on server-side scripting, as many people have pointed out. MichaelCrawford's approach isn't about the results, it's about feeling superior by being contrary.

gzip (1)

Ant P. (974313) | more than 7 years ago | (#20930445)

Why is gzip the only content encoding option browsers support? It seems to me they'd be better off supporting something like bzip2 since it works far better on plain text.

Re:gzip (1)

Outland Traveller (12138) | more than 7 years ago | (#20930639)

Bzip2 consumes far more memory and CPU cycles than gzip. There are a lot of scenarios where this tradeoff isn't desirable for a busy webserver.

Re:gzip (3, Insightful)

cperciva (102828) | more than 7 years ago | (#20930875)

Unlike bzip2, gzip is a streaming compression format; so the web browser can start parsing the first part of a page while the rest is still being downloaded.

Re:gzip (1)

LordVorp (988488) | more than 7 years ago | (#20934389)

Oh yeah? Not contradicting you, by any means, but can you name one HTTP server program that actually DOES this? All of my research shows that the best you can hope for is a GZIP document broken into HTTP/1.1 Chunked transfer-encoding bits.

Web Devs know Databases? (0, Troll)

techpawn (969834) | more than 7 years ago | (#20930449)

When did THAT happen? They may BLAME the database but I've seen few who understand it enough to know WHY...

The book would be a lot more believable... (4, Insightful)

QuietLagoon (813062) | more than 7 years ago | (#20930453)

... if Yahoo's website were not dog slow all the time.

Re:The book would be a lot more believable... (1)

trolltalk.com (1108067) | more than 7 years ago | (#20930521)

It's because in Soviet Yahoostan, "minifying" your pages embiggens YOU!

Re:The book would be a lot more believable... (0)

Anonymous Coward | more than 7 years ago | (#20931569)

There are several reasons why all newspaper sites are dog slow, but the main one is probably advertising. Nothing makes a site suck more than advertising, unless the only ads are Google text ads.

First, graphics load slow. Animated graphics load even slower.

Loading those graphics from a different server makes it even slower.

If I'm renting a little piece of real estate on your newspaper, I can stick an iFrame in it and do pretty much anything I want with it, including stopping the rest of your page from loading as long as I want. Judging from most newspapers' dog-slow performance and the prevalence of supposedly professionally done pages that pop up a dialog box asking me if I want to debug their shitty javascript, this is exactly what happens.

Then there's said javascript slowing things down.

I'm guilty of breaking all the rules myself [mcgrew.info], but you'll notice that despite the linked page's extreme length and excessive number of illustrations, it still appears to load quickly. I say "appears to" because you'll notice a slight 2 or 3 second wait on a T1 for the first illustration to load. But the text is there "immediately", even though what's outside the visible screen is still loading. Kids, unless you want to make your audience wait until a graphic has loaded (and I've deliberately done this before), use those "height=" and "width=" attributes in your <img> tags.

If you're using Front Page or some other inane HTML generator, stop it for God's sake! Learn HTML, you lazy doofus! Those programs write abysmal code, slow to render and hard for a human to parse for errors, and they all produce many.

Also, you dumb kids should learn that my monitor is not the same size as your monitor, and is not set to the same resolution. Realize that you're not working with paper. You're not going to get it to look the same on any two different monitors or browsers!

Now get off my lawn [kuro5hin.org] you damned kids! [mcgrew.info]

Doh! Test yer pages! (3, Insightful)

redelm (54142) | more than 7 years ago | (#20930493)

If you're responsible for the response time of some webpages, then you've got to do your job! First test a simple static webpage for a baseline.

Then every added feature has to be justified -- perceived added value versus cost-to-load. Sure, the artsies won't like you. But it isn't your decision or theirs. Management must decide.

For greater sophistication, you can measure your download rates by file to see how much is in users' caches, and decide whether these, too, are a cause of slowness!

Interesting Points... (2, Insightful)

morari (1080535) | more than 7 years ago | (#20930585)

I hate that the typical webpage assumes everyone has broadband these days. The finesse and minimalist approach of yesteryear no longer apply. Even with broadband at 100%, smaller is always better. No one wants to put in the effort that efficiency would require, though.

Re:Interesting Points... (2, Insightful)

guaigean (867316) | more than 7 years ago | (#20931471)

No one wants to put in the effort that efficiency would require, though.
That's not an accurate statement. A LARGE amount of time is spent on the very big sites to maximize efficiency. It is the largest of sites that truly see the benefits of optimization, as it can mean very large savings in fewer servers, bandwidth fees, etc. A better statement might be "People with low traffic sites don't want to put the effort in that would go toward efficiency though."

Re:Interesting Points... (0)

Anonymous Coward | more than 7 years ago | (#20931911)

Just like how all application developers assume that everyone has a 3GHz Core 2 Duo with 4 GB RAM. No one wants to take the time anymore to develop beautiful, efficient applications like MenuetOS.

Obligatory Question (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#20930593)


Where is Jon Katz?

Thanks.

Re:Obligatory Question (0)

Anonymous Coward | more than 7 years ago | (#20932481)

I'll bite. Did you mean Paul Katz [wikipedia.org] ?

Excellent Subject (1)

curmudgeon99 (1040054) | more than 7 years ago | (#20930619)

This is a great idea for a book--hope the execution lives up to the idea. Without having read the book, I will venture some obvious things:

* Profile your app before you optimize. Don't guess where you are slow: know.
* If you use Struts, don't do client-side validation. (Look at the mass of JavaScript that gets added to your page if you question this.)
* Use AJAX if you can. (Also an amazing speed boost.)
* Use few images.
* Do AJAX validation without leaving the page.

It's my general experience (1)

Colin Smith (2679) | more than 7 years ago | (#20930671)

That web site designers don't give a flying fuck about the speed their web sites load.

 

MeeVee bloat (1)

linuxwrangler (582055) | more than 7 years ago | (#20931103)

Seems true. MeeVee is my current peeve. Lots of cool eye candy, pop-up show descriptions and such, but all I really want to know is when "Reaper" or "Nova" is on. Google would have it back to me in plain HTML in a fraction of a second, sponsored ads included. Loading the guide page of MeeVee involved 114 separate requests and 900 kB of data, and took over 20 seconds to load and render. And that's connecting via T1. Poor suckers using modem or mobile connections are SOL.

Re:It's my general experience (2, Interesting)

PHPfanboy (841183) | more than 7 years ago | (#20931159)

From my observations of web developers in the wild this does seem to be scientifically true. Actually, most web developers are so overloaded with projects that even if they did give a shit, they simply don't have time to benchmark, test and optimize properly.

It's not an excuse, it's just that teams are so fluid, project work is chaotic and project management is driven by marketing considerations (read: "get it out", not "enterprise stability") that site performance is seen as a server hardware issue.

Shame really, in these times of green awareness, I hate to think how much wattage is being wasted on processor cycles and disk I/O when a simple caching strategy and site optimization can do so much via software and memory management.

Where is the rule "Avoid Ad-Networks"? (5, Interesting)

Josef Meixner (1020161) | more than 7 years ago | (#20930723)

I guess I am not alone in noticing that often the ads on a page drag the load time way down. I find it interesting that there is no rule about minimizing content dragged in from other servers over which you have little or no control. A blind spot because of Yahoo's business, I guess.

Re:Where is the rule "Avoid Ad-Networks"? (1)

demonbug (309515) | more than 7 years ago | (#20931399)

I guess I am not alone in noticing that often the ads on a page drag the load time way down.

This is especially annoying on sites where the ads are apparently forced to load before things like the text (i.e., the content I am actually looking for) render. Anandtech used to really piss me off in this respect - the ad server would take forever, and there was nothing to read until the ads loaded (haven't noticed this behavior lately).
I suppose I might be able to block the ads, but it is my feeling that as long as the ads are not overly obtrusive (pop-ups, etc.) I owe it to the website I am visiting to allow them their ad revenue. Seems pretty lame to me to block the source of income for a site I am visiting and enjoying (of course, for all I know they actually get the same income whether or not I block ads).

But yeah, I've noticed that the ads on a lot of sites tend to be the biggest source of problems (never mind the insistence some sites have on pissing you off with obnoxious ads that cover up the text you are trying to read - a very good way to make me not want to visit your website ever again).

Re:Where is the rule "Avoid Ad-Networks"? (1)

SpaceToast (974230) | more than 7 years ago | (#20933773)

On the nose, Josef.

How much time have we all spent looking at a blank browser window with "...completed 12 of 13 items." at the bottom?

Whatever else [spacetoast.net] I might think of it, Facebook has a nice trick that appears to work as follows. The page loads with a blank graphic where the ad should be. Afterward, an onLoad script fires requesting the ad and replacing the blank graphic with it. The ads take a moment to load: the page is instantly on. Proper priorities.

(As a corollary, I've got a Dice ad at the top of this page sapping so many cycles it's making it hard to type. Pri-or-i-ties. Something tells me I'm shouting at the wind.)
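In rough JavaScript, the trick looks something like this (the placeholder id and ad URL are invented; real ad networks usually serve script tags rather than bare images, which makes the real thing messier):

    // Ship the page with a blank placeholder image in the ad slot, then
    // request the real ad only after everything else has finished loading.
    window.onload = function () {
        var slot = document.getElementById('adSlot');
        var ad = new Image();
        ad.onload = function () { slot.src = ad.src; };
        ad.src = 'http://ads.example.com/banner.gif';
    };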

The book is about speed, not performance. (4, Interesting)

zestyping (928433) | more than 7 years ago | (#20930793)

The title of the book should be "High Speed Web Sites" or just "Fast Web Sites."

"Performance" is not a general-purpose synonym for "speed." "Performance" is a much more general term; it can refer to memory utilization, fault tolerance, uptime, accuracy, low error rate, user productivity, user satisfaction, throughput, and many other things. A lot of people like to say "performance" just because it's a longer word and it makes them sound smart. But this habit just makes them sound fake -- and more importantly, it encourages people to ignore all the other factors that make up the bigger picture. This book is all about speed, and the title should reflect that.

So, I beg you: resist the pull of unnecessary jargon. The next time you are about to call something "performance," stop and think; if there's a simpler or more precise word for what you really mean, use it.

Thanks for listening!

Re:The book is about speed, not performance. (1)

improfane (855034) | more than 7 years ago | (#20933523)

lot of people like to say "performance" just because it's a longer word and it makes them sound smart.
I see where you are coming from but I don't like this quote. Performance was a decent term to use since this covers a lot of ground. A fast performing website 'performs' well for the user. All your examples are factors of a well performing website:

memory utilization [browser uses less memory]
low error rate [user doesn't make so many mistakes, doesn't misclick something due to lag, doesn't forget what they are doing due to loading times]
user productivity [user gets more done]
user satisfaction [pages load fast! no waiting, no technology related hassles]
throughput [download speed]

The title of the book should be "High Speed Web Sites" or just "Fast Web Sites."
I disagree. A web site's purpose is to be downloaded and consumed. Performance includes speed, and refers to letting users perform better, not just to faster download speeds. "Speed" doesn't cover that.
I would expect "Fast Websites" or "High Speed Websites" to cover something to do with network configuration and the like. "Perform" makes me think theatrical, and therefore of users.

Re:The book is about speed, not performance. (1)

zestyping (928433) | more than 7 years ago | (#20933907)

Sure, many of these factors are related. But what is the book really about?

It isn't a book about making users more productive, or a book about reducing user error. This is a book about making websites fast. The other factors are only peripheral effects. Those 14 rules that Souders is pushing are all about speed, not these other factors.

Making a website fast may improve many things about the experience. But speed is not the only thing you need to make a website perform well.

The reviewer makes the same mistake of saying "performance" many times where he really means "speed". The more often that mistake is made, the easier it is to forget that speed is not the answer to every problem. See my point now?

Odd Summary (3, Insightful)

hellfire (86129) | more than 7 years ago | (#20930815)

Web developers often assume that most page-loading performance problems originate on the back-end, and thus the developers have little control over performance on the front-end, i.e., directly in the visitor's browser. But Steve Souders, head of site performance at Yahoo, argues otherwise in his book, High Performance Web Sites: Essential Knowledge for Frontend Engineers."

Let's correct this summary a little bit. First, it's NOVICE Web developers who would think this. Any web developer worth their salt knows the basic idea that Java, Flash, and other things like them make a PC work hard. The website sends code, but the PC has to execute that code, rather than the website pushing static or dynamic HTML and having it simply render. We bitch and moan enough here on Slashdot about Flash/Java-heavy pages; I feel this summary is misdirected, as if web developers here didn't know this.

Secondly, there's no argument, so Steve doesn't have to argue with anyone. It's a commonly accepted principle. If someone didn't learn it yet, they simply haven't learned it yet.

Now, I welcome a book like this because #1 it's a great tool for novices to understand the principle of optimization on both the server and the PC, and #2 because it hopefully has tips that even the above average admin will learn from. But I scratch my head when the summary makes it sound like it's a new concept.

Pardon me for nitpicking.

Re:Odd Summary (1)

mcmonkey (96054) | more than 7 years ago | (#20930997)

Well, as the summary itself sez:

As with any book, this one is not perfect -- nor is any work.

Sort of... (2, Insightful)

Roadkills-R-Us (122219) | more than 7 years ago | (#20931149)

It's really irrelevant whether they actually understand the real problem or not when what they do is broken. I don't care if they really don't know, or just have a mandate from someone who doesn't know, or are just too clueless to realize that what happens on their high-end system on their high-speed LAN has little to do with what Jenny and Joey Average see at home on their cheap Compaq from WalMart with about half the RAM it should have for their current version of Bloated OS. The end result is the same.

And, in fact, a lot of web site developers fit one, two, or three of the above categories. It's not just novices. A ridiculous percentage of websites suck performance-wise, and it's not just the MySpaces and hacked-up CMSes and such; a lot of corporate sites fall into this category as well, from financial institutions to eBay to auto manufacturers and dealers to swimming pool installation companies.

Re:Odd Summary (0)

Anonymous Coward | more than 7 years ago | (#20931829)

Well, genius, he's not talking about "Java and Flash" here. He's actually talking about how CSS, basic JavaScript (not crazy AJAX), and standard HTML can render much more quickly on a browser if you pay attention to some basic guidelines. So maybe there is something to be learned here after all, even for people like you who are not NOVICEs.

Advertising slows everything down (2, Insightful)

lgordon (103004) | more than 7 years ago | (#20930885)

Banner ads are the cause of most page loading time, and it's usually more the fault of the browser renderer than anything else. A lot of the time these javascript ad servers are horrible performance-wise. It can also be the fault of the ad networking company when their servers get overloaded, causing undue delay before the ad is served to the client. Something to think about when choosing ad placement on a site.

Installing an ad blocker of some sort, such as Mozilla's Adblock Plus, is a great way to speed up any page (from the user's point of view, of course).

Third-party content. (2, Insightful)

shmlco (594907) | more than 7 years ago | (#20932661)

Ads from third-party sites. Scripts and trackers from third-party sites (like Google Analytics or page counters). Scripted web page widgets from third-party sites.

Basically anything that's not under your control can slow your site down significantly.

wish it did focus on the backend (1)

prockcore (543967) | more than 7 years ago | (#20930891)

I wish it did focus on the backend more. Optimization is the second biggest problem with software these days, security/stability being number one.

Web development is especially bad at optimization. This thread demonstrates the problem:
http://forums.devnetwork.net/viewtopic.php?t=74613 [devnetwork.net]

People there are actually recommending you wait until your server fails before you look to optimize.

Head of site performance at Yahoo, huh? (2, Funny)

abloylas (857408) | more than 7 years ago | (#20930967)

Web 2.0 performance costs (5, Insightful)

spikeham (324079) | more than 7 years ago | (#20931109)

In the mid-90s Yahoo! pared down every variable and path in their HTML to get the minimum document size and thus fastest loading. You'd see stuff in their HTML like img src=a/b.gif and a minimum of spaces and newlines. However, back then most people had dialup Internet access and a few KB made a noticeable difference. In the past few years, mainstream Web sites pretty much assume broadband. Don't bother visiting YouTube or MySpace if you're still on a modem. Aside from graphics and videos, one of the main sources of bloat is Web 2.0. Look at the source of a Web 2.0 site, even Yahoo!, and often you see 4 times as many bytes of Javascript as HTML. All that script content not only has to be retrieved from the server, but also takes time to evaluate on the client. Google is one of the few heavily visited sites that has kept their main page to a bare minimum of plain HTML, and it is reflected in their popularity. If you visit a page 10 times a day you don't want to be slowed down by fancy shmancy embedded dynamic AJAX controls.

- Spike
Freeware OpenGL arcade game SOL, competitor in the 2008 Independent Games Festival: http://www.mounthamill.com/sol.html [mounthamill.com]

Re:Web 2.0 performance costs (1)

garett_spencley (193892) | more than 7 years ago | (#20933891)

While talking about Yahoo ... I've been using yahoo mail since the late 90's. I don't use it for everything but since I've had that e-mail address for almost 10 years now I still use it for certain purposes.

Now, I'm on a "modern" PC (1 GB RAM, 1.8 GHz CPU... starting to show its age, but still plenty capable of surfing YouTube and MySpace if I'm so inclined) and I have a business-grade cable line that I pay extra for to get more bandwidth, since I'm running a home business and have to transfer a lot of data to/from my web servers etc. (mostly I'm paying for a faster uplink, but the downlink is increased as well).

When Yahoo introduced their new AJAX e-mail service I tried it out for a few days and ended up switching back to the classic one. It was too slow and "quirky" to be usable.

Moral = Just because a user has broadband and a fast computer doesn't mean jack. Yes you have to take "extra special care" for dial-up users ... but we shouldn't be slacking off just because users have broadband. Every microsecond of downloading and processing time counts for all users.

I guess he's not used the new Yahoo Mail interface (3, Interesting)

mr_mischief (456295) | more than 7 years ago | (#20931147)

The new interface is a joke for performance compared to the old server-generated HTML one. Sure, they might be saving some hardware resources, but it's slow, and the message bodies are the bulk of the data anyway. The main transfers they cut out using JavaScript and dynamic loading seem to be updates to the message list when you delete a bunch of spam. That would be better handled by putting it in the spam folder where it belongs. OTOH, I often delete non-spam messages without reading them as I do subscribe to a few legit mailing lists from my Yahoo address but don't want to read every message.

Re:I guess he's not used the new Yahoo Mail interf (1)

jessiej (1019654) | more than 7 years ago | (#20931283)

I'd mod this informative, funny and interesting if I could!

Re:I guess he's not used the new Yahoo Mail interf (1)

jvschwarz (92288) | more than 7 years ago | (#20932625)

Mod parent up! Each time I try the new Yahoo mail, it seems to get slower.

When I switch back, I always make a comment about the performance, but haven't seen them do anything to fix it. The new version of My Yahoo! seems to have the same issues.

Still the same with web masters. (2, Insightful)

geekoid (135745) | more than 7 years ago | (#20931311)

"Web developers often assume that most page-loading performance problems originate on the back-end, and thus the developers have little control over performance on the front-end,"

Those Web designers should be called "Unemployed"

First post!!!!111one (0, Troll)

Dancindan84 (1056246) | more than 7 years ago | (#20931377)

Pages need to load faster so people can erect their epeens faster.

Re:First post!!!!111one (1)

kanweg (771128) | more than 7 years ago | (#20931769)

So, most sites appear to have been designed for the youth-challenged, then.

Bert

Outsource everything (1)

digitaldc (879047) | more than 7 years ago | (#20931557)

The book is a quick read compared to most technical books, and not just because of its relatively small size (168 pages), but also because of the writing style. Admittedly, this may be partly the result of O'Reilly's in-house and perhaps outsourced editors -- oftentimes the unsung heroes of publishing enterprises.

So not only do they now outsource the web page designers, they are outsourcing the technical writers?
What's next? Outsource the audience?

ISBN redundancy (2, Informative)

merreborn (853723) | more than 7 years ago | (#20931741)

FTFA:

High Performance Web Sites was published on 11 September 2007, by O'Reilly Media, under the ISBNs 0596529309 and 978-0596529307

There's no need to list both the ISBN 10 and the ISBN 13. ISBN 13 is a superset of ISBN10. Notice that both numbers contain the exact same 9 data digits:
0596529309
9780596529307

The only difference is the 978 "bookland" region has been prepended, and the check digit has been recalculated (using the EAN/UPC algorithm, instead of ISBN's old algo). You can just give the ISBN 10, or just the ISBN 13. You can trivially calculate one from the other. All software that deals with ISBNs should do this for you. e.g., if you search either the ISBN13 or ISBN10 on amazon, you'll end up at the exact same page.
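For what it's worth, the conversion is simple enough to sketch in a few lines of JavaScript (a hypothetical helper, not from the comment): keep the nine data digits, prepend "978", and recompute the check digit with the EAN weighting of alternating 1s and 3s.

    // Convert an ISBN-10 to its ISBN-13 form.
    function isbn10to13(isbn10) {
        var core = '978' + isbn10.substring(0, 9); // drop the old check digit
        var sum = 0;
        for (var i = 0; i < 12; i++) {
            sum += parseInt(core.charAt(i), 10) * (i % 2 === 0 ? 1 : 3);
        }
        return core + ((10 - (sum % 10)) % 10);   // append the new check digit
    }

    isbn10to13('0596529309'); // "9780596529307"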

Re:ISBN redundancy (1)

psbrogna (611644) | more than 7 years ago | (#20931953)

The typical clerk at a Barnes and Noble register cannot trivially recalculate an alternate ISBN.

Re:ISBN redundancy (1)

YourMotherCalled (888364) | more than 7 years ago | (#20933113)

Wow. And I thought *I* was ornery.

My advice to speed up your website (1)

siDDis (961791) | more than 7 years ago | (#20931759)

Use Varnish HTTP cache http://en.wikipedia.org/wiki/Varnish_cache [wikipedia.org]
It's designed from the ground up as an HTTP accelerator. It's extremely fast, in most cases way faster than Squid. However, if you rely a lot on cookies, you should look somewhere else.

Ad-Networks (1)

gpuk (712102) | more than 7 years ago | (#20931963)

Squeezing out a few milliseconds here and there with clever optimisation is fine (and worth doing), but isn't the whole objective defeated somewhat as soon as you have to embed adverts from the major ad-delivery networks (which most sites of any size do)?

I have lost countless hours of my life waiting for pages to render while they suck down banner ads from overloaded delivery networks (e.g. Falkag).

Have read, mixed feelings (4, Insightful)

SirJorgelOfBorgel (897488) | more than 7 years ago | (#20931979)

I have read a large number of excerpts (one for every paragraph) of this book, in response to a mention of it in the #jquery IRC channel. A few people were very much anticipating this book, and a lot of discussion followed on some of the subjects. Of course, this book makes some very good points, like how front-end speed is important and only partially dependent on server response times. I will not go into the specifics (I could write a book myself :D), but on some things, you might think the author is smoking crack.

I have looked at the book again now, and there seem to have been some changes. For example, there were only 13 rules when I was reviewing those excerpts; now there are 14. As one example, ETags were advised to not be used at all (IIRC, my biggest WTF about the book: if used correctly, ETags are marvellous things and complement 'expires' very nicely), instead of the current 'only use if done correctly'. Some other things are nigh impossible to do correctly cross-browser (think the ETag + GZIP combo in IE6, AJAX caching in IE7, etc.). To be honest, I found pretty much all of this stuff to be Web Development 101. If you're not at the level where you can figure most of these things out for yourself, you probably won't be able to put them into practice anyway, and you should not be in a place where you are responsible for these things.

I might pick up this book just to read it again, see about the changes and read the full chapters, just to hear the 'other side of the story', but IMHO this book isn't worth it. In all honesty, the only thing I got out of it so far that I didn't know is the performance toll CSS expressions take (all expressions are literally re-evaluated at every mouse move), but I hardly used those anyways (only to fix IE6 bugs), and in response have written a jQuery plugin that does the required work at only the wanted times (and I've told you this now, so no need to buy the book).
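As a rough sketch of that jQuery approach (the selector and breakpoint are invented for the example): instead of an IE-only rule like width: expression(document.body.clientWidth > 800 ? "800px" : "auto"), which IE re-evaluates on every mouse move, the same work can run only when the window actually changes size.

    // Re-apply the width cap on load and on window resize only, rather
    // than letting IE re-evaluate a CSS expression continuously.
    function capWidth() {
        $('#content').css('width',
            $(window).width() > 800 ? '800px' : 'auto');
    }
    $(document).ready(capWidth);
    $(window).resize(capWidth);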

My conclusion, based solely on the fairly large number of excerpts I've read, is: if you're a beginner, hold off on this book for a while. If you're past the beginner stage but your pages are strangely sluggish, this book is for you. If you've been around, you already know all this stuff.

Language Nazi Note (1)

fm6 (162816) | more than 7 years ago | (#20932311)

a broadband or narrowband connection
Suggest "fast or slow connection".

Flash (1)

Anne Thwacks (531696) | more than 7 years ago | (#20933319)

Actually, every user's impressions are created by Flash:

Some think Flash is essential to the web browsing experience, and that a site without Flash is not worth the bother.

Others think that a site with Flash is sure evidence of a triumph of style over content, and guarantees it's not worth waiting for it to load.

Since Adobe chose not to support FreeBSD, it's fairly clear that FreeBSD users all fall in the second category. You will have to do other analyses yourself.

HP's Website consistently has CRAPPY performance! (0, Offtopic)

spiedrazer (555388) | more than 7 years ago | (#20933451)

I know it's not totally relevant, but I just wanted to vent. Their website is ALWAYS horribly slow compared to all the others I frequent, no matter the state of the pipe on my end. It amazes me that a company that big hasn't figured out where the bottlenecks are in their architecture and fixed them, unless their basic architecture is the problem. Anyway, I feel marginally better.

No useless intro in flash (1)

J4nus_slashdotter (953890) | more than 7 years ago | (#20934111)

The best practice is: don't weigh down your home page with a huge Flash animation!