On 7/30/06, Nathaniel B. [email protected] wrote:
Rasmus has done several talks on how to architect a system which can handle
the load of a place such as Yahoo.
Any chance you could link to those slides?
Might be worth sifting through his slides for the diagrams he mentioned.
Whether it's PHP or Rails, they are similar enough that you can leverage his
knowledge when you get down to something like system architecture.
Not sure why everyone is jumping down this guy's throat. Who cares if he
landed a client like Digg or YouTube? He was just asking how it would be
done.
The intent of my question was to figure out what changes as you move
from, say, 100 dynamic pages per second, which my laptop can handle, to
1,000 dynamic pages per second, which a couple of servers could probably
handle, to 10k pages per second.
I probably should’ve phrased the original question better.
Again, this is a large amount of traffic in a burst, over a short period
of time. Think victoriasecret.com advertised during the Super Bowl.
Not quite at that level, but the general idea applies.
In my situation, handling 500 to 1,000 dynamic pages per second
without any slowdown would be great and, quite honestly, is probably
all we'll ever need. But the folks I'm doing this for want to be
assured that it's not too difficult to go higher. I don't have much
experience at that level of performance, hence the question.
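For what it's worth, the one lever I'm already aware of is Rails page
caching, which takes the framework out of the request path entirely for
hot pages and lets the web server hand back static files. A rough sketch
of what I mean (the controller and model names are made up):

class ProductsController < ApplicationController
  # After the first request, the rendered page is written to public/,
  # so Apache/lighttpd serves later hits without touching Rails at all.
  caches_page :show

  def show
    @product = Product.find(params[:id])
  end
end

# Somewhere in the update path, the cached file has to be expired
# whenever the underlying data changes, e.g.:
#   expire_page :controller => 'products', :action => 'show',
#               :id => @product.id

Is that the sort of thing that carries you from hundreds to thousands of
pages per second, or does the architecture itself have to change?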
Another question: assuming I've got some initial architecture in
place, how do I test everything? Apache's 'ab' benchmark seems to
measure only the performance of downloading one single URL, so it
wouldn't capture the effect of a page pulling in a couple of images,
JavaScript includes, CSS files, etc.
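In case it helps frame an answer, here's roughly the kind of measurement
I'm after, as a throwaway Ruby script that fetches the page plus a
hand-picked list of its assets the way a browser would (all the URLs and
the run count below are placeholders):

require 'net/http'
require 'uri'

HOST   = 'http://localhost:3000'
PAGE   = URI.parse(HOST + '/')        # the dynamic page under test
ASSETS = %w(/stylesheets/main.css
            /javascripts/application.js
            /images/logo.png).map { |path| URI.parse(HOST + path) }
RUNS   = 100

start = Time.now
RUNS.times do
  # Fetch the dynamic page first, then its assets in parallel,
  # roughly mimicking a browser.
  Net::HTTP.get_response(PAGE)
  ASSETS.map { |u| Thread.new { Net::HTTP.get_response(u) } }.each { |t| t.join }
end
elapsed = Time.now - start
puts "#{RUNS} page loads in #{'%.2f' % elapsed}s " +
     "(#{'%.1f' % (RUNS / elapsed)} pages/sec)"

Is there an established tool that does this properly, or do people just
drive something like httperf against each URL separately?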
Thanks for all your responses, even the snarky ones,
Joe