Prototype Size?

There’s a good article at
BBC NEWS | Technology | First impressions count for web that says, in a
nutshell, that people form their impression of a Web site in 50
milliseconds. With an 80K download for Prototype, that leads me to ask:
“How can page sizes be trimmed and still use cool features?”

I recognize that browsers may feel free to download these scripts
asynchronously, but what is the best practice for keeping page sizes
to a reasonable minimum while still retaining Web 2.0 functionality?

Thoughts?

On 1/17/06, Steve R. [email protected] wrote:

Thoughts?
Don’t put all your large scripts and things on the front page of your
site. I did a quick cruise around the apps I use daily, and they all
have very basic HTML pages describing the service, with no 80k
javascripts to download.

But, if this is a real issue, try using moo.fx for some basic stuff.
It even comes with prototype-lite (coming in at under 3k).


Rick O.
http://techno-weenie.net

But, if this is a real issue, try using moo.fx for some basic stuff.
It even comes with prototype-lite (coming in at under 3k).

Yes, this is the conclusion the CakePHP folks arrived at because they
were unwilling to embrace an 80K JavaScript download. It’s not my favorite
for this application, as the app is componentized and the components use
(eeek!) AJAX. I’ll have to do a one-off for the home page. Not exactly
DRY.

Lest I seem too crusty about this, I love RoR and everything about it
including prototype.

Hi,

You could look at moo.fx (but I don’t think the loss of features is
worth the smaller filesize). It has a lite Prototype built in. There is
also a packed version of prototype/script.aculo.us available; please
search the list for that.

Steve,

On 1/17/06, Steve R. [email protected] wrote:

Thoughts?
Prototype 1.4.0 is 46 KB. That’s about the size of a small JPEG photo.

File size is certainly a concern, but the best part about Prototype is
that as it becomes larger, your code becomes smaller. For building
web applications, 46 KB is (to me, at least) certainly worth the cost
of admission.

Finally, I’m fairly certain most browsers won’t even begin to render a
page until all files referenced in <script> tags have been loaded.


sam

On 17 Jan '06, at 10:19 AM, Steve R. wrote:

There’s a good article at
BBC NEWS | Technology | First impressions count for web that says, in a
nutshell, that people form their impression of a Web site in 50
milliseconds. With an 80K download for prototype, that leads me to
ask:
“how can page sizes be trimmed and still use cool features”?

That’s 50ms after the page displays. The time it takes to load the
page is a separate matter; I’ve seen other studies estimating how
long people are willing to wait for a page to display before they
give up and do something else. IIRC it’s on the order of 5-10
seconds. (I’m sure Jakob Nielsen has written about this on
http://useit.com.)

The 50ms figure is an argument for making sure the first rendering of
the page looks decent. An obvious corollary is to specify
height/width attributes for images, so the page doesn’t look smooshed
and then change its layout several times as the images finish loading.
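That corollary is straightforward to apply; the src and dimensions below are made up for illustration:

```html
<!-- Reserve the image's box up front so the layout doesn't reflow
     as images finish loading (src and dimensions are illustrative) -->
<img src="/images/banner.jpg" width="468" height="60" alt="Banner">
```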

I would imagine that Ajax stuff like Prototype could be loaded after
the main page load finishes. Put the <script> tag at the end of the
HTML and make sure it has the “defer” attribute, so loading the script
doesn’t block display of the page. Also make sure scripts are stored
in separate .js files instead of inline, so they can be cached.
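Concretely, that suggestion might look like the sketch below (the src path is illustrative; note that in HTML 4 “defer” is a boolean attribute, written defer="defer" in XHTML rather than defer="true"):

```html
<body>
  <!-- ...page content, which renders without waiting for the script... -->

  <!-- deferred external script at the end of the body (example path) -->
  <script type="text/javascript" src="/javascripts/prototype.js"
          defer="defer"></script>
</body>
```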

–Jens

On 1/17/06, Jens A. [email protected] wrote:

I would imagine that Ajax stuff like Prototype could be loaded after the
main page load finishes. Put the <script> tag at the end of the HTML and
make sure it has the “defer” attribute, so loading the script doesn’t block
display of the page. Also make sure scripts are stored in separate .js files
instead of inline, so they can be cached.

Also, there’s nothing that prevents prototype.js (and any other .js
files) from being deflated/gzipped for downloads. While most filters
don’t normally act on these files by default (they’ll only work on
.html and .xml files, or certain MIME types), I’ve configured a
number of servers to compress .js files if the Accept-Encoding header
is set. This can save a huge amount of download size… on one web
site that I help administer, the average download size of
prototype.js is less than 8K, thanks to caching and compression.

Cheers,

bs.

On 17 Jan '06, at 11:44 AM, Sam S. wrote:

Finally, I’m fairly certain most browsers won’t even begin to render a
page until all files referenced in <script> tags have been loaded.

That’s true unless the <script> tag has the “defer” attribute.

Defer - WebSiteOptimization.com says:

When set, this boolean attribute provides a hint to the user agent
that the script is not going to generate any document content
(e.g., no “document.write” in javascript) and thus, the user agent
can continue parsing and rendering.

–Jens

Here’s an interesting approach. Include an AJAX stub that issues an
XMLHttpRequest to do a lazy download of the rest.

http://ajaxpatterns.org/On-Demand_Javascript
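A minimal sketch of such a stub, with illustrative names (the pattern on ajaxpatterns.org is described with XMLHttpRequest plus eval; appending a script element, as below, is a common variant):

```javascript
// Hypothetical on-demand loader: a tiny stub on the page appends a
// <script> element so the big library loads lazily, after the initial
// render. Function and path names here are illustrative.
function loadScriptOnDemand(url, onLoaded) {
  var el = document.createElement('script');
  el.type = 'text/javascript';
  el.src = url;
  if (onLoaded) {
    // note: older IE fires onreadystatechange instead of onload
    el.onload = onLoaded;
  }
  document.getElementsByTagName('head')[0].appendChild(el);
  return el;
}

// e.g. loadScriptOnDemand('/javascripts/prototype.js', initWidgets);
```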

I’m not sure if this is different from the “defer” approach mentioned
earlier. In any case, it’s something to think about. Also,
WebSiteOptimization.com claims just over 60% of US users have broadband.
That means nearly 4 out of 10 are still on 56K modems (!!!). Just a
couple of thoughts.

And yes, prototype is worth the cost! Great library.

Prototype and Scriptaculous should be hosted on a central server. This
way the browser caches the script, and doesn’t have to download it again
and again for every website.

But who wants to host this??

On 1/17/06, Jules [email protected] wrote:

Prototype and Scriptaculous should be hosted on a central server. This
way the browser caches the script, and doesn’t have to download it again
and again for every website.

But who wants to host this??

More importantly, who wants to trust a 3rd party to serve this?

James L. wrote:

On 1/17/06, Jules [email protected] wrote:

Prototype and Scriptaculous should be hosted on a central server. This
way the browser caches the script, and doesn’t have to download it again
and again for every website.

But who wants to host this??

More importantly, who wants to trust a 3rd party to serve this?

Sounds like a good opportunity for someone to make some money.

Also, using mod_expires or something similar, you can easily instruct
browsers to keep prototype around for a few weeks or longer.
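With Apache’s mod_expires, for instance, the relevant directives might look like this (the MIME type and the four-week window are illustrative):

```apache
# Hypothetical mod_expires snippet: let browsers cache .js files for a month
ExpiresActive On
ExpiresByType application/x-javascript "access plus 4 weeks"
```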

This is not strictly on topic but something to consider when looking
at the file size of the JS. Most web applications probably get 80% or
more of their traffic from returning browsers.

Prototype 1.4.0 is 46 KB. That’s about the size of a small JPEG photo.

File size is certainly a concern, but the best part about Prototype is
that as it becomes larger, your code becomes smaller. For building
web applications, 46 KB is (to me, at least) certainly worth the cost
of admission.


Tobi
http://jadedpixel.com - modern e-commerce software
http://typo.leetsoft.com - Open source weblog engine
http://blog.leetsoft.com - Technical weblog

Ben Schumacher wrote:

.html and .xml files, or certain MIME types), I’ve configured a
number of servers to compress .js files if the Accept-Encoding header
is set. This can save a huge amount of download size… on one web
site that I help administer, the average download size of
prototype.js is less than 8K, thanks to caching and compression.

I concur: for me prototype.js is under 10K while effects.js is under 8K.
With lighttpd, all you need to do is extend the server config file with
this line:

compress.filetype = ("text/plain", "text/html", "text/javascript",
"text/css", "application/xml")

Lighttpd will cache the gzipped files on first request, then subsequently
serve them to clients that can handle compressed content. You can do the
same with Apache as well with a simple configuration tweak.
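For Apache 2.x, that tweak might look like the following, assuming mod_deflate is enabled (the MIME types listed are examples):

```apache
# Hypothetical mod_deflate snippet: compress text and script responses
AddOutputFilterByType DEFLATE text/plain text/html text/css text/javascript application/x-javascript application/xml
```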

Grin, this topic looks like a solution looking for a problem.

There are precious few websites, if any, that manage to render within
50ms of the request. (I honestly doubt there are any; don’t forget the
request time. If you request a page from a distant or slow server,
you’re already past those precious 50 ms.)

No, the research quoted is all about first impressions. If I understand
it well, all that happened is that people were shown a screenshot of a
site (thus no HTML, as that would have been too slow) and had to rate
it; later, a second group had to rate the same sites, but this time they
were given time to examine them. Surprise, surprise: the first
impression and the impression after closer examination were about the
same. So your entry page(s) should be smooth; that’s all there is to
this research (well, not really, but it’s the only thing that matters
to a webmaster).

As for a huge bulk of 49k… most sites already use gzip compression; as
said before, you can use it for any text file (and probably for binaries
too, but those don’t compress well and would use too much resource to
compress). But still, even 50k is not that bad: since it is cached, you
only need to load it once. I think optimising this should really be
your last concern; more important is to deliver those dynamic pages as
fast as possible :)