Tuesday, September 06, 2005

Caching with AJAX applications

AJAX applications offer better response times and are faster (or at least seem to be faster) than traditional web applications.

The main reason behind this is the separation of the initial page load from the loading of additional data, and the absence of full page reloads whenever the data on the page changes.

Building up the page works by a conventional call of a URL using the HTTP GET mechanism. Calling the server in the background using asynchronous XMLHttpRequest objects is the second, often repeated part.
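Such a background call can be sketched with a small XMLHttpRequest helper. This is a minimal illustration only; the function name and URL are made up for the example, and 2005-era Internet Explorer would additionally need an ActiveXObject fallback:

```javascript
// Minimal sketch of an asynchronous background request.
// The helper name and URL are illustrative, not part of the samples.
function httpGetAsync(url, callback) {
  var request = new XMLHttpRequest();
  request.open("GET", url, true); // true = asynchronous
  request.onreadystatechange = function () {
    // readyState 4 means the response is complete
    if (request.readyState === 4 && request.status === 200) {
      callback(request.responseText);
    }
  };
  request.send(null);
}
```

The page stays responsive while the request runs; the callback fires whenever the server answers.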

Caching can help to speed up both kinds of server requests, and the best results are achieved by preventing calls to the server altogether.

Caching the initial page download

The page that is downloaded by navigating to the URL can be improved most effectively by using the client-side cache features of the web browser. After adding the right HTTP headers, the browser will not ask the server for new versions of the resource for the specified time and will rely on the bytes found in the client-side cache. A few things must be taken care of to make this work properly.

There is a very useful tool for Windows by Eric Lawrence called Fiddler, available at http://www.fiddlertool.com/. He also wrote a good article on tuning and the HTTP protocol: http://www.fiddlertool.com/Fiddler/help/http/HTTPPerf.mht.

Also http://msdn.microsoft.com/library/default.asp?url=/workshop/author/perf/perftips.asp is worth reading.

  • The initial page download MUST NOT contain any data or information that changes frequently or must be available to the client very soon after it changes. If a change of this data is initiated by a click or any other action on the client, it is possible to force a request in that case even if the cache period has not ended yet. You can then live with a long caching period and still get an immediate change of the page.
  • The page MUST be designed to be an (almost) static resource from the client's point of view.
    It can, however, vary for different users. When delivering personalized versions, caching on proxy servers as well as the caching features of the server must be turned off.
  • The smaller, the faster.
    I do not recommend using huge graphics or many inline style attributes. The Google applications show that building fast web applications without many graphics is possible.
  • Use include files for JavaScript and CSS.
    Include files can be cached on the client too and can be shared among different pages. It is good to use include files for common functionality or styles. Rarely used or once-only include files slow the application down.
  • Use only lowercase characters in URLs.
    It is not obvious to Windows users that page.aspx and Page.aspx are two different resources on the web. Even if the server (IIS and ASP.NET) treats these resources as equal, the client will retrieve and store them twice. What makes them different is the spelling used in the references, the "src" and "href" attributes of HTML tags. Because I am never sure how a reference is written in other places, I prefer using lowercase characters only.
  • Use the right HTTP headers.
    For Internet Explorer there are the two special cache-specific values pre-check and post-check that should be set correctly so that IE does not even ask for a new version but silently uses the version of the resource found in the client-side cache.
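As noted in the first point of the list, a client action can still force a fresh copy even while the cached one is valid. A common trick is to make the URL unique with a throwaway query parameter; the helper and parameter name below are arbitrary, illustrative choices:

```javascript
// Append a throwaway timestamp parameter so the browser treats the
// URL as a new resource and bypasses its cache. The parameter name
// "nocache" is arbitrary and should simply be ignored by the server.
function forceFreshUrl(url) {
  var separator = url.indexOf("?") === -1 ? "?" : "&";
  return url + separator + "nocache=" + new Date().getTime();
}
```

Requesting forceFreshUrl("page.aspx") instead of "page.aspx" forces a round trip to the server even if the cached copy has not expired yet.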

Caching the asynchronous requests

After the page is loaded, further requests go to the server as asynchronous calls in the background using XMLHttpRequest objects.

When calling long-running methods on the server, for example complex SQL retrievals or expensive calculations, it is possible to instruct the server to cache the results and return them without executing the same method multiple times.

In ASP.NET you can use the CacheDuration property on the WebMethod attribute to specify the number of seconds the result may stay in the web server cache.


A simple sample on this can be found in the article at: http://support.microsoft.com/default.aspx?scid=kb;en-us;318299

The advantage of this approach is that all clients share the same cache, so if multiple clients request the same service you might get very good response times.

It is also possible to cache on the client, an approach that leads to less traffic on the net because repeating the same calls can be prevented. HTTP headers do not help in these situations because the request is not an HTTP GET request and there is always a payload in the HTTP body. Caching must therefore be done by some scripting on the client.

The caching feature in the JavaScript WebService proxy implementation can be enabled by calling the proxies.EnableCache method and passing the function that should use caching from then on. I added a button to the CalcFactorsAJAX.htm sample to show how to enable this:


The TableData.aspx sample also uses caching to prevent retrieving the same records multiple times.

By calling this method, a JavaScript object is added that stores all results and is used to prevent a call to the server if an entry for the parameter already exists inside this object.

This is not a perfect solution, but it works under the following circumstances:

  • The parameter must be a string or number that can be used for indexing the properties of a JavaScript object.
  • The cache doesn’t clear itself. It can be cleared by calling EnableCache once again.
  • Only methods with a single parameter are supported.
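The idea behind this cache can be sketched roughly as follows. This is only an illustration of the mechanism and limitations described above, not the actual proxies.EnableCache implementation:

```javascript
// Rough sketch of the described caching idea: wrap a one-parameter
// service call and remember its results in a plain JavaScript object.
// Not the real proxies.EnableCache code.
function enableCacheSketch(serviceCall) {
  var cache = {}; // calling enableCacheSketch again starts a fresh, empty cache
  return function (param, callback) {
    // Only a string or number parameter can index the cache object.
    if (cache.hasOwnProperty(param)) {
      callback(cache[param]); // answered locally, no server call
      return;
    }
    serviceCall(param, function (result) {
      cache[param] = result; // remember the result for this parameter
      callback(result);
    });
  };
}
```

Wrapping a service function this way means a second call with the same parameter is answered from the local object without any network traffic.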

Caching works well in AJAX applications and speeds them up by preventing repeated calculations, downloads and web server calls.



Josito said…

How to do secure caching? I need caching, but for secure applications. Encrypting this cache per user or session, any idea?

MatHertel said…

Hi Josito,

I agree, durable caches need some kind of protection.

This cache implementation just uses JavaScript variable space to minimize network traffic. The cache is lost when the page is left.
So it's "protected" by being deleted.