WAS Dynamic Cache

Figure 1: WebSphere Application Server Dynamic Cache overview

The Dynamic Cache is part of the IBM solution for improving performance of Java 2 Platform, Enterprise Edition (J2EE) applications running within WebSphere Application Server. It supports caching of Java servlets, JavaServer Pages (JSP), WebSphere command objects, Web services objects, and Java objects.

Figure 1 presents an overview of the Dynamic Cache engine. The Dynamic Cache stores its content in a memory-based Java object store. These objects can be accessed and manipulated by APIs that are provided with the service. The WebSphere Application Server also offers a Web application called Cache Monitor, which plugs into the cache to provide a view of its contents and statistics.
The Dynamic Cache uses a replacement policy, such as the least recently used (LRU) algorithm, to delete entries and make room for incoming content when the assigned space is full. It can also be configured to push data to a disk cache, from which it can reclaim the data if needed in the future. Entries can also be removed from the cache by data invalidations, based on cache policies defined by the administrator.
Dynamic caching requires that cache policies be configured for an application, or that the application explicitly use the available caching APIs. The Dynamic Cache stores a caching policy for each cacheable object in the cachespec.xml file. This policy defines a set of rules specifying when and how to cache an object (i.e., based on certain parameters and arguments) and how to set up dependency relationships for individual or group removal of entries from the cache.
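As a sketch of what such a policy looks like (the servlet name, parameter, and timeout value here are hypothetical, not from the article), a minimal cachespec.xml might contain:

```xml
<cache>
  <cache-entry>
    <class>servlet</class>
    <name>/browseCatalog</name>
    <cache-id>
      <!-- Cache a separate copy per value of the "category" request parameter -->
      <component id="category" type="parameter" />
      <!-- Expire each entry after 600 seconds (10 minutes) -->
      <timeout>600</timeout>
    </cache-id>
  </cache-entry>
</cache>
```

Each cache-entry names a cacheable object, and each cache-id rule tells the cache which request inputs make an invocation unique.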

Web Page/Fragment Caching

WAS Dynamic Cache provides caching for static and dynamic content of servlet and JavaServer Pages (JSP) pages and fragments making up Web pages. You cache servlets and JSPs in WAS declaratively, configuring the caching with a cache policy defined as an XML deployment descriptor included in your Web application. The cache policy file, named cachespec.xml, is located in your Web module's WEB-INF directory along with the standard web.xml file. When configuring the cache policy for Dynamic Cache, you must consider which servlets or JSPs to cache, what makes an invocation of that servlet or JSP unique, and when to remove content from the cache.

Identifying Servlets or JSPs to Cache

WAS Dynamic Cache parses the cachespec.xml deployment descriptor on application startup and extracts from each <cache-entry> element a set of configuration parameters. Then, every time a new servlet or JSP is initialized (e.g., when the servlet is first accessed), the cache attempts to match that servlet/JSP to each cache-entry element to find the configuration information for that servlet.
Whenever a servlet of the specified class is initialized, WAS Dynamic Cache will match that servlet with the configuration for this element. The <class> tag identifies the type of object being specified in the <name> tag. For servlets and JSPs, the <class> tag always has a value of servlet. Other types of cacheable entities (e.g., commands) use different values for the <class> tag.
You can also identify a servlet or JSP using its Web URI path. For example, the following specification defines a cache policy for a servlet with a mapping in web.xml of /action/view:

<cache-entry>
 <class>servlet</class>
 <name>/action/view</name>
</cache-entry>
The specification of the Web path is relative to the Web application's context root; therefore, the context root isn't included in the servlet's name.

Identifying What Makes a Cache Entry Unique

Most servlets and JSPs make use of a variety of inputs to generate unique dynamic content (otherwise, static HTML is used). This variable input usually comes in the form of request parameters, browser cookies, request headers, session attributes, path information, and request attributes from parent or peer servlets. To correctly cache a servlet or JSP, the cache policy author must identify the minimum set of input variables that makes a servlet's or JSP's output unique. Dynamic Cache supports declaring these inputs through the use of <cache-id> and <component> tags.

<cache-entry>
 <name>/viewQuote</name>
 <class>servlet</class>
 <cache-id>
  <component id="ticker" type="parameter" />
  <component id="style" type="session" />
 </cache-id>
</cache-entry>

Figure 2: Displaying the View Quote servlet's dependencies

For example, Figure 2 shows that the View Quote servlet's output is dependent on a request parameter named "ticker" and on a style attribute that was previously stored in the user's HTTP session. When a request is made to the URL http://myhost/quoteapp/viewQuote?ticker=IBM by a user with an HTTP session attribute of style=frames, a cache ID is generated from the servlet name and the ticker and style values.
Each request to the View Quote servlet with unique ticker and style attribute values will produce a unique cache entry instance. WAS supports many different input variable types that you can use to uniquely identify requests. Figure 3 (below) displays a list of valid types supported for use in generating cache IDs.
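Each of these input types corresponds to a type value on a <component> tag. As an illustration (the component IDs shown are hypothetical), a single cache-id can mix several types:

```xml
<cache-id>
  <component id="category" type="parameter" />      <!-- request parameter -->
  <component id="preferredView" type="cookie" />    <!-- browser cookie -->
  <component id="Accept-Language" type="header" />  <!-- request header -->
  <component id="userRole" type="session" />        <!-- HTTP session attribute -->
  <component id="account" type="attribute" />       <!-- request attribute -->
</cache-id>
```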

Removing Entries from the Cache

After populating the cache with servlet and JSP content, the next important consideration is how to remove the content. Dynamic Cache removes content based on an explicit timeout, an explicit invalidation based on cache ID or dependency ID, and replacement due to being selected for eviction by the cache's replacement algorithm. Dynamic Cache manages the replacement algorithm, which uses a cache entry's priority and frequency of access to determine which entry to evict when the cache has exceeded its capacity. The cache policy in Figure 4 sets each cache entry to have a timeout value of five minutes (300 seconds).
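A timeout such as the five-minute one described above is declared with a <timeout> element inside the <cache-id>. A sketch (the servlet name and parameter are hypothetical):

```xml
<cache-entry>
  <class>servlet</class>
  <name>/browseCatalog</name>
  <cache-id>
    <component id="category" type="parameter" />
    <!-- Remove each entry 300 seconds (five minutes) after it's cached -->
    <timeout>300</timeout>
  </cache-id>
</cache-entry>
```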
There are three ways to explicitly remove entries from the cache: programmatically by cache ID, programmatically by dependency ID, and declaratively by dependency ID.
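The declarative route pairs a <dependency-id> on the cached entry with an <invalidation> rule on the entry that changes the underlying data. A sketch, assuming a hypothetical /updateQuote servlet alongside the View Quote servlet shown earlier:

```xml
<!-- Tag each cached /viewQuote page with a "quote" dependency ID per ticker -->
<cache-entry>
  <class>servlet</class>
  <name>/viewQuote</name>
  <cache-id>
    <component id="ticker" type="parameter" />
  </cache-id>
  <dependency-id>quote
    <component id="ticker" type="parameter" />
  </dependency-id>
</cache-entry>

<!-- Running /updateQuote for a ticker invalidates the matching cached pages -->
<cache-entry>
  <class>servlet</class>
  <name>/updateQuote</name>
  <invalidation>quote
    <component id="ticker" type="parameter" />
  </invalidation>
</cache-entry>
```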

Enabling and Analyzing Runtime Dynamic Caching

After you've created Dynamic Cache policies in cachespec.xml to cache and control the appropriate page fragments, the next step as the WAS administrator is to enable servlet caching for the application. The WAS Web container configuration in the Administration Console provides a check box property for this. You'll find specific details for enabling caching by searching for "Dynamic Cache" in the WAS Info Center at http://publib.boulder.ibm.com/infocenter/wasinfo/index.jsp.
Once you've enabled the Dynamic Cache, your next step is to monitor and optimize application caching at runtime. WAS 5.0 provides two methods for runtime analysis: the Dynamic Cache runtime monitor application and the Tivoli Performance Viewer (formerly the WAS Resource Analyzer). Both methods detail runtime cache behavior by showing cached servlets, JSPs, and objects, as well as cache invalidations, Least Recently Used (LRU) evictions, and other key runtime data.
The Dynamic Cache monitor is an installable Web application provided as part of WAS 5.0. To install the cache monitor application, follow these steps:
  1. Install the cachemonitor.ear application from the <was_home>/installableApps directory.
  2. Access the cache monitor via a Web browser using the URL http://<host>:<port>/cachemonitor. For example, http://localhost:9080/cachemonitor would work for a WAS instance installed on the localhost node using the default HTTP port 9080.

Optimizing Dynamic Cache Runtime

You optimize caching performance by managing the WAS Dynamic Cache cached entries and memory usage. The number of distinct cached entries in a Web-based application can quickly grow. Each cache entry instance takes up memory in the WAS instance JVM. The amount of memory used varies based on the size of the object in the cache. A large JSP page or deeply nested Java object can require significant memory for each cache entry. You must consider memory constraints when setting the cache entry size. Setting it too large can result in system paging (drastically reducing performance) or even the dreaded Java out of memory error.
Memory limitations often restrict the Dynamic Cache from keeping all cache entries active, so you must take the following actions to keep performance high:
  • Adjust cache entry size for optimal cache size and memory requirements. The Dynamic Cache removes entries when the number of cached objects exceeds the WAS setting for cache entries. The default cache entry size in WAS 5.0 is 1,000 entries. If the number of cached objects exceeds the cache entry size, objects are evicted and must be re-created when they're accessed again, which reduces the performance efficiency of caching. The cache monitor application details the runtime statistics for cache entry size and number of active entries. If the number of active entries frequently reaches the cache entry size, you should increase the size setting on the Dynamic Cache configuration page to reduce overflow evictions.
  • Set cache entry priorities to "LRU out." When all cacheable objects can't be cached due to memory constraints, the Dynamic Cache offers a "priority" mechanism to better control which objects to evict. The cache uses an LRU algorithm when selecting candidates for eviction, which ensures that cache entries that are hit often remain in the cache. You can further tune this behavior by increasing the priority of cache entries that are expensive to compute. Priorities range from 1 to 10 and determine a cache entry's relative importance. For example, in a stock quote application, you could raise the priority for user account cache entries; stock quotes that are rarely accessed and less expensive to compute would then be evicted from the cache first. You set a cache entry priority in the cachespec.xml file, as Figure 3 shows.

<cache-entry>
 <name>/viewQuote</name>
 <class>servlet</class>
 <cache-id>
  <component id="ticker" type="parameter" />
  <component id="style" type="session" />
  <priority>5</priority>
 </cache-id>
</cache-entry>

Figure 3: Setting cache entry priority

  • Monitor for entries that have excessive invalidations and stop caching them. The Dynamic Cache also removes cached entries based on explicit timeouts and invalidations as defined by the caching policies in the application's cachespec.xml. Invalidating cache entries, such as objects in the Dynamic Cache, is expensive and can reduce performance. You can monitor for cache invalidations using the Tivoli Performance Viewer. You should disable caching for dynamic cache entries that have a low cache-hit ratio and/or high invalidation rate.

Distributed Caching

The WAS 5.0 Dynamic Cache can improve performance even more by providing caching in a cluster. A cluster is a cooperating group of WAS servers that are all running the same application code. WebSphere can provide dynamic replication to share cache data across a WAS cluster. You can replicate both application cache entries and invalidations.
There are three primary modes of operation for data replication:
  • None — Data is not replicated in the cluster.
  • Push — Data is immediately pushed to other cluster members as it's updated on one node.
  • Push Pull — The cache ID is immediately pushed to other cluster members so they know that an updated copy of the data is available. If a cluster member receives a request for that data in the future, it retrieves the data then.
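These replication modes correspond to the <sharing-policy> element in cachespec.xml. A sketch for the View Quote servlet (the value shown is one of not-shared, shared-push, shared-pull, and shared-push-pull):

```xml
<cache-entry>
  <class>servlet</class>
  <name>/viewQuote</name>
  <!-- Push the cache ID to peers immediately; peers pull the data on demand -->
  <sharing-policy>shared-push-pull</sharing-policy>
  <cache-id>
    <component id="ticker" type="parameter" />
  </cache-id>
</cache-entry>
```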

Disk Offload

When a memory-based cache is insufficient to hold the cached items required by your application as previously described, you can enable the Dynamic Cache to overflow the LRU cache items onto a disk. WebSphere has implemented this overflow-to-disk capability using a technology called Hashtable On Disk (HTOD).
HTOD manages a virtual hash table on the disk in 1 GB file chunks. Individual cache entries are then hashed and placed inside the virtual storage in a manner similar to a memory-based heap. This methodology provides high performance and low latency for disk storage and retrieval. To enable the overflow-to-disk capability, simply select the "Enable disk offload" option on the Dynamic Cache configuration page and specify a directory path for the cache in the "Offload location" field.

References
1. "WebSphere Dynamic Cache: Improving J2EE application performance," R. Bakalova, A. Chow, C. Fricano, P. Jain, N. Kodali, D. Poirier, S. Sankaran, and D. Shupp
2. IBM WebSphere Portal for Multiplatforms V5 Handbook (IBM Redbook)
3. WebSphere Information Center