Flexibility is of the utmost importance when designing the back-end of a website. Neither developers nor customers want a long, drawn-out process simply to add an item to a web store. The more flexible the site, the more of its data has to live outside the source code, typically in a database. All of that flexibility comes at a cost, however: performance. Every piece of data put into a database has to be retrieved somehow, and that retrieval has costs of its own, largely the round-trip time of database communication and the query time on the database CPU itself. The question is: how do we balance flexibility against performance? The answer: caching.
On a well-built, flexible website, most content stored in the database changes infrequently. Again, the web store is an excellent example: the data needs to be editable but is unlikely to change on a daily basis; this is mostly-static content. This kind of data is a prime candidate for application-wide caching. In fact, almost any data that is controlled by an admin screen can (and should) be cached. The mechanisms to flush and repopulate the cache can be built directly into the admin screens, making control of the cache trivial.
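As a minimal sketch of that flush-from-the-admin-screen idea: the save handler persists the edit and then evicts the cached copy, so the next request repopulates the cache from the database. The class, `CatalogRepository`, `GetEditedItem`, and the `"StoreCatalog"` key are all hypothetical names, not part of ASP.Net itself.

```csharp
using System;
using System.Web;
using System.Web.UI;

// Hypothetical admin code-behind: persist the edit, then flush the cached
// copy so the next read rebuilds it from the database.
public partial class CatalogAdmin : Page
{
    protected void SaveButton_Click(object sender, EventArgs e)
    {
        CatalogRepository.Save(GetEditedItem());   // hypothetical data-access call
        HttpRuntime.Cache.Remove("StoreCatalog");  // flush; next read re-caches
    }
}
```

The cache never goes stale under edits, because every admin save invalidates it, and the site never pays the reload cost until someone actually asks for the data.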
ASP.Net provides several places for data caching. The two most relevant to this discussion are the Application Cache (System.Web.HttpRuntime.Cache) and the Session (System.Web.HttpContext.Current.Session). The Application Cache holds generic application-wide data, while the Session stores an individual user’s data. These caches can be used in many ways, but two patterns work particularly well: a utility class that manages all cached data directly, and cache-aware business objects. In the former, all of the cache-control logic lives in one class, and all data needed by the website is accessed through that class. In the latter, each business object is cache-aware and responsible for its own data. The utility class is easier to maintain, while cache-aware objects preserve the logical separation of data; either way, the data ends up in the same cache.
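The utility-class pattern might look something like the sketch below: one static class owns every cache key, lazily loads on a miss, and exposes a flush method for the admin screens to call. `CatalogItem`, `LoadCatalogFromDatabase`, and the `"StoreCatalog"` key are hypothetical stand-ins for real store code.

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

// Hypothetical business object; a real one would come from the store's model.
public class CatalogItem
{
    public string Name;
    public decimal Price;
}

// The "utility class" pattern: all cache keys and cache-control logic in one place.
public static class CacheManager
{
    private const string CatalogKey = "StoreCatalog";

    public static List<CatalogItem> GetCatalog()
    {
        // Try the Application Cache first; fall back to the database on a miss.
        var catalog = HttpRuntime.Cache[CatalogKey] as List<CatalogItem>;
        if (catalog == null)
        {
            catalog = LoadCatalogFromDatabase();
            HttpRuntime.Cache.Insert(CatalogKey, catalog);
        }
        return catalog;
    }

    // Called from the admin screens after an edit; the next read repopulates.
    public static void FlushCatalog()
    {
        HttpRuntime.Cache.Remove(CatalogKey);
    }

    private static List<CatalogItem> LoadCatalogFromDatabase()
    {
        // Hypothetical: real code would query the store database here.
        return new List<CatalogItem>();
    }
}
```

In the cache-aware-object variant, the same Get/Flush pair would live on each business object instead of in a central class; the calls into HttpRuntime.Cache are identical.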
One thing to keep in mind is memory: while memory is cheap, the memory available to any given website is not infinite. Caching very large data sets can hurt the performance of the web server as a whole. A few thousand store catalog items should be fine; if you’re trying to cache the Library of Congress, you may need a bigger box.