Making an Application that Benefits from Caching

Create a new site and call it UseDataCaching. Borrow the UseDataList code from the example in Tutorial 13. To bring it into your new project, right-click the project in Solution Explorer, choose Add Existing Item, and navigate to the UseDataList.aspx and UseDataList.aspx.cs files from Tutorial 13.

The code you imported refers to the database in the SessionState example. That's okay. If you want to, you can change it to the database in this application's App_Data directory, but it's not strictly necessary as long as the path points to an available database somewhere on your system.

Examine the GetInventory, BindToInventory, and Page_Load methods. The following listing shows the code.

   protected DataTable GetInventory()
   {
      // Connection string for the Access database from the SessionState
      // example; substitute the path to the .mdb file on your system.
      String strConnection =
         @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=...";

      DbProviderFactory f =
         DbProviderFactories.GetFactory("System.Data.OleDb");

      DbConnection connection = f.CreateConnection();
      connection.ConnectionString = strConnection;
      connection.Open();

      DbCommand command = f.CreateCommand();
      command.CommandText = "Select * from DotNetReferences";
      command.Connection = connection;

      IDataReader reader = command.ExecuteReader();

      DataTable dt = new DataTable();
      dt.Load(reader);

      reader.Close();
      connection.Close();

      return dt;
   }

   protected DataTable BindToInventory()
   {
      DataTable dt;
      dt = this.GetInventory();
      this.DataList1.DataSource = dt;
      this.DataBind();
      return dt;
   }

   protected void Page_Load(object sender, EventArgs e)
   {
      if (!IsPostBack)
      {
         DataTable dt = BindToInventory();
         // An empty table with the same schema, used to track the
         // user's selections (as in the SessionState example).
         DataTable tableSelectedItems = dt.Clone();
         Session["tableSelectedItems"] = tableSelectedItems;
      }
   }

Run the application to make sure it works. That is, it should connect to the DotNetReferences table and bind the DataList to the table from the database.

The GetInventory and BindToInventory methods are called by the Page_Load method. How often is Page_Load called? Every time a new page is created, which happens for every single HTTP request destined for the UseDataList page. When running this application on a single computer with one client (in a testing situation), perhaps connecting to the database for every request isn't a big deal. However, for applications expecting to serve thousands of users making frequent requests, repeated database access becomes a very big deal indeed. Accessing a database is an expensive operation. As we'll see shortly, it may take up to half a second simply to connect to this Access database and read the mere 25 rows contained in the DotNetReferences table, and data access only gets more expensive as the tables in the database grow. On the scale of computer processing time, half a second is an eon.
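If you'd like to see the cost for yourself, you can time the database round-trip with the System.Diagnostics.Stopwatch class. This is a sketch for illustration, not part of the tutorial's code; it assumes the GetInventory and BindToInventory methods shown above and writes the elapsed time to the page's trace log.

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // Hypothetical instrumentation: time the call that hits
        // the database and binds the DataList.
        System.Diagnostics.Stopwatch watch =
            System.Diagnostics.Stopwatch.StartNew();

        DataTable dt = BindToInventory();

        watch.Stop();

        // With tracing enabled (Trace="true" in the @Page directive),
        // this message appears in the page's trace output.
        Trace.Write("Timing",
            "BindToInventory took " + watch.ElapsedMilliseconds + " ms");
    }
}
```

Run the page a few times and compare the numbers; every request pays the full database cost, because nothing is being reused between requests.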

Now think about the nature of the inventory table. Does it change often? In this simple application, of course not. But think about how this might work in a real application. The items carried in an inventory probably won't change often (and when they do, the changes will probably occur at regular, predictable intervals). If that's the case, why does the application need to hit the database each time a page loads? Doing so is certainly overkill. If you could take that data and store it in a medium offering quicker access than the database (for example, the computer's internal memory), your site could potentially serve many more requests than if it had to make a round-trip to the database every time it loads a page. This is a perfect opportunity to cache the data.
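As a preview of where this is heading, here is a hedged sketch of how GetInventory might consult ASP.NET's application-wide Cache object before touching the database. The cache key ("InventoryTable") and the one-minute expiration are arbitrary choices made for illustration.

```csharp
protected DataTable GetInventory()
{
    // Look in the cache first. "InventoryTable" is an arbitrary
    // key name chosen for this sketch.
    DataTable dt = (DataTable)Cache["InventoryTable"];
    if (dt != null)
    {
        return dt;   // cache hit: no database round-trip
    }

    // Cache miss: fetch from the database as before...
    dt = new DataTable();
    // ... (connect, execute "Select * from DotNetReferences",
    //      and load the reader into dt, as shown earlier) ...

    // Store the result so subsequent requests skip the database.
    // Here the entry expires one minute after it is inserted.
    Cache.Insert("InventoryTable", dt, null,
        DateTime.Now.AddMinutes(1.0),
        System.Web.Caching.Cache.NoSlidingExpiration);

    return dt;
}
```

With this change, only the first request in any one-minute window pays the database cost; every other request is served from memory.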