ASP.NET

The Road to ASP.NET 2.0

Until about 1993, there were very few Web servers in the world, and most of them lived at universities or other research centers. After that, the number of Web sites available began to increase dramatically. If you used the Web back in the early 1990s, chances are you came across little more than some HTML pages put together by the earliest Web site pioneers or some photo collections represented by links to GIF or JPEG files. Back then, there was no Google, no Yahoo, and certainly no MSN Search. The only way you could get to someone's site was if you either knew the site's Uniform Resource Locator (URL) or were referred to it through someone else's page.

Typing a URL like this:

http://www.somesite.com

into a browser's navigation window sent your request through a maze of routers until it finally arrived at a server somewhere. The earliest Web servers lived on UNIX boxes. They performed the simple job of loading the HTML file and sending it back to the requestor (perhaps a browser such as Mosaic).

The advent of the Common Gateway Interface (CGI) introduced a standard way to interface with browsers to produce interactive Web applications. While a Web server that serves up plain, static HTML documents is useful in certain contexts (for example, a hyperlinked dictionary), more complex applications require a conversation between the user and the server.

That's where CGI comes in. With the help of HTML tags representing standard GUI controls, CGI applications can respond to requests dynamically. That is, CGI applications vary their output depending upon the state within the request and the application, paving the way for truly interactive applications. For example, a CGI application can examine an incoming request and determine that the user is looking for a certain piece of information (perhaps a product code). The CGI application can perform a database lookup for the product and shoot some HTML that describes the product back to the client.
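To make the flow concrete, the sketch below shows roughly what such a CGI program might look like in C++; the product lookup is faked, and the parameter name "code" is purely illustrative. A CGI program reads the request from environment variables such as QUERY_STRING (and from standard input for POSTs), then writes the HTTP response, headers followed by a blank line and the body, to standard output.

// Hypothetical CGI product lookup, stripped to its essentials.
#include <cstdlib>
#include <iostream>
#include <string>

int main()
{
    // For a GET request such as /cgi-bin/product.cgi?code=1234,
    // the Web server passes "code=1234" in the QUERY_STRING variable.
    const char* query = std::getenv("QUERY_STRING");
    std::string code = (query != nullptr) ? query : "";

    // The header block ends with a blank line; everything after it is the body.
    std::cout << "Content-Type: text/html\r\n\r\n";
    std::cout << "<html><body>";
    if (code.empty())
        std::cout << "<p>No product code supplied.</p>";
    else
        // A real application would parse the query string and
        // look the product up in a database at this point.
        std::cout << "<p>Details for request '" << code << "' would go here.</p>";
    std::cout << "</body></html>" << std::endl;
    return 0;
}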

When it became clear that the Web was an important aspect of information technology, Microsoft entered the fray by introducing the Internet Server API (ISAPI) and a program to listen for HTTP requests: Internet Information Services (IIS). The first UNIX Web servers started a new process to handle each new HTTP request (in keeping with the classical UNIX model), but that model is very expensive. The Microsoft Web strategy is based instead on DLLs: it's much faster to load a DLL to respond to an HTTP request than it is to start a whole new process.

On the Microsoft platform, IIS listens on port 80 for HTTP requests. IIS handles some requests directly and delegates others to specific ISAPI extension DLLs, typically by mapping a file extension to a particular ISAPI DLL. A number of ISAPI DLLs come preinstalled with Windows, but IIS is extensible, and you may map extensions to any ISAPI DLL, even one you wrote yourself. To make a Web site work using IIS and ISAPI, developers write ISAPI DLLs that intercept the request, decompose it, and respond by sending something back to the client (usually some HTML).
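For readers who have never seen one, the following is a minimal sketch of an ISAPI extension in C++. It uses the two entry points every extension DLL exports, GetExtensionVersion and HttpExtensionProc; error handling, buffer-size checks, and the project settings that actually export these functions from the DLL are omitted, and the response text is purely illustrative.

#include <windows.h>
#include <httpext.h>

// IIS calls this once, when it loads the DLL, to negotiate the ISAPI version.
BOOL WINAPI GetExtensionVersion(HSE_VERSION_INFO* pVer)
{
    pVer->dwExtensionVersion = MAKELONG(HSE_VERSION_MINOR, HSE_VERSION_MAJOR);
    lstrcpynA(pVer->lpszExtensionDesc, "Sample ISAPI extension",
              HSE_MAX_EXT_DLL_NAME_LEN);
    return TRUE;
}

// IIS calls this for every request mapped to the DLL. The extension
// control block (ECB) carries the decomposed request plus callbacks
// for sending a response back to the client.
DWORD WINAPI HttpExtensionProc(EXTENSION_CONTROL_BLOCK* pECB)
{
    // Send the status line and response headers.
    char headers[] = "Content-Type: text/html\r\n\r\n";
    pECB->ServerSupportFunction(pECB->ConnID, HSE_REQ_SEND_RESPONSE_HEADER,
                                (LPVOID)"200 OK", NULL, (LPDWORD)headers);

    // Build and send a small HTML body; lpszQueryString is the raw query
    // string. A real extension would guard against overly long input.
    char body[1024];
    DWORD cb = wsprintfA(body,
        "<html><body><p>You asked for: %s</p></body></html>",
        pECB->lpszQueryString);
    pECB->WriteClient(pECB->ConnID, body, &cb, 0);

    return HSE_STATUS_SUCCESS;
}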

While the IIS/ISAPI platform represents a very flexible and functional way to create Web applications, it's not without its downside. Specifically, ISAPI DLLs are traditionally written in C++ and are subject to the pitfalls of C++ programming (including such foibles as dereferencing bad pointers, forgetting to free memory, and traditionally lengthy development cycles). The other problem with ISAPI DLLs is that it's becoming increasingly difficult to find C++ programmers. Enter Active Server Pages, or classic ASP.