The C++ Experience

As programmers wrestled with OOP, they also dealt with issues related to their chosen language. Visual Basic developers began to understand that while the language and environment were simple, they were prone to poor performance and poor designs, leaving customers stranded with slow applications that they could not extend or maintain.

In C++, server-side developers found the performance they wanted, but discovered another challenge: they were doing application development in a systems programming language. New terminology like memory stompers and DLL Hell bore testament to the frustration of the masses. Simple problems dogged them.

Pointer Arithmetic

With C++, a pointer could point to any block of memory, regardless of the programmer's intention. For example, consider the simple program in Example 2-1. It moves a block of memory from one location to another, and inverts it. Unfortunately, the example is off by one: the code touches memory one byte beyond the to block. You would probably not see the error right away. You'd see it later, when you tried to manage the memory of this block, or another one. C and C++ compilers often manage memory with a linked list, and the pointers to the next block in the list sit just outside the allocated blocks! These types of errors hurt systems developers, and absolutely murdered applications developers, who didn't have the background to troubleshoot such problems effectively. Reliability suffered, too.

Example 2-1. Move and invert a block of memory
// move and invert from_block into to_block with size size

int i;
for(i = 0; i < size; i++) {
  to_block[size-i] = from_block[i];  // off by one! to_block[size] is out of bounds
}

Nested Includes

One of my most vivid and frustrating memories from working with IBM came from porting a C++ application that had include files nested 37 layers deep. It can be a very difficult problem to manage, especially for inexperienced developers.

The problem goes something like this. In C++, you specify the interfaces to your methods, along with other supporting information, in a header file, or .h file. For example, in MySQL, you have a main include file with these includes (I've omitted most of the code for brevity):

    #ifndef _global_h               /* If not standard header */
    #include <sys/types.h>
    #include <custom_conf.h>
    #ifdef __LCC__
    #include <winsock.h>            /* For windows */
    #include "mysql_com.h"
    #include "mysql_version.h"

That doesn't look so bad, until you consider that some of these includes are compiled conditionally, so you really must know which compiler directives are set before you can decide definitively whether something gets included. Also, one of your include files might include another include file, like this line in mysql_version.h:

    #include <custom_conf.h>

In truth, this MySQL tree goes only three levels deep; it's an excellent example of how to code enterprise software in C++, and it's not usually this easy. Any dependency will have an include file, and if that code also has dependencies, you'll have to make sure those include files and their associated libraries get installed and put in the right place. Lather, rinse, repeat.

Java does not have this problem at all. You deal with only one type of source file, with one kind of import, and no conditional compilation.

Strings

Many of the largest corporations used C++ for enterprise application development, even though it had very limited support for managing strings. C programs simply used arrays of characters for strings, like this:

    char str[] = "Hello";

This allocates a fixed-length buffer for str. It's merely an array of six characters: the five letters of "Hello" plus a null terminator. It can never hold a string longer than five characters. You could decide to use the C++ string library instead.

C++ also supported the C-style string library for string-like operations. For example, to assign one string to another once the memory has been allocated, you copy the bytes instead, like this:

    strcpy (string1, string2);

C-style strings were ugly, dangerous, and tedious. As with any other kind of pointer manipulation, you can walk off the end of a block and create an error that may not be discovered for hours or months. C-style strings are far more tedious than the alternatives in other languages, including Java.

With the 1998 ANSI/ISO standard, C++ finally introduced a more formal string class. You could have a more natural representation that looked like this:

    std::string str = "Hello, I'm feeling a little better.";

And many C++ vendors shipped proprietary string classes of their own. But the damage was done: many programmers already knew C, and never adopted C++-style strings.

DLL Hell

On Microsoft operating systems and OS/2, you compiled libraries that might depend on other libraries. The operating system linked these together with a feature called dynamic link libraries (DLLs). But the OS did not do any kind of dependency checking. Because many applications shared versions of the same programming libraries, it was possible, and even probable, that installing your application would replace a library another application needed with an incompatible version. Microsoft operating systems still suffer from DLL Hell today.


CORBA

As the C++ community grew, its members looked to distribute their code in ways that went beyond client-server. Common Object Request Broker Architecture, or CORBA, emerged quickly. With CORBA, you could build applications from objects with well-defined interfaces, and you could take an object and, without adding any remoting logic, use it across the network. Companies like IBM tried to push a CORBA model into every object, while companies like Iona focused only on distributed interfaces around remote objects. The kindling around CORBA began to smolder, but never really caught fire. The distribution that was so transparent and helpful was actually too easy: people built applications that relied on fine-grained communication across the wire. Too many round trips led to poor performance, and to reputation problems for CORBA.

Inheritance Problems

C++ nudged the industry in tiny steps toward OOP, but the steps often proved awkward and counterproductive. C++ had at least three major problems:

  • C++ did not actually force object orientation. You could write functions that did not belong to any class. As a result, much of the code written in C++ was not really object-oriented at all; as the old joke about the language's name goes, the value of the expression (C++) is still just C.

  • C++ did not force one root object. That led to object trees with many different roots, which proved awkward for object-oriented developers.

  • C++ supported multiple inheritance. Programmers had not yet accumulated the wisdom, born from experience, needed to use multiple inheritance well. For this reason, many later languages adopted a cleaner alternative, called a mixin.

Multiple inheritance is a powerful tool in the right hands, but it can lead to significant problems for the novice. Example 2-2 shows an example of multiple inheritance in action. A Werewolf is part Man and part Wolf. Problems arise when both Man and Wolf inherit from a common class, called Mammal. If Werewolf then inherits a method introduced in Mammal, it's ambiguous whether Werewolf would inherit through Man or Wolf, as in Figure 2-2. This problem, known as the diamond inheritance problem , illustrates just one of the problems related to multiple inheritance.

Example 2-2. Multiple inheritance in C++
class Werewolf: public Man, public Wolf

Multiple inheritance is like any power tool. It gives you leverage and speed and can save you time, but you've got to have enough knowledge and experience to use it safely and effectively to keep all your fingers and toes. Most developers using C++ as an applications language had neither.

Figure 2-2. The diamond inheritance problem is just one of the complexities that can arise with multiple inheritance


Like Perl, C++ is most definitely an expressive language, but that flexibility comes at an incredible cost. C++ is full of features that might make sense to a seasoned developer but that have catastrophic effects at runtime. For example, = (assignment) is easily typed where == (comparison) was intended, and the compiler happily accepts the assignment as a test. Most new developers get burned by this problem. It takes years and years of study and experience to become proficient with C++. For systems development, that makes sense, because you ultimately need the performance and control that come with the ability to put every byte where you want it. Applications developers simply don't want to deal with those low-level details.


Most developers expected C++ to be more portable, but it didn't turn out that way. We were buried under mountains of incompatible libraries, and inconsistencies between libraries on different platforms. C++ left so much in the hands of the vendors implementing the spec that it turned out to be one of the least portable languages ever developed. In later years, problems got so bad that you often couldn't link against a library built by a different version of the same compiler, let alone one built on a different operating system.

Like mud accumulating on a boot, the language that once looked so cool on a resume began to weigh down the brightest developers, and stymie lesser developers completely. Instead of moving to a limited language like Visual Basic or Power Builder, they waited, and the storm clouds grew darker still.


You don't get a perfect storm without all the conditions. The primary success of the initial Java explosion rested on the extraordinary migration of the C++ community. To win that migration, Java had to walk a tightrope with excellent balance. C++ had some obvious warts: an awkward syntax, multiple inheritance, primitives rather than objects, a confusing typing model, poor strings, and awkward libraries. In some cases, Sun opted for a simpler, cleaner applications language; Java's research roots as an embedded language drove a simplicity that served it well. In other cases, it catered conservatively to the C++ community.

It's easy to look at Java now and criticize the founders for decisions made, but it's clear to me that they walked the tightrope very well. The rapid growth of the hype around Java and the community allowed a success that none of us could have possibly predicted. All of this happened amid an all-out war between Microsoft and IBM! If Java had stopped at this point, it would have been successful. But it didn't stop here. Not by a long shot.