The enduring appeal of Java isn’t hard to understand: You write code once, and it can run on almost any modern computer or operating system—PC or Mac, Windows, Linux, OS X, whatever. It works that way because the Java compiler turns the source code not into native machine code but into a kind of ersatz machine code, known as bytecode, that each of these different systems can execute when equipped with the proper run-time software. So different computers running different operating systems can all become, in programmers’ parlance, Java virtual machines.
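A trivial program (my own illustration, not one from this article) makes the point concrete: compile the file once with `javac Hello.java`, and the resulting `Hello.class` bytecode runs unchanged, via `java Hello`, on any machine with a Java run time installed.

```java
// Compile once (javac Hello.java); the bytecode in Hello.class
// then runs on any JVM, whatever the underlying OS.
public class Hello {
    public static void main(String[] args) {
        // The same bytecode reports a different host OS depending on
        // where it happens to be running.
        System.out.println("Hello from the JVM on "
                + System.getProperty("os.name"));
    }
}
```

The program itself never changes; only the run-time software beneath it does, which is exactly the portability trick described above.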
That wonderful write-once, run-anywhere capability has made Java the most popular programming language now in use and, by some measures, the most successful computer language of all time. But what you may not know, particularly if you don’t have graying hair, is that this kind of software portability didn’t start with Java (which Sun Microsystems released in 1995). Indeed, the roots of this approach date all the way back to the late 1970s and early 1980s, during the heyday of the Apple II, the first IBM PC, and many other personal microcomputers built by companies that are long gone.
Much of the relevant work took place at the University of California, San Diego (UCSD), and it influenced academic computer science, the design of the Pascal programming language, object-oriented programming, and graphical user interfaces. Although that work did not produce a commercial success, the story of these visionary programmers and their audacious plans offers some unique insights into how the computer industry evolved—for example, why the Apple Macintosh is what it is. It also explains how an accident of fate would later bring these ideas to the world again in the form of Java.