Remembering Steve Jobs and Apple Computer

Looking at the retrospectives on Steve Jobs after the news of his death, I noticed they mostly recalled the last 10 years of his life: the iPod, the iPhone, and finally the iPad. Most of the images show him as an old man, an elder of the digital age "who changed the world." This is not the Steve Jobs of my memories, but it is the one of our modern era, the one who created digital devices that run software in tightly controlled environments for the consumer market.

My fond memories of Apple began with some of the company’s first technology products, back in the early 1980s. I didn’t know who Steve Jobs was at first, but I found out rather quickly. My memories of him are as a vibrant young man who believed that small, personal computers were the wave of the future, though “small” meant something that would fit on your desk…

I can say that I “experienced” Steve through using the technology he helped develop.

Apple’s first big success: The Apple II

My first experience with their technology was the Apple II Plus computer.

The Apple II Plus, from Wikipedia.org

The electronics were designed by Steve Wozniak, who has been called a "Mozart" of electronics design. Through the creative use of select chips and circuitry, he was able to pack a lot of functionality into a small space. Jobs designed the case for the computer. At the time the first Apple II came out, in 1977, it was one of the first microcomputers that looked like what we might expect of a computer today. Most microcomputers of the time were kits that hardware hackers assembled themselves. The Apple was different, because you didn't have to build any part of it yourself if you didn't want to.

The thing I heard that was really appealing about the Apple when it launched was that the company was very open about how it worked. They wouldn't talk about 100% of everything in it, because some of it was proprietary, but they'd tell hackers about everything else. They said, "Go to town on the darn thing!" That was the reason it got an early lead on everyone else: Jobs recognized that its market was mainly software hobbyists. It appealed to people who wanted to do things with the electronics, to expand upon what was there, but it was targeted at people who wanted to manipulate the machine through software.

My first encounter with a II Plus was at my Jr. high school in 1982. The school had just three of them. One was in the library, and students had to sign up for time on it. The other two were owned by a couple of teachers, and were kept in their offices. The following school year my school got a computer lab, which was filled with Apple IIe's. That same year the local public library made an Apple II Plus available. Most of the programming I did in my teen years was done on the II.

It was a very simple, but very technical, machine by today's standards. When you'd start it up, it would come up in Applesoft Basic (written by Microsoft), a programming language environment that doubled as the computer's command-line interface and operating system. All you'd see was a right square bracket on the lower-left side of a black screen, and a blinking square for a cursor.

Applesoft Basic, from Wikipedia.org

It offered an environment that allowed you to run an app., and manage your files on disk, by typing commands. What I liked about it was that it offered commands that allowed me to do commonsensical things. With other 8-bit microcomputers I had used, I had to go through gyrations, or go to a special program, to maintain disk files. If I wanted to do some graphics, the commands were also right there in the language. With some other popular computer models, you had to do some complicated maneuvers to get that capability. It offered nothing for sound, though. The computer had several internal expansion slots, but it was not designed for sound out of the box. If you wanted real sound, you had to get something like a Mockingboard add-on that had its own synthesizer hardware. If you had nothing else, you had to go into machine language to "tweak" the computer's internal speaker, since it was only designed to beep. This was not "fixed" until the Apple IIGS, which came out in 1986. Regardless, Apple games tried their best to get sound out of the computer's internal speaker.

Basic was considered a beginner's programming language. It was less intimidating to learn, because the commands in the language looked kind of like English. Even though it was looked down upon by hackers, Basic was what I used on the Apple most of the time. It was technology from an era when learning something about how the computer operated was expected of those who bought one.

To really harness the computer's power you had to program in assembly language, or type bytes into the machine directly, using what was called the computer's built-in machine monitor. The square bracket prompt would change into an asterisk ("*"), and you were in no man's land. The Basic environment was turned off, and you were on your own, navigating the wilds of the machine's memory and built-in routines, giving commands and data to the machine in hexadecimal (a base-16 numbering system). This is what the pros used. You had total command of the machine. You also had all of the responsibility. There was no protected memory, and the machine gave you as much rope as you needed to hang yourself. If your program wandered off into disk operating system code by accident, you might corrupt the data you had on disk.

Most of the commercial software written for the Apple II series was written in this mode, or with a piece of software called an assembler, which let the programmer use mnemonic codes that were easier to deal with. Programs written in machine code ran faster than Basic code. Basic programs ran in what's called an interpreter, where the commands in the program were translated into executable code as the program ran, which was a slower process. As a workaround, some people used Basic compilers to translate their programs into a machine code version in one go, so they'd run faster.
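As a rough modern illustration of that interpreter-versus-compiler difference (in Python, which obviously has nothing to do with Applesoft), the cost of interpreting comes from re-parsing the same text on every pass, while compiling pays the parsing cost once up front:

```python
import time

expr = "(a * b + c) / d"        # stand-in for one line of a Basic program
env = {"a": 6, "b": 7, "c": 8, "d": 2}
N = 200_000

# Interpreted: the text is re-parsed every single time it executes.
start = time.perf_counter()
for _ in range(N):
    eval(expr, {}, env)
interpreted = time.perf_counter() - start

# "Compiled": parse the text once, then just run the finished code object.
code = compile(expr, "<expr>", "eval")
start = time.perf_counter()
for _ in range(N):
    eval(code, {}, env)
compiled = time.perf_counter() - start

print(f"re-parsed every time: {interpreted:.2f}s")
print(f"parsed once, run many times: {compiled:.2f}s")
```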

If you wanted to run a commercial app., you would insert a disk that contained the app. into the floppy disk drive, and reboot the machine. The app. would automatically load off of disk. If you wanted to run a different app., you’d typically remove the disk from the disk drive, and insert a new one, and repeat the process. There was no desktop environment to operate from. You booted into each program, and you could only run one program at a time.

This was pretty much the world of the Apple II. Once graphical interfaces became popular, Berkeley Softworks came out with a piece of software called GEOS that gave the II a workable graphical interface, though I think most Apple users thought it was a novelty, because most of the applications people cared about didn’t run on it.

Another big market for the II was in the public schools. For many years it was the de facto standard in computing in America’s schools. A lot of educational software was written for it.

Stickybear on the Apple II, from atarimagazines.org

A third big market opened up for it when VisiCalc (Visible Calculator) came out in 1979, written by Dan Bricklin and Bob Frankston. It was the world’s first commercial spreadsheet, and it came out first on the Apple II. It was the II’s “killer app,” a piece of software so sought after that people would buy the computer just to be able to use it.

VisiCalc, from Wikipedia.org

I first learned what a spreadsheet was by using VisiCalc. Modern spreadsheets use many of the same basic commands, and offer the same basic features that it pioneered. The two main features it lacked were macros and graphing. Each cell could contain a formula that could draw values from other cells into a calculation, but other than that it was not programmable.
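A rough sketch of that core idea, in Python rather than VisiCalc's actual notation: a cell is either a plain value or a formula that pulls in the values of other cells, and changing one input ripples through everything computed from it.

```python
# A toy spreadsheet: each cell holds either a value or a formula (a function
# that reads other cells). Illustrative only -- not VisiCalc's syntax.
cells = {
    "A1": 120.0,                                 # unit price
    "A2": 4,                                     # quantity
    "A3": lambda get: get("A1") * get("A2"),     # subtotal
    "A4": lambda get: get("A3") * 0.07,          # tax
    "A5": lambda get: get("A3") + get("A4"),     # total
}

def get(ref):
    cell = cells[ref]
    return cell(get) if callable(cell) else cell  # recalculate on demand

print(get("A5"))    # 513.6
cells["A2"] = 10    # change the quantity...
print(get("A5"))    # ...and the total follows: 1284.0
```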

An interesting bit of history from this era is that some of its software lives on. Castle Wolfenstein, by Muse Software, one of the popular games for the Apple II, has had quite a lot of staying power into our modern era. Remember Wolfenstein 3D by Id Software, and Return to Castle Wolfenstein on the PC? Wolfenstein started on the Apple II in 1981. The following video is from its sequel, Beyond Castle Wolfenstein, which came out in 1984, and it gives you the flavor of the game. Unlike its modern descendants, it was more of an adventure game than a pure shooter. The object was to pick up items that would help you reach your objective. Shooting was a part of the game, but it wasn't the only thing you had to do to get through it. As I remember, the ultimate goal was to blow up the castle.

Beyond Castle Wolfenstein

Another Apple original that has had a lot of staying power is Flight Simulator, originally written by a company called subLogic. They came out with Flight Simulator II, which was ported to a bunch of different computers. The second version of the product was a huge improvement on the original. It featured realistic maps of cities (as realistic as you could get with such a low-resolution display), colorized landscapes (rather than the wireframe graphics of the original), selectable weather conditions, and a variety of aircraft you could fly. Later, expansion disks came out for it that featured maps of real cities you could fly to. Microsoft licensed Flight Simulator for the IBM PC and developed all of its subsequent versions on that platform.

The original Flight Simulator on the Apple II

Their flops

Apple had some early flops. The first was a now-little-known computer called the Apple III, which came out in 1980. It was a somewhat faster machine than the II, built with similar technology, and it was designed and marketed as a business computer. Unlike the II, it had an 80-column text display; the II had a 40-column display, though in the early 1980s you could get 80-column expansion cards for the IIe. The III also had a higher memory capacity, and it was backward-compatible with the II through a compatibility mode.

The Apple III, from Wikipedia

Their next flop came soon after, the Apple Lisa, which came out in 1983.

The Apple Lisa, from Wikipedia

A screen from the Lisa, from Wikipedia

It was also marketed as a business computer. Most people give props to the Macintosh as Apple's first computer with a graphical user interface and a mouse, but it was the Lisa that had that distinction. This was Apple's first crack at the idea. It had some pretty advanced features for microcomputers at the time, the main one being multi-tasking: it could run more than one application at a time. Its biggest problem was its price, about $10,000. Unlike the Apple III, the Lisa had some staying power. Apple marketed it for the next few years, trying variations on it to improve its appeal.

I had the opportunity to spend a little time with a Lisa at a computer show in the mid-1980s. It had a calendaring desk accessory that was a revelation to me. It was the first of its kind I had seen. In some ways it looked a lot like iCal on OS X, and my memory is that it functioned much the same way. It would give you a monthly, weekly, and daily calendar view. If you wanted to schedule an event for it to alert you about, you entered information on a form (which looked like a conventional dialog box), and when that date and time came up, it would alert you with a reminder.

When I was in Jr. high and high school, I used to carry around a pocket spiral-bound notebook so I could write down assignments, and when I had tests coming up. It looked pretty messy most weeks. I really wanted a way to keep my schedule sane, and the Lisa demonstrated that a computer could do that. I didn't have regular access to a Lisa, though, and there was absolutely no way my mother could afford to get me one, especially just to give me something with a neat calendar! So in high school I set out to create my own weekly planner app. on an Apple II, using Basic. I didn't own one, but the school had lots of them, and I figured I could use those, or the one at the local public library, which I used regularly as well. I wrote about the development of it here. I called my app. "Week-In-Advance," and I wanted it to have something of the feel of the Lisa calendar app. I had seen, so I created a small "graphical interface" in text mode. I succeeded in my efforts, and it showed me how hard it is to create something that's easy to use! It was the biggest app. I had written up to that point.

The Macintosh

If you’re a modern Mac user, this was kind of its great-granddaddy… Anyway, it’s related. I’ll explain later. It came out in 1984, and was Steve Jobs’s baby.

The first Macintosh, from Wikipedia

I had the thought recently that Jobs invented the idea of the “beige case” for a computer with the Macintosh, which PC makers followed for years during the 1990s, and everyone got tired of it.

This almost was Apple’s third flop. It created a sensation, because of its simple design, and ease of use. Steve Jobs called it “The computer for the rest of us.” It was targeted at non-techie users who just wanted to get something done. They didn’t want to have to mess with the machine, or understand it. The philosophy was it should understand us, and what we wanted.

My local public library got a Mac for people to use a year after it came out. So I got plenty of time using one.

It was a cheaper version of the Lisa, so it was more accessible, but there wasn’t a whole lot you could do with it at first. The only applications available for it at its launch were from Apple: MacPaint, a drawing/painting program (rather like Microsoft Paint today, except with only two colors, and a bunch of patterns with which you could paint on the screen), and MacWrite, a word processor. Just from looking at it, you can tell that no Apple II programs would run on it, and I don’t think you’d want that anyway.

As you can see, it had a monochrome display. It could only display two colors, white and black. This drew some criticism, but it was still a useful machine. The Mac wouldn’t have a color display until the Macintosh II came out in 1987. Incidentally, other platforms had color graphical interfaces a year after the Mac first came out. There was GEM (Graphics Environment Manager) by Digital Research (which was mainly for the IBM PC), the Atari ST (which used GEM), and the Commodore Amiga, not to mention Version 1.0 of Microsoft Windows.

The Mac was probably the first computer Apple produced that represented a closed design. The first Macs were hardly expandable at all. You could expand their memory capacity, but that was it. It had a built-in floppy drive, but no internal hard drive, and no ability to put one inside. The Mac popularized the 3-1/2″ floppy disk format. Before it came along the standard was 5-1/4″ disks. It had an external connector for a hard drive, so any hard drive had to exist outside the case. It had some external ports so it could be hooked up to a printer, and I believe a phone modem.

In that era we were all used to floppy drives making noises. The first Mac's floppy drive was also a bit noisy, but it had a "hum" to it as it spun the disk, as if it were humming some tune that only it knew. (Bill Gates and Steve Jobs, when they talked about the development of the Mac at their D5 Conference appearance (below), mentioned the "Twiggy" drive, though that name actually belonged to the 5-1/4″ drive Apple had developed for the Lisa and early Mac prototypes; the Mac shipped with a Sony 3-1/2″ mechanism.) The reason for the sound, I later discovered, is that the drive used a data compression technique that Steve Wozniak had developed for the Apple II's disk drives, called Group Code Recording (GCR), in an effort to store data uniformly on the disk and fit more onto it. To do this, they varied the speed of the drive depending on where the read/write head was on the disk. You have to understand a little something about physics to get why they did this.

(Update: 10-21-2011: I realized after doing some research that I held a mistaken notion that the variable-speed drive was a way of achieving data compression. GCR was really an encoding scheme, not compression. The reason they varied the speed of the drive was to keep the disk surface moving under the read/write head at a roughly constant rate, so data could be recorded at about the same density everywhere. That meant the tracks toward the outer edge could hold more sectors than the tracks close to the center. In effect, they inverted the "normal" way of storing data, and so more evenly distributed data on floppy disks. This article explains it in more detail, under Technical Details: Formatting: Group Code Recording.)

The disk drives on other computers packed data more densely on the inner tracks of a disk than on the outer tracks. The reason was the disk was always spun at the same speed in those drives, no matter where the read/write head was, and the head always read and wrote data at a constant rate. In physics we know, though, that when you rotate anything at a constant rate, a point near the center moves more slowly than a point farther from the center. As a result, the drive ends up packing more data into a smaller space near the center of the disk than near the outside of it; the "density" of the data varies depending on how far from the center it's stored. Imagine dropping bits of material on a piece of paper that's sliding beneath your hand: if you speed up the paper, the bits of material will be more spread out on it. I'm not sure how this was handled on the Apple II drives, but on the Mac, the way they dealt with this was to spin the disk faster when the head moved toward the center, and slower as it moved to the outer edge, thereby generating different "motor" sounds as the drive sped up and slowed down the disk.

Their GCR format, combined with the variable-speed drive, made it possible to store a little more data per disk than most other drives could. Looking back on it, this technique might seem trivial, given the amount of data we store today, but back then it was rather significant. A conventional double-sided, double-density 3-1/2″ disk drive could store 720K on a disk, but a Mac could store 800K on the same disk, about 11% more space per disk.
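A little back-of-the-envelope sketch of both points, in Python (the track radii here are made up for illustration; only the 720K and 800K figures come from the text above):

```python
import math

# The capacity difference quoted above: 800K vs. 720K per double-density disk.
print(f"extra space: {(800 / 720 - 1) * 100:.0f}%")     # ~11%

# Why a constant-speed drive wastes the outer tracks: at a fixed rotation rate,
# the surface moves faster under the head the farther out the track is, so a
# fixed data rate packs bits tightly near the center and sparsely near the edge.
# Varying the spindle speed (as the Mac drive did) keeps the surface speed, and
# so the bit density, roughly constant across the disk.
RPM = 300                                   # a typical constant rotation rate
for radius_mm in (20, 30, 40):              # hypothetical inner/middle/outer tracks
    surface_speed = 2 * math.pi * radius_mm * (RPM / 60)   # mm per second
    print(f"track at {radius_mm} mm: {surface_speed:.0f} mm/s under the head")
```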

Back then most computer users didn’t have a lot of data to store. Applications and games were small in size. Documents might be 30K at most. Most people didn’t think of storing large images, and certainly not videos, on these early machines. The amount of data being passed around on networks tended to be pretty small as well. So even what seems like a piddly amount of extra disk space now was significant then.

Edit 7/9/2019: I found a video from YouTube user “Jason’s Macintosh Museum,” using a 128K Macintosh, running its Guided Tour disk. It gives you a feel for what running the original Mac was like.

Edit 10-17-2011: A minor point to add. You'll notice in the picture of the Mac that there's no disk eject button. This was because the computer apparently did some housekeeping tasks using your disk, and ejecting it at any arbitrary moment could damage the data on it and/or crash the system. I have no idea what these housekeeping tasks were, but whenever the user wanted to eject the disk (which you could do by selecting a menu option, pressing a command key combination, or dragging the disk icon to the trashcan icon on the desktop), the computer would spend time writing to the disk before ejecting it via a servo mechanism. Sometimes it would spend a significant amount of time doing this. It may have been an operating system bug, but I saw instances where it would take 5 minutes to eject the disk! All the while the disk drive would be running…doing something, you knew not what. In some rare instances the computer wouldn't eject the disk at all, no matter what you tried. For those situations it had a pinhole just to the right of the disk slot, which you can see in the picture if you look closely, that you could stick the end of a paperclip into to manually force the drive mechanism to eject the disk. I remember seeing a vendor once that sold nice colored "disk ejectors," which had a handle that looked like a USB thumb drive and a pin at one end that you'd stick into this hole. A paperclip did the trick just fine, though.

A major difference between the Mac and other computers of the day was it did not come with a programming language. There were programming languages available for it, but you had to get them separately. It was a bold departure from the hacker culture that had influenced all the other computers in the marketplace.

In contrast to the computers I had used before, including the Apple II, the experience of using the Mac was elegant and beautiful. It had the highest display resolution of any computer I had used, and I loved the fact that I could look at images in greater, finer detail. On the desktop interface, windows appeared to "warp" in and out of view, though it was really just the window's frame that moved; the machine didn't have enough computing power to animate a whole window's contents. The point is they didn't just pop into existence. You get that effect on the modern Mac UI as well, and it looks a lot neater.

The main drawback was that it could only run one app. at a time. Even so, it was possible to cut, copy, and paste across applications. How was this done? Well, the Macintosh System software (as the operating system was called at the time) had what we're all familiar with, a clipboard. It also had a desk accessory called "Scrapbook" that allowed you to add multiple images and document clippings to it. You could add clippings by using the cut, copy, and paste feature in an application. Scrapbook would automatically cache these clippings, possibly saving them to disk (this reminds me that I used to do a lot of disk swapping with the old Macs, which was a reason that more financially well-endowed users bought a second floppy drive, or a hard drive). The scrapbook was finite in size, and would eventually cycle out old clippings, as I recall. Anyway, when you'd load a new application, what was in the clipboard would remain available, so you could paste it. In addition, if you had put clippings in the Scrapbook, those were also available. Desk accessories could be run at any time, so you could open Scrapbook while you were using an app., go through its clippings, and grab anything else you wanted to paste. Needless to say, cutting and pasting across applications was an involved process.
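A minimal sketch of why this worked despite single-tasking (in Python, purely illustrative; the real System software did this at a much lower level): as long as the clipping lives somewhere that outlasts the app that copied it, the next app you boot into can still paste it.

```python
import json
from pathlib import Path

# Stand-in for storage that survives quitting one app and launching the next.
CLIPBOARD = Path("clipboard.json")

def copy(clipping: str) -> None:
    """What an app does on Copy: stash the clipping outside its own memory."""
    CLIPBOARD.write_text(json.dumps({"clipping": clipping}))

def paste() -> str:
    """What the next app does on Paste: read whatever the last app left behind."""
    return json.loads(CLIPBOARD.read_text())["clipping"]

# "App 1" (say, MacWrite) copies something, then the user quits it...
copy("Dear Sir: please find enclosed my order for one (1) Macintosh.")
# ...and boots "App 2" (say, MacPaint), which can still paste the clipping.
print(paste())
```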

This was soon fixed by a system utility written by Andy Hertzfeld called Switcher, once the Mac was given more memory (the very first model came with a total of 128 kilobytes of memory, and so wasn’t such a great candidate for this). The idea was to enable users to load multiple apps. into memory, and allow them to switch between them without having to quit out of one to get to the others. It enabled you to go back to the desktop to launch an app. without having to quit out of the apps you had already loaded. It was rather like how apps. on mobile devices work now.

I read up on the history of Switcher a few years ago. Microsoft was very enthusiastic about it at the time, because they recognized that users would buy more apps. if it was easier to launch more than one at a time. It was really nice to use. It was like using OS X’s multiple desktop feature, except that you could only see one app. on the screen at a time. It had the same effect, though. When you’d switch, the app. you were using would “slide” out of view, and the new one would slide on right behind it. It was like you were shifting your gaze from one app. to the other. It worked really well for early Mac apps., because there was no reason to be doing background processing with what was available. It created the illusion that all apps. were “running” at the same time, when they really weren’t. All the other apps. were in suspended animation when they weren’t in view. Copying clippings and pasting between apps. became a breeze, though.
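A loose modern analogy for that suspended-animation behavior (in Python; Switcher itself worked nothing like this internally): each app only runs while it "has the screen," and switching away freezes it exactly where it left off.

```python
# Each "app" is a routine that runs only while it holds the screen. Switching
# suspends it mid-stride; switching back resumes it exactly where it stopped.
def make_app(name):
    step = 0
    while True:
        step += 1
        print(f"[{name}] doing a bit of work (step {step})")
        yield                     # hand the screen back; frozen here until resumed

apps = {"MacWrite": make_app("MacWrite"), "MacPaint": make_app("MacPaint")}

def switch_to(name):
    print(f"--- switch to {name} ---")
    next(apps[name])              # resume this app; all the others stay suspended

switch_to("MacWrite")
switch_to("MacPaint")
switch_to("MacWrite")             # picks up at step 2, right where it was frozen
```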

It was said of Steve Jobs that he had high standards that drove engineers at Apple nuts, but it seems to me he was willing to compromise on some things. The original Mac had a monochrome display, which I’m sure he knew wasn’t as exciting as color. It was a single-tasking machine, so in the beginning, people were running and quitting out of applications a lot. It had a small amount of memory for what it did, and so you couldn’t load multiple apps., which made multi-tasking impractical. You couldn’t cut and paste things between applications easily without the Switcher add-on, which came out about a year after the Mac was released. I’m sure all of these compromises were made to keep the price point low.

The Mac had its critics early on, who called it a "child's toy": "Productive people don't need cute icons and menus to get work done. They get in the way." There were a lot of advocates for the command-line interface over the GUI in those days. In a way, they were right. Alan Kay said years later that the reason they came up with an iconic, point-and-click graphical interface at Xerox PARC, which the Lisa and Mac drew inspiration from, was to create an easy-to-use environment for children. That's not to say that a graphical interface has to be for children, but this is the model Apple drew from.

Jobs leaves Apple

The unthinkable happened at Apple in 1985. Jobs was ousted from the Mac project, pushed out by John Sculley, the CEO he had hand-picked, and he left the company. I remember reading about it in InfoWorld, and being kind of shocked. How could the man who started the company be ousted? How could they think they could just get rid of this creative person and keep it the same company? Would Apple survive without him? It felt like something was wrong with this. I wasn't a big Apple fan at the time, but I knew enough to know that this was a big deal.

After this, I lost track of Jobs for a while. Apple seemed to just move along without him. As I mentioned earlier, they came out with newer, better computers. The Apple Mac had grown to 10% market share by the end of the 1980s, an impressive feat when PCs were growing in dominance by leaps and bounds. In hindsight, the only thing I can point to as a possible problem is they made only incremental improvements to the Mac. They coasted, and for a few years they got away with it.

The only thing about this period that I thought sucked at the time was that Apple was suing every computer maker in sight that had a graphical interface, claiming they had violated its copyrights. It had the appearance that they were trying to kill off any competition. The only company they won against was Digital Research, with its GEM environment, and then only the PC version, which died out shortly thereafter. It was getting so bad that the technology press was calling on Apple to quit it, saying they were stifling innovation in the industry. It didn't matter anyway, because Microsoft Windows was coming along, and it would eventually eat Apple's lunch. Microsoft might've actually had Apple to thank for Windows's success, since Apple probably weakened Microsoft's other competitors.

Nevertheless, Apple seemed to be succeeding without Jobs for a time. It was only when Sculley left in the early 1990s that things went downhill for them, from what I could see.

NeXT

I rediscovered Jobs a bit in college, when I heard about his new venture that created the NeXT computer in 1988.

The NeXT computer, from simson.net

The keyboard, monitor, and laser printer for the NeXT, from simson.net

The NeXTStep interface, from Wikipedia

Come to think of it, maybe Jobs was trying to communicate something in the name…

At the time the NeXT came out, it seemed futuristic. The computer was shaped like a cube. The case was made out of magnesium, and it featured a removable magneto-optical drive (removable in the sense that you could take the magneto-optical disk, which was housed in a cartridge, out of the drive). Each disk held 256 MB, which was a lot in those days; most people who had hard drives had a quarter of that storage capacity at most. The disk's recording surface was a magnetic metal alloy. The way the drive worked was that a laser would heat the spot on the disk it wanted to write to up to what's called the Curie point (a specific temperature for the material), so that the magnetic write head could flip the polarity of just that spot. Pretty complicated! I guess this was the only way they could find at the time to achieve that kind of rewritable storage capacity on a removable disk, probably because it allowed for a relatively large, imprecise write head. Only the part of the disk that was heated by the narrow laser beam would change, so the only component that had to be terribly precise was the laser (and the read head).
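Here's a toy illustration of that last point in Python (the numbers and layout are made up, not the NeXT drive's real geometry): the magnetic field can cover a wide swath, but only the one cell heated past its Curie point actually flips.

```python
# Toy model of magneto-optical writing: a track of magnetized cells. The write
# head's field is broad and imprecise, but a cell only flips if the laser has
# also heated it past its Curie point.
CURIE_POINT = 300            # arbitrary units; real media have specific Curie temps
track = [0] * 20             # 20 cells, all magnetized to "0"
temps = [25] * 20            # everything starts out cool

def write_bit(position, value, field_width=5):
    temps[position] = 350                        # narrow laser spot heats one cell
    lo = max(0, position - field_width // 2)     # the broad magnetic field covers
    hi = min(len(track), position + field_width // 2 + 1)   # several cells at once
    for i in range(lo, hi):
        if temps[i] >= CURIE_POINT:              # only the hot cell can be flipped
            track[i] = value
    temps[position] = 25                         # the spot cools back down

write_bit(10, 1)
print(track)    # only cell 10 changed, even though the field covered cells 8-12
```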

Out of the gate, the NeXT computer’s operating system was based on Unix, using the Mach kernel, as I recall. It used Objective C as the standard development language for the system, and was accompanied by some visual user interface design tools. The display system used a technology called Display PostScript to create a good WYSIWYG (What You See Is What You Get) environment.

In 1990, NeXT made a splash by announcing a slimmer, sleeker model, nicknamed the "pizza box" because, its nice looks aside, that's what it was shaped like. The magneto-optical drive was gone, replaced by a hard drive and a high-density floppy drive. The big feature that got developers' attention was a Motorola digital signal processor (DSP) chip that was built into it. One of the ways it was used was to calculate complex mathematical equations at high speed, taking that load off of the main processor.

The second-generation NeXT computer, from Wikipedia

I got only a few chances to use a NeXT, and only for a brief time. Again, the computer was way out of my price range. It seemed nice enough. It had that same feel the Mac had, where it would take care of things (just fine touches) so you didn't have to think about them. I remember having an "information" dialog open on a file, and doing something to the file. Rather than my having to refresh the information window, it updated itself automatically in the background. We take stuff like this for granted now, but back then I noticed it, because no other computer acted that way.

Doing some research in retrospect about a year ago, I found a demo video that Jobs had made of the second-generation NeXT computer. I discovered that they had designed software for it so it could be used as an office platform. You could embed things like spreadsheets, sales presentations, and audio clips in e-mails you'd send to people. This was before most people had even heard of the internet, and before the MIME protocol was developed. The machine had advanced video hardware so that you could watch high-quality digital color video, which was really hard for most computers to do then. They had also shown an example of a subscription app., demonstrating how you could read your daily issue of the New York Times online. This was done around the same time that the very first web browser was invented. If this rings a bell, that's because Apple has done demos like this within the last few years, as did Microsoft when they first introduced Windows Vista.

A little trivia: Tim Berners-Lee wrote the world's first web server and the world's first web browser on a NeXT workstation.

The world’s first web browser

Once I got out into the work world, in the mid-90s, I read that things weren't going so well for NeXT. They eventually sold off their hardware division to Canon. However, things weren't looking totally down for Jobs. I learned that he had also been heading up a company called Pixar. "Toy Story" came out, and it was amazing. The computer graphics weren't as impressive an effect as "Jurassic Park," which had come out a couple of years earlier, but I was still pretty impressed, because it was the first feature-length movie done entirely with computer graphics. Mostly what appealed to me were the memories of the toys I had as a kid. The story was good, too.

It seemed like NeXT was on its last legs when Apple bought the company in late 1996. Apple wasn't doing so hot either, but it obviously had more cash on hand. The joke was on the people who did the deal, though, because in less than a year Jobs was back on top and in charge at Apple.

In short order we had the iMacs, and amazingly they were selling like hotcakes. Apparently their shape and their color were what appealed most to customers, not what the computer actually did! No more beige boxes! Yay! Uh…and where did they come from? Eh, not important…

The first iMac, from Wikipedia

Jobs did some things that surprised me after he took over. He cancelled HyperCard, one of the most innovative pieces of software Apple ever produced. HyperCard was a multimedia authoring environment that enabled programming neophytes to write their own programs on the Mac. You didn't even need to know a programming language; it was a visual programming environment. You just needed to arrange media elements into a series of steps ("cards" in a "deck"), and set up buttons or links to trigger actions. The closest equivalent on modern Macs is a program called "Automator," though I've tried using it, and it feels clunky. HyperCard had long been treated as a neglected stepchild at Apple, so in a way Jobs was putting it out of its misery.

He cancelled the Newton, Apple's PDA. It had become the most popular handheld computer used by hospitals. As a result, they all had to find some other mobile platform to use, and all of their mobile software had to be rewritten, because the Newton's operating system architecture and development language were proprietary.

Edit 10-13-2011: Thirdly, he cancelled Apple’s clone licensing program, which killed off Mac clone makers. This, to me, was more understandable from Apple’s perspective.

There had been efforts to make Mac clones in the 1980s; my memory is they were all the product of reverse-engineering. Apple finally allowed "genuine" Mac clones, under a license agreement, in the 1990s, with Apple of course retaining ownership of Mac OS. My memory is this happened after Sculley left. I could understand the appeal of the idea, since PCs (of the IBM variety) solidified their dominance in the market once clones came out. It didn't work out the same way for Apple, however. There were a few things this strategy probably didn't take into account. One is that Microsoft had to handle a lot more variety in hardware in their operating system in order to make the clone market work; I vaguely remember hearing about compatibility and stability problems with Mac clones. Secondly, the PC clone manufacturers had to accept much lower profit margins than IBM did when it owned the hardware market. Thirdly, Microsoft didn't depend on hardware for its revenue; Apple's business model did, and they were allowing competitors to undercut them on price. For Apple it was rather like Sun's strategy with Java: offer a loss-leader in a major, desirable technology the company owned the rights to, in hopes of gaining revenue on the back end in a market that was increasingly perceived as commoditized, which…didn't really work out for them.

In a few years, NeXTStep would take over the Mac, in the form of OS X. One of the things Jobs commented on in recent years was that after he left Apple in 1985, they just didn't innovate like they had under his direction, and the Mac needed to catch up. So they transplanted NeXT's work of 1992 into the Mac of ten years later!

Even though the OS X interface looks a lot different from the NeXT's, under the covers it's NeXTStep. The display technology is derived from what was used on the NeXT. The NeXT operating system was Unix-based, as is OS X. Objective C was brought into the Mac platform as a supported language. In essence, OS X has been the "next" iteration of the NeXT computer; like a phoenix, it rose again. This was apparently part of the deal Apple made in buying the company. They recognized that a next-generation OS was needed for the aging Mac, and from the beginning they had planned to use NeXT technology to build it.

Old apps. written for Mac OS would no longer run on the new system unless they were "carbonized," which involved modifying and recompiling them against a compatibility library Apple provided. The problem was that if you depended on a Mac OS app. written by a software company that was no longer in business, your best bet was not to upgrade.

Things were not all rosy in paradise, though. Apparently the transition from Mac OS to OS X was rough. I remember hearing vague complaints from Mac users about the switchover; they really didn't get the memo that it was a whole new operating system that worked differently from what they had been used to for years. It may not have been entirely their fault, in the sense of not being willing to learn a new system: I remember hearing complaints about system instability as well.

To their credit, Apple quickly fixed a lot of the stability problems, from what I understand.

This was only for starters. Rather than focus solely on developing the desktop computer market, since Jobs said that Microsoft had “won that battle,” he took Apple in a whole new direction by saying that they should develop mobile devices “for the rest of us.” Apple has also been capturing the market for electronic publishing, with iTunes and the App Store. This combination has been the source of its meteoric success since then.

Unlike “the rest of the world,” I was never that enthused about Apple’s new direction. I haven’t owned an iPod, or any of their other mobile devices. I have an old Pocket PC, and a digital camera that I use. I bought my first Apple product, a MacBook Pro, in 2008, and aside from some kinks that needed to be worked out, it’s been a nice experience.

For the longest time I was not that big of an Apple fan. When I met other Apple users they often came across as elitist, like they had the bucks to buy the best technology, and they knew it. That turned me off. I used stuff from Apple from time to time, but I liked other technology better, because my priorities were different from most people's. Nevertheless, there was something about Steve Jobs I liked. He had a creative, innovative spirit, and I liked that he cared about quality. Ironically, Apple's products always seemed more conservative than my tastes. It was an adjustment to use my current laptop. It's allowed enough flexibility that I don't feel totally hemmed in by its "ease of use," but there are a few small things I miss.

Jobs was an inspiration. Like a few other people I've encountered in my life, he was someone I followed and kept track of with interest for many years. He gave us technology that was worth our time to use. What I appreciated most about him was that he pushed beyond what was widely thought of as "the way things are" in computing. Unlike most Apple fans and followers, I haven't seen that much in the way of original ideas from him. What I credit him with is taking the best ideas others came up with, paring them down so that the average person could understand them, having the courage to make products out of them when no one else would, and then marketing the hell out of them. Part of what mattered to him was what computers made possible, and the experience of using them. In the beginning of all this, it seemed like his dreams were far out ahead of where most people were with respect to technology. On his return to Apple, he stayed out ahead, but he seemed to have a keen sense of not getting too far ahead of customers' expectations. I think he discovered that there's no virtue, at least in business, in getting too far out ahead of the crowd you're trying to impress.

There were a couple of really memorable moments with Jobs in the last 10 years that I'd like to cover. The first was his 2005 commencement address to the students at Stanford. In it he reveals some things about his life story that he had kept close to the vest for years. He had some good things to say about death as well. It's one of the most inspirational speeches I've heard.

Below is a really great joint interview with Jobs and Bill Gates at the D5 Conference in 2007. It was interesting, engaging, and funny. It covers some of the history that I’ve talked about here, and what we’ve seen from Apple and Microsoft in the present.

This was a rare thing. I think it was one of only three times that Jobs and Gates appeared together in public, and contrary to the mythology that they were rivals who hated each other… well, they were rivals, but they got along swimmingly.

Just a little background: the intro music you hear is "Love Will Find A Way," by Yes. Mitch Kapor, who's introduced in the audience, was the founder of Lotus Software. He developed the Lotus 1-2-3 spreadsheet for the PC (Lotus was bought by IBM in the 1990s). Last I heard, some years ago, he had become a major advocate for open source software.

Jobs quoted Alan Kay as saying, "People that love software want to do their own hardware." Maybe he did say that, but the quote I remember is, "People who are really serious about software *should* make their own hardware." When I first heard that, I thought Kay was issuing a challenge to programmers, like, "Real programmers make their own hardware," but I later realized what he probably meant was that software developers should take control away from the hardware engineers, because the hardware they had created, which was being used in computers, was a crappy design. So what he was probably saying was that really good software people would be better at designing hardware to run their software. The way Jobs expressed it is a shallow interpretation of what Kay said, because Kay was very critical of what both Motorola and Intel did in their hardware designs, and Apple has mainly used hardware from those two companies for the main chipsets of its 16-, 32-, and 64-bit computers.

Note: There is a 7-minute introduction before the event with Jobs and Gates starts.

Gates said something towards the end that struck me, because it really showed the friendship between the two of them. He said, totally unprompted, that he wished he had Jobs’s taste. Jobs was famously quoted as saying in “Triumph of the Nerds”:

The only problem with Microsoft is they just have no taste. They have absolutely no taste. I don’t mean that in a small way. I mean that in a big way, in the sense that they don’t think of original ideas, and they don’t bring much culture into their product. And you say, “Why is that important?” Well, proportionally-spaced fonts come from typesetting and beautiful books. That’s where one gets the idea. If it weren’t for the Mac, they would never have that in their products. And so I guess I am saddened–not by Microsoft’s success. I have no problem with their success. They’ve earned their success, for the most part. I have a problem with the fact that they just make really third-rate products.

It felt like things had come full circle.

Saying goodbye

I was a bit shocked to hear of Jobs’s death last Wednesday. I knew that his health had been declining, but I thought he might live another year or so. He had only stepped down as Apple’s CEO in late August. In hindsight, though, it makes sense. He loved what he did. Rather than retire, and decline in obscurity, he held on until he couldn’t hold on any longer.

I’ve felt a little sad about his death at times. I know that Jobs’s favorite music was the Beatles and Bob Dylan, but on the day he died, this song, “It’s So Hard To Say Goodbye To Yesterday,” by Boyz II Men was running through my head. It expresses my sentiments pretty well.

Bye, Steve.

—Mark Miller, https://tekkie.wordpress.com
