Some more of the computing industry’s greatest mistakes

Part 2: The 16-bit era

Welcome back to The Reg FOSS desk’s roundup of the slip-ups and missteps from the dawn of the microcomputer industry onward – at least those that are most memorable to us.

In part 1 of this feature series, we took a look at some of the missed chances of the early era of mass-market microcomputers: the eight-bit machines. That era overlaps substantially with the following generation of 16-bit machines.

Quantum Leaps… in wrong direc-TI-ons

Sinclair Research did very well with the inexpensive ZX81, and had another hit with the ZX Spectrum. The next machine it launched, though, was one of Sir Clive’s most critical blunders.

Aside from references to George Orwell’s totalitarian nightmare fuel, nerdier types remember 1984 as the year the first Apple Macintosh debuted. Apple launched it on January 24 – nearly two weeks after Sinclair Research launched its next-generation 16-bit computer, the Sinclair QL.

The big mistake? Sinclair bet that multitasking would be the killer feature for this market. By the end of the month in which the machine was launched – long before it actually started shipping – it was clear that what buyers really wanted was not multiple apps at once, but the point-and-click ease of a GUI.

And yes, we do remember that Texas Instruments launched the 16-bit TI-99 range years before the QL, but the TI-99 was so cut down and crippled that the IEEE called it TI’s biggest blunder.

Coupling a 16-bit CPU with eight-bit main RAM, accessed through the eight-bit video controller chip, the TI-99/4A ended up priced like an eight-bit machine – with the sluggish performance to match. We’re afraid that we consider the 16-bit TI-99 machines to be honorary eight-bitters.

Atari, the Japanese for check!

The stories of Atari and Commodore could be interpreted as a braid of intertwined mistakes, disagreements, and misunderstandings. Commodore’s founder, the formidable Jack Tramiel, built the company into a billion-dollar business. He then left to start Tramiel Technology Inc, bought what was left of Atari from Warner, and rebranded his new company as Atari Corporation.

Atari had been a fading force; several years before the C64, it launched the Atari 400 and Atari 800. The low-end model was cursed with a “keyboard” – we use the word loosely – as nasty as the ZX81’s, so it didn’t look the part, but these early eight-bit machines were the most capable of their era. For the late 1970s, they had amazing graphics and sound, largely thanks to their custom chips: ANTIC, POKEY, and CTIA. But Atari didn’t capitalize on its technological lead, and lost its design team, which quit and went on to create the Hi-Toro Lorraine.

But just 18 months after the original $2,495 Macintosh, the new, revivified Atari’s rejoinder was the Atari ST. The ST had 512 KB of RAM – four times as much as the Mac – but it cost just $799, under a third of Apple’s launch price.

Nicknamed the “Jackintosh” after Tramiel, the ST bore the apt slogan “Power without the Price.” With inexpensive hardware add-ons containing Apple ROM chips, it could even run MacOS and Mac apps.

A month later, Tramiel’s previous company launched a renamed version of the Hi-Toro Lorraine, now dubbed the Commodore Amiga. The Amiga had much better graphics and sound, thanks to Atari’s former chipset design team, but the original Amiga 1000 model, with just half as much RAM as the ST, cost a cool $1,295.

At the beginning, there was clear demarcation. The ST was a clever agglomeration of off-the-shelf hardware and software and a bought-in OS from Digital Research, loosely based on DR’s CP/M-68K. It used PC-compatible FAT-format diskettes and had built-in MIDI ports. For its time, it was quite fast but remarkably affordable.

In contrast, the Amiga was the more capable machine. One example of its professional chops, outside the home and gaming markets, was the industry-redefining Video Toaster. Used to create Babylon 5’s special effects, it was one of the Amiga’s killer devices in the pro market. Its video prowess even led to a post-Commodore Amiga-compatible computer, MacroSystem’s DraCo.

In 1990, Commodore launched a new top-end Amiga, the A3000. It could even run Amiga Unix, and no less a company than Sun exhibited A3000s at Uniforum. In a splendid example of Commodore’s skillful cooperation with industry partners, the new machine’s case was too small to hold a Video Toaster.

In this period, both companies spent far too much effort trying to outdo one another. For instance, not wanting to be left out, Atari also briefly offered a UNIX machine, based on its TT030 workstation.

Both companies also offered ranges of x86 PC-compatible computers. Atari offered a whole own-brand PC range, while Commodore started with 8088 and 8086 machines, and later the Select 286 and even a few 386SX models. To us, looking back from the 21st century, this seems like an admission that their proprietary models weren’t viable business machines.

We feel that Atari should have leaned in harder on its DOS-like OS and PC media compatibility, and embraced PC standards for cases, expansion, and so on, while Commodore should have moved upmarket into workstation-class machines, but this is merely idle speculation. In the end, rather than trying to differentiate their product ranges, Commodore and Atari went all-in on a price war… and both lost.

Acorn’s secret 16-bit fling

In the mid-1980s, the market was moving to 16-bit machines. As far as its popular machines went, Acorn seemed to skip this generation.

Between the influential eight-bit BBC Micro range and the pioneering 32-bit Archimedes series, Acorn did briefly dabble in 16-bit kit.

The widely told story is that Acorn surveyed the market, evaluating 16-bit chips from Intel, Motorola, National Semiconductor, and others, before deciding to go it alone with the original ARM processor – a story The Register recounted when commemorating the ARM1.

Making that decision was a little more involved than it sounds. The company that Furber and Wilson visited was the Western Design Center, and the chip they went to look at was the 16-bit WDC 65C816.

But Acorn didn’t just evaluate the chip. It designed, built, and launched an entire computer based on it, its sole 16-bit system and one of the scarcest and least-known Acorn models ever: the Acorn Communicator. As well as the hardware, the Communicator included a port of the Acorn MOS operating system, plus BBC BASIC, the View word-processor, ViewSheet spreadsheet, and a Teletext terminal emulator.

… And little (known) Apples

There was a more visible use of the 65C816, though. In 1986, soon after the Macintosh Plus, Apple also launched a more multimedia-capable rival machine: the 16-bit successor to the 6502-based Apple II range, the Apple IIGS.

The IIGS ran at a modest 2.8 MHz – contrary to widely believed myths, not due to concerns about competing with the Macintosh, but due to yield problems with the new CPU, as Dan Vincent’s in-depth history, The Apple IIGS Megahertz Myth, explains in detail. That history also mentions the canceled predecessor machine, the Apple IIX.

Apple has made enough mistakes that a decade ago, The Register published our own list of Apple’s 10 greatest fails, and while that list features such ill-conceived machines as the Apple III and the Pippin games console, it doesn’t even include the IIGS.

That omission is almost a testament in itself: the IIGS was a remarkably capable computer, considering that it could run a lot of software built for the very much simpler eight-bit Apple II range.

That capability was also its killer problem. In the early years of the Macintosh range, Apple was kept in profit by sales of the no-longer-glamorous Apple II range. The IIGS capped that range off with a machine so capable that it could compete with the Macintosh. In some respects, such as its 4096-color graphics and 32-channel sound, it outdid the Mac.

Arguably, in the IIGS, Apple made a mistake that Commodore, Atari, and even Acorn successfully avoided. It was possible to design a next-generation 16-bit machine that remained compatible with the older generation, while substantially increasing its capabilities… but the resulting systems would be significantly compromised compared to clean, ground-up redesigns that dispensed with backwards compatibility.

Only Acorn had the sheer CPU grunt to offer software emulation of its older range in its next-generation systems, but we will return to that in the third and final part of this series, looking at the 32-bit period. ®