By Ron Wilson, Editor-in-Chief, Altera Corporation
In 1983, Altera officially came into existence. The year is within the memory of many people today—the first flight of the space shuttle Challenger, the assassination of Philippine opposition leader Benigno Aquino Jr., the US invasion of Grenada. In contrast, on the racing time-scale of technology, 30 years can seem almost unimaginably long. TCP/IP became the official protocol of the ARPANET—there was, as yet, no Internet. IBM introduced the PC-XT, the first personal computer in its line to ship with a hard disk as standard equipment. The GNU project was announced.
In electronic design, microprocessors had come to dominate many embedded applications, displacing minicomputers at the high end and handfuls of small-scale digital chips at the low end. The typical digital system comprised a microprocessor or microcontroller surrounded by interface circuits that connected the chip’s bus to external memory and peripherals. Small-scale logic chips and medium-scale functional-block ICs were still used, but increasingly to build those interface circuits, to bridge between buses, or to handle performance-critical tasks that the microprocessors couldn’t.
Also in this period, a new implementation alternative was spreading through the design community. Gate arrays—partially prefabricated arrays of logic gates, configured by a series of customer-defined metal layers during the final production steps—allowed designers to pack thousands of gates of logic and memory into one chip. The design process was unfamiliar, relying on workstation-based EDA tools for schematic capture, library-element selection, and simulation, and the front-end charges were too great for many projects. But for larger, better-funded design teams, gate arrays offered a valuable alternative that could slash the chip count, boost the performance, and reduce the power consumption of a design all at once.
Design styles at the time were remarkably diverse, at least by today’s standards. Engineers trained in digital logic generally designed using formal constructs: Boolean algebra and expression minimization to define combinational logic, and state machines to describe sequential logic. These engineers might use either manual techniques or a growing list of computer software tools to capture and analyze their designs. But engineers coming from different backgrounds—especially analog engineers—often used a more intuitive approach. These designers preferred schematics for design capture, and tended to design by starting at the left margin and following input signals through to outputs, adding gates, flipflops, medium-scale devices, and microprocessors as they went. Gate-array users also tended toward the schematic approach, simply because a schematic was an approximate description of the circuits they were creating on the chip. These two contrasting design styles would play a role in the evolution of yet another innovation that was gradually maturing in the industry of 1983.
In 1978, that other alternative had appeared with relatively little notice. A chip-design team at Monolithic Memories had created a device they called a PAL (for “programmable array logic”, the more obvious term “programmable logic array” having already been taken). The chip was a one-time-programmable digital device designed to implement the minimized standard form of Boolean expressions: the sum-of-products format. A PAL contained a number of macrocells, each fed by a switch array that could connect any of the chip’s inputs and outputs into each of up to eight product terms. The macrocell then summed the product terms in a wide OR gate and provided a configurable flipflop to register the output of the OR gate, along with multiplexers for selecting clocks, bypassing the register, and so forth. By 1983, PALs had proliferated into an impressive range of sizes, speeds, and configurations, giving designers a perhaps too-rich feast.
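To make the sum-of-products idea concrete, here is a minimal sketch in Python (purely an illustration, not anything a 1983 designer would have used) of how one macrocell output might be expressed. The signal names are hypothetical; each AND-ed line corresponds to one product term, that is, one row of the switch array, and the final OR plays the part of the macrocell’s wide OR gate.

    # Minimal sketch: a chip-select output in sum-of-products form, as a PAL
    # macrocell would implement it. All signal names are hypothetical.
    def chip_select(a15, a14, a13, rd_n, wr_n):
        # Each line below is one product term: one row of the macrocell's switch array.
        term_read = a15 and a14 and (not a13) and (not rd_n)    # read access to this address block
        term_write = a15 and a14 and (not a13) and (not wr_n)   # write access to this address block
        # The wide OR gate sums the product terms; an optional flipflop
        # in the macrocell could register the result on a clock edge.
        return term_read or term_write

In a real PAL, of course, this expression would be burned into the one-time-programmable switch array rather than written as software; changing it meant programming a new part.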
These programmable devices offered digital designers an interesting proposition: better logic density than small-scale gates and flipflops, more flexibility than purpose-built off-the-shelf medium-scale devices like counters, registers, and decoders, and, compared to gate arrays, a more familiar design flow with no front-end expense. The vendors provided software tools for minicomputers and mainframes that could translate Boolean expressions, state machines, and some schematics into switch maps for the chips.
Enter Altera
This was the environment in 1984 when Altera offered its first product. That device, the EP300 (Figure 1), was a programmable-logic chip, but it differed from PALs in four significant ways. First, the EP300 was reprogrammable—a seemingly minor convenience that would prove to be a major factor in the industry. A quartz window in the package allowed users to shine a UV lamp on the die, erasing the EPROM cells that held the device configuration, so the chip could be programmed again.
Figure 1. The Altera EP300 offered erasability by shining a UV lamp through the window above the die.