(Most of the material in this article has come from Raytheon publicity. We thought that many of you who lived through this period would be interested in some of the details. We anticipate adding additional information on our ARR web page.)

First Application of Integrated Circuits in Computers
Raytheon played a key role in making the Apollo Moon Program successful. “Apollo’s computer was the first to apply integrated circuits in the technological revolution that led to desktop computers and modern electronic gadgets,” wrote Jayne Partridge Hanley, formerly of the MIT Instrumentation Laboratory staff, in the foreword of the book “Journey to the Moon” by Eldon Hall, MIT’s leader of hardware design for the Apollo computer.

The choice to use integrated circuits wasn’t without risk. The emerging technology was unproven in comparison to the transistors that ran computers of the time. “Microelectronics was absolutely at the infantile level. There were, as far as I knew, only two companies that were making integrated circuits of any kind,” said former Raytheon engineer and MIT fellow Herb Thaler, the Apollo computer’s logic circuit designer. Thaler based his integrated circuits on the architecture for a transistor-based computer that had originally been designed for an unmanned mission to photograph Mars.

Because of the importance of the project — and President John F. Kennedy’s challenge to put a man on the moon within the decade — research and work on the technology was pushed to the limit.

Raytheon, chosen to manufacture the computer after its close work with MIT on the Polaris ballistic missile, put the parts through rigorous testing that helped measure and improve their reliability. The company and its partners also used some of the first modern “clean rooms” to deal with contamination from particulates and moisture that were being sealed into, and ruining, parts. Clean rooms are used in the production of nearly all electronics today.

In 1971, just two years after Apollo 11 astronauts Neil Armstrong and Buzz Aldrin stepped onto the moon, the Intel 4004 chip became the world’s first microprocessor. The decades that followed saw a variety of electronics emerge, from the first personal computers to today’s cellphones, high-tech cars and smart appliances. And they all had one thing in common: the integrated circuit.

Gallium Nitride Expands Performance
Today, Raytheon is still revolutionizing chip technology, most notably with its 15-year, $300 million-plus investment in using the semiconductor material gallium nitride, or GaN, for high-performance circuits. At its U.S. Department of Defense-accredited foundry, the company uses photolithography and other techniques to convert wafers of silicon carbide and GaN into monolithic microwave integrated circuits. GaN can be used for power, radio frequency and light-emitting diode circuits. It is much more efficient and delivers roughly five times the radio frequency power of the technology it is replacing. In short: less power needed, more powerful results. Raytheon is using this technology to make radars that see farther and to improve communications with precision and defensive weapons.

[Photo caption: A Raytheon technician holds a circuit wafer made with the futuristic material gallium nitride, or GaN. Raytheon has innovated chip technology since it built the computers that guided Apollo spacecraft.]

On the consumer side, GaN isn’t as advanced as the defense version, but it still has the potential to power big changes: stronger cellphone signals, wireless charging or powering of household appliances, improved autonomous vehicle LiDAR, efficient electric car power converters, improved medical devices such as MRI machines, and low-power computing devices and data centers. “(Raytheon) mainly uses GaN to take small signals and make them bigger,” said Steven Bernstein, an Advanced Technology engineer at Raytheon. “Right now, the biggest use for this material commercially is LED light bulbs and displays…and another big consumer application that it’s starting to be used in is cellphones.”
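
To put Bernstein’s description of “taking small signals and making them bigger” in concrete terms, the short Python sketch below computes an amplifier’s power gain in decibels. The power levels shown are purely illustrative examples chosen for this sketch, not Raytheon figures.

    import math

    def power_gain_db(p_out_watts, p_in_watts):
        # Power gain in decibels: 10 * log10(P_out / P_in)
        return 10 * math.log10(p_out_watts / p_in_watts)

    # Illustrative numbers only: a 2-milliwatt input signal amplified to 20 watts
    print(f"Gain: {power_gain_db(20.0, 0.002):.1f} dB")  # prints "Gain: 40.0 dB"

Each additional 10 dB of gain is another tenfold increase in power, which is why an efficient high-power material like GaN matters so much in transmitters.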