Last month, Electronic Design brought together three of this industry’s foremost designers and most respected thinkers to participate in an informal panel discussion about the future of electronics engineering in America. The following is a summary of that informative, enlightening, entertaining, and sometimes profound conversation.
BARRIE GILBERT, an IEEE Life Fellow, holds more than 65 patents (including the well-known Gilbert cell), has authored numerous papers in JSSC and other journals, and is a contributor to, and editor of, several books. He joined Analog Devices as its first Fellow in 1979. He is Director of the Northwest Labs, ADI's first remote design center, in Beaverton, Ore., developing a wide variety of IC products for the communications industry. For his work on merged logic, a precursor to I²L, he received the IEEE Outstanding Achievement Award (in 1970). For his contributions to nonlinear signal processing, he received the IEEE Solid-State Circuits Council Outstanding Development Award (in 1986). He was Oregon Researcher of the Year in 1990 and received the Solid-State Circuits Award in 1992, the ISSCC Outstanding Paper Award on five occasions, the Best Paper Award at ESSCIRC twice, and various awards for Best Product of the Year. In 1997, he received an Honorary Doctorate from Oregon State University.
TED HOFF (Marcian "Ted" Hoff), inventor of the microprocessor, has earned many honors during his long career. Currently the chief technologist of FTI Teklicon, Hoff was one of the first employees of Intel Corp. At Intel, he developed an architecture for a 4-bit CPU with the help of engineers Stan Mazor and Federico Faggin. And so was born the first microprocessor, the 4004, which in effect launched the microelectronics era in November 1971. Hoff became the first Intel Fellow in 1980, the company's highest technical position. His numerous honors include induction into the National Inventors Hall of Fame and the Stuart Ballantine Medal from the Franklin Institute. After receiving a bachelor's degree in electrical engineering from Rensselaer Polytechnic Institute, he was awarded a National Science Foundation Fellowship to attend Stanford University. There he earned an MS and PhD, both in electrical engineering. Hoff stayed on to work at Stanford for another four years, conducting research on neural networks and integrated circuits.
BOB PEASE (Robert A. Pease), well-known as a champion of common-sense analog design, created the first adjustable negative regulator at National Semiconductor Corp. After graduating from MIT with a BSEE, he joined George A. Philbrick Researches. In a 14-year tenure there, he designed many leading-edge operational amplifiers, analog computing modules, and voltage-to-frequency converters. Since Pease arrived at NSC in 1976, he's designed several leading-edge analog ICs, including power regulators, voltage references, voltage-to-frequency converters, temperature sensors, and amplifiers. Currently staff scientist at National, Pease holds 21 patents. His definitive book on resolving analog design problems, Troubleshooting Analog Circuits, is in its sixteenth printing. Moreover, his column in Electronic Design, "Pease Porridge," received a Jesse H. Neal Certificate of Merit in 1992.
TED HOFF: I see three influences at work. Technology will continue to present very exciting challenges. In the semiconductor industry, Moore's Law has applied for a very long time, yet it looks like we still have some way to go in making things smaller. With today's technology, you can now carry the Encyclopedia Britannica in a package about the size of a postage stamp. However, we're getting pretty close to some of the limits, such as noise, so one of the issues is how long we have before they bite. Current estimates run on the order of 10 to 20 years, and maybe another order of magnitude reduction in size. In fact, as we approach these limits, the rate of shrinking is probably going to slow down and approach them asymptotically.
Further, when you look at the speeds at which digital circuits are running today, it raises questions about the analog world. Can we do some interesting things, like radar and so on, with integrated circuits? Who's using that capability to improve, let's say, automobile safety or other things of that type? The challenges are there, just waiting to be met.
One of the other major influences is outsourcing. It's one thing to outsource certain aspects of engineering and design to try to remain competitive. But I'm concerned that if we ship the whole package to other parts of the world, we're going to essentially facilitate a reduction in our own standard of living. If another nation has all the pieces in place and its people are willing to work very hard and tolerate a much lower standard of living than we have, we're basically setting up our own competition.
In California, where I live and work, we've had a unique set of conditions: venture capital, some fantastic universities, and a kind of entrepreneurial spirit, backed by policies under which companies really get their employees involved in the ownership of the company through stock options and other programs. As a result, there's been a high degree of motivation, as well as a set of conditions that have been ideal for the development of new companies and of people who are encouraged to foster new ideas.
Along those same lines, it seems like there are a number of programs in this country that, if anything, are trying to discourage this concept of ownership. In other words, the emphasis is on finding new ways to tax stock options or to discourage them, such as requiring companies to take a charge against earnings when they issue them. All of these things would seem to discourage the granting of employee ownership, which I believe is a major motivating factor in getting people to put in that extra bit of effort to make their companies successful.
BOB PEASE: Ted commented that Moore's Law is going to go quite a bit further, maybe slowing down. I think Moore's Law is basically dead or dying, because Intel had said, "Oh, boy! We can go faster, faster, faster; we can pack more and more complexity into every new series." And now Intel has said, "We can't any longer. We're going to provide better usability, better flexibility for our users; we're not going to go faster, faster, faster, smaller, smaller, smaller." So, I think Intel has wisely conceded that Moore's Law carried us as far as it could, and now it's slowing way down. It's not completely dead yet, but it's slowing way down. And many computers will be more useful even though they are slower. I mean, how many of us can type at 700 megabytes a second? How about 7 million megabytes per second? You can't do it! But typing at 7 kbits per second would be neat as heck, and this would let you slow down the clock to save power.
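Pease's last point is easy to quantify. As a rough, back-of-the-envelope sketch (the capacitance, voltage, and clock figures below are our own illustrative assumptions, not anyone's product specs), dynamic power in CMOS logic scales with switched capacitance, the square of the supply voltage, and the clock frequency, so a clock throttled down to keyboard speeds saves power in direct proportion:

```python
# Illustrative only: classic CMOS dynamic-power estimate, P = C * V^2 * f.
# All numbers here are assumed round values for the sake of the comparison.
def dynamic_power(c_farads, v_volts, f_hertz):
    """Switching power dissipated by CMOS logic."""
    return c_farads * v_volts**2 * f_hertz

c, v = 1e-9, 1.2                 # assumed switched capacitance and supply
fast = dynamic_power(c, v, 3e9)  # a 3-GHz "faster, faster, faster" clock
slow = dynamic_power(c, v, 3e6)  # a 3-MHz clock, still ample for keystrokes
print(fast / slow)               # 1000.0: power tracks frequency directly
```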
Now, let's talk about realities. Is the future of electrical/electronics engineering in the U.S. in a crisis? Yeah! Is the future of the electronics industry in the U.S. in a crisis? Yeah! And is the future of engineering in general in the U.S. in a crisis? Yes, again! Why? Because of world competition and because of outsourcing. And we have to learn to deal with that as well as we can.
So, why are engineering enrollments still declining? Because of all that competition. Supply and demand is a heck of a cruel mistress, but supply and demand is real across the world. Would I recommend electronics engineering for a good student? Yes, I would for somebody with superior math skills or mechanical skills or electrical skills. Do we need more engineers? I can't talk about the need for more digital engineers, because I don't know a blasted thing about digital engineers, and you know it. But do we need more analog engineers? Most definitely, we could use more analog engineers. However, we don't need more bad analog engineers; we need more very good or excellent analog engineers.
I'm in favor of schools that produce excellent engineers, but how do you distinguish them from schools that put out mediocre engineers? We know there are bad engineers out there because we bump into them, so to speak, every day, every week, every month, and we know that they're clueless, except we try to help them be less clueless. But we are more enthusiastic about helping the excellent engineers, because they just need a little bit of education to surmise a lot. "Oh yeah, you can do that, but nobody ever told you about this, did they?" Good analog engineers come not just from the best schools; they come from many, many places. And they don't come out of college as excellent engineers. They come out of colleges and universities as pretty good, pretty well-educated, half-trained engineers. Then they get to be good engineers through internships at good, tough companies.
I'm sure large companies have many kinds of mentoring for good engineers who arrive in fairly raw form from good schools, good colleges, and good universities. Then industry teaches them what they need to know, because we're so far ahead of the universities. These graduate engineers don't even know what they don't know. They don't know anything. And yet, if the kids have been given good training, a good education, they have a chance to start learning real fast as soon as we start teaching them.
BARRIE GILBERT: I would like to take a far more philosophical view of the situation. In response to Bob's last comment, I do think it's uncharitable to suggest that students don't know anything. Our experience is that graduates are very smart young men who know a great deal, and sometimes they know things we don't know. But it's true they don't know much about the business, and I think one of the failings of formal education in electronics and technology in general is the failure to emphasize the realities of the business.
This is not playing with circuits with a stem vase holding a red rose to your left and a glass of fine California Chardonnay on your right, with a keyboard in the middle. It has very much become a dog-eat-dog business. Competition is the dominant theme. Time-to-market is likewise a dominant theme.
Those are the things they don't know, but what we can learn from them are new ideas for doing things that we never thought of. For example, the delta-sigma concept, although Bob would say some of his converters used it years ago. I used it years ago too, but neither of us can lay claim to seeing the potency of the modern form, which makes sophisticated use of high-order filtering in many of today's advanced applications.
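For readers who haven't met the idea, the essence of delta-sigma conversion fits in a few lines. The Python sketch below is purely illustrative, a first-order toy of our own construction rather than any panelist's design; the modern converters Gilbert refers to wrap the same loop in the high-order filtering he mentions:

```python
# Illustrative first-order delta-sigma modulator (a toy, not a product).
# The loop integrates the error between the input and the fed-back 1-bit
# output; averaging the bitstream recovers the signal, with quantization
# noise pushed up to high frequencies where a filter can remove it.
import numpy as np

def delta_sigma(x):
    """Turn samples in [-1, 1] into a +/-1 bitstream."""
    integrator, y, bits = 0.0, 0.0, []
    for sample in x:
        integrator += sample - y              # accumulate the error
        y = 1.0 if integrator >= 0 else -1.0  # 1-bit quantizer
        bits.append(y)
    return np.array(bits)

t = np.arange(4096)
x = 0.5 * np.sin(2 * np.pi * t / 512)         # slow sine, well inside range
bits = delta_sigma(x)
smoothed = np.convolve(bits, np.ones(64) / 64, mode="same")
print(np.abs(smoothed - x).mean())            # small: the average tracks x
```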
Likewise, the switched-capacitor technique came out of Berkeley. I can remember, a long time ago, standing at a bus stop in Belgium with Paul Gray (the noted Berkeley professor) when he told me about this new technique of his and asked, "What do you think?" I said, "Well, it sounds like a good thesis project, but I can't see it being very practical." So here's this wise old owl condemning a piece of research he didn't know anything about.
Let me say this. I'm principally a technologist. I'm not so old that I've ceased to be thrilled by my involvement in advancing technology, but I'm old enough to understand the profound sociological and economic implications of modern electronics and to view electronics in those terms, on an equal footing with the technical issues. And I'm just about the right age to understand the transformations now taking place in this particular business.
Electronics is a relatively new art on the face of this planet; it's little more than 150 years old. And up until today, electronics of all kinds has made use of a common technology. For instance, for many years, the telephone and the telegraph used magnets, coils, batteries, relays, that kind of thing. Along came the thermionic triode, and things changed dramatically with the development of the vacuum-tube computer, followed by the invention of the transistor.
The fact is, some 100 years after the invention of the triode tube, we're still using the very same basic technologies for both digital and analog systems, yet these are totally disparate areas of endeavor. Analog, I like to say, is close to nature because it's dimensional. It's all about real physical variables that have dimension: length, width, height, time, frequency, whatever. All those things have physical dimensions. Likewise, the components in which these variables, these signals, these state variables exist are dimensional: resistors, inductors, capacitors. These have dimension, whereas in digital electronics, dimension has no meaning.
A logical variable is dimensionless, and if we could make digital computers using, say, conductive peanut butter, they'd be just as good, I suppose, provided they could be as fast as those made in silicon (even though Bob, in a minority of one, says he doesn't need them that fast). We don't fundamentally care about the choice of technology for digital electronics. Silicon is just the handservant of electronics, at least as far as microprocessors are concerned. Its transistors are very fast, very cheap, and very tiny. Apart from that, they, and the CPUs and DSPs that utilize them, are only incidentally electronic.
My point is that we're now at a stage where I believe a divergence is inevitable. That is to say, we can expect the technologies for digital implementation of logic—information processing of all kinds—to go down a different path than the one required for analog electronics. The needs of these two disciplines are simply far too different to avoid this divergence.
Now, if anyone should tell you, as people like to say, that analog is dead, tell them they have got their carburetor stuck up their tailpipe, because clearly that viewpoint is nowhere near the truth. There's more analog electronics being designed today, and more analog products being sold today, than ever before. And it's not going to die. Radios are forever, and as long as radios are a part of human society, they are not going to be done with bits! Even "time-domain" radio cannot be implemented entirely in the domain of bits. And it ought to be quite apparent to even the casual observer that today's "digital" cell phone contains immeasurably more analog circuitry than yesterday's "analog" phone.
So there's a rich future for analog and, as Bob says, there's a need for young, bright analog engineers. Unfortunately, most of those in service today came out of World War II and military training, and the members of this generation are dying by the dozen. I’m going to go pretty soon, too. I don’t know what’s going to happen to replace us all, but I don’t believe that it’s a desperate situation.
I've got a lot of ideas about the future, and a lot of ideas about the long-term effects of outsourcing on our industry, but I'd better stop here and let the conversation continue.
Electronic Design: Do you feel that today's young people realize the challenges that technology still presents if they enter electronics engineering?
GILBERT: I think I can make a comment on that. As a youngster, I was very poor. My father was killed in the war when I was three, leaving my mother with four children. So I had an extreme paucity of means, and I think that is a very important contributing factor to creativity. I was forced to make my airplanes out of bits of wood from an orange box. Nowadays, young children have everything presented to them on a plate. If they do approach electronics, they'll be hard-pressed to find a place to get the stuff. Radio Shack doesn't sell that stuff anymore; all they have are blister packs with ready-made parts in them. It's very hard for youngsters to be exposed in a creative way to electronics. When I ask young people, “Is electronics your hobby? Have you built a radio?” the majority say, “No.” That, I think, is a very great shame, and I don’t see that changing. I can’t see us turning back the clock.
HOFF: I think it goes even beyond that. We've become so frightened of technology, like the concern about lead in the environment, and so on. In my youth, a lot of us became interested in science through chemistry. I have three daughters, and some of them had chemistry courses in college, but they never were allowed to go into a lab—everything was out of a textbook. They missed all the fun and excitement. Sure, maybe it's a little dangerous if you spill some acid and burn a hole in your clothes, but you'll learn something from that.
We are so risk-averse today that we're discouraging things like materials research, which could lead to an awful lot of new developments, whether it be high-temperature superconductors or new materials to replace silicon. We've become so risk-averse that we're afraid to develop new technology in many cases.
GILBERT: Yes, there is this hands-off attitude in modern teaching. I am very well aware of that, and of the fact that practically everything is done by simulation. A young man may think he has finished designing a circuit because it has been simulated. (So many papers I receive for review are just simulations, and the concepts presented in them as novel are so chronically, hopelessly obvious.) But, going back to your point, I too messed with saucepans in the kitchen, and salt, and vinegar, and bicarbonate, and made stuff fizz and pop. Once, I remember putting strong brine into a jam jar, then putting in two copper plates and plugging them into a 240-volt British power outlet. You know, it was fun—it frothed a lot, then there was this lovely green slime that formed on the top, and I put my nose into it. There were many times I could have killed myself.
As long as I'm talking about my youth, I'll tell you one more anecdote. My pal and I used to go down to the shed at the bottom of his garden, and we would connect one side of a transmitter transformer (with a 700-0-700-V secondary) to a clothesline—a steel clothesline—and put the other side into a copper pole in the ground. Then we would go to the other end of the garden, grab that clothesline, and take a fireplace poker. The one who could poke deepest into the ground was "the winner." Most definitely, we could have both been losers.
PEASE: I guess Barrie lost, and that's why he's still here.
GILBERT: Right, I'm glad to say. Anyway, there is another factor that I think is a very important influence on young people. After the war, there was a plethora of wonderful military equipment, absolutely inspiring just to look at—so beautifully made, so strong, so full of potential. You know, I built my first TV set out of old radar sets, like many of us did. To do that today would be absolutely impossible. I mean, you just cannot build a modern TV set from junk. Well, you might, but it's very, very hard. You have to be close to genius grade to do it today. Certainly there's little hope of actually devising an HDTV receiver in your garage. Rather, you just go to the store and, well, buy one. I think that's another aspect of electronics on which we may need to concentrate—that is, it has become very much a commodity business, much less a science that fosters invention.
With the demise of Moore's Law, which Bob mentioned earlier (and which of course was 100% predictable to anybody who had his eyes open), computer makers are simply going to have to get more innovative and move forward into more effective utilization of whatever technologies are economically justifiable. For example, while there are many things that can be done with computers, they are dumb. They don't interact with us. Even a urinal knows we're standing in front of it; a computer doesn't. What's needed is more affective computing: computers that understand us better. That direction of progress generally doesn't require faster computers. Many applications don't require faster computers.
HOFF: I disagree with your point about not needing faster computers. It isn't the typing you worry about; it's the visual images that you want to create. Those still take a lot of computing. When you consider just the number of pixels, our display screens today are still so bad compared to, let's say, the resolution on a sheet of paper—in other words, ordinary photography versus digital photography. If we want the ability to compute images in real time, rapidly enough that we believe we're essentially in a live scenario, I think we still have a long way to go. And that's where I see the additional computing power being used. Even in areas like speech recognition, some of the algorithms are very computationally intensive, and we're just getting to the point now where we have enough power.
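Hoff's pixel comparison is easy to check with rough arithmetic. The figures below are our own assumed round numbers (a letter-size sheet at print resolution versus a good desktop display of the day), not anything the panel cited:

```python
# Rough, assumed numbers: paper at print resolution vs. a desktop display.
paper_pixels = (8.5 * 1200) * (11 * 1200)   # letter sheet at 1200 dpi
screen_pixels = 1600 * 1200                 # a high-end monitor of the era
print(paper_pixels / 1e6)                   # ~134.6 megapixels
print(screen_pixels / 1e6)                  # ~1.9 megapixels
print(paper_pixels / screen_pixels)         # paper wins by a factor of ~70
```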
PEASE: Go get 'em, Barrie.
GILBERT: Well, I understand that, of course. But the problem, I think, is the mindset of the digital algorithmists, who believe everything must be done in the same sledgehammer way. What makes neural networks so interesting is that they are non-algorithmic: the medium IS the message, the hardware IS the program. I think there's a great deal to be done here. I'm not a neural-network fanatic. On the other hand, what makes US interesting, as humans, is that our awareness and our consciousness arise from the concurrent state of a very, very large number of elements that are connected continuously. In dramatic contrast, even if you gave today's microprocessor access to all the wealth of wisdom in the world (terabytes, even petabytes of "data"), it wouldn't be an iota smarter, because it only sees the world a few pitiful bits at a time: it's completely unaware of what's "out there" in the vibrant world of the living. A neural network, by contrast, is aware of everything concurrently. And we humans don't do speech recognition using pre-orchestrated algorithms; certainly not digital bit-by-bit algorithms.
HOFF: I worked in neural nets before I went to Intel. I was at Stanford working in this area, and I've tended to keep up with it. I believe one of the problems is that there's almost too much hype in this technology. What we're talking about is what you can do when you have an incredible amount of computing power. Whether you want to simulate a neural net or duplicate it, with analog circuits or digital simulation, an incredible amount of material has to be processed to duplicate even relatively crude neural-net circuits.
PEASE: My turn. But today this incredible amount of "computering" power is largely used to make video games, with the highest possible resolution that kids can pay for. And a lot of people don't quite need that. The major microprocessor companies and the major computer companies are pandering to kids who have money to spend for video games, and it's not necessarily profitable for the rest of the world, except for the manufacturers who want to sell to kids who want to play blankety-blank video games.
GILBERT: I agree. Entertainment is a huge factor in modern electronics, almost pathologically so. The kids don’t understand the technology behind the game; they understand only the game and the rules of the game. They’ve become so satiated with technology that they almost have turned against it, in the sense of ever wanting to know how it works.
HOFF: Another factor nowadays is the lack of manuals and schematics, like the ones that used to come with TV sets and personal computers. Manufacturers would include documentation showing complete schematics and every component inside. Today, everything tends to be kept proprietary. Except for what's disclosed in pamphlets, there's very little public information. We don't even teach our kids how to look up patents.
GILBERT: Well, it’s a nice thought, but even if they did, they wouldn’t understand them. I mean, it’s very hard for any of us to understand even our own patents, right?
PEASE: Okay, well let's get together and form a consortium to pay for the comeback of Heathkit. That company could never make any money these days because of all the factors we've just been discussing, and yet should it be brought back? Most definitely. How much would it cost? Oh Lordy, that'd be difficult.
GILBERT: However, some of that is happening. Lego sponsored MIT, and between them they developed Mindstorms, which is a make-it-yourself kit of mechanical and electrical component parts for kids who want to build a variety of different, and original, forms of robot. These kids don't understand how the microprocessor works, of course. It's not really a genuine exposure to electronics, but it's a move in the right direction. A while back, in America, Erector sets used to be a big hit with kids, but those sorts of things don't exist much anymore, and if they do, they're not very popular. But I don't think that reflects the business; I think it reflects the reaction of the businesses to the market. In other words, I don't think an Erector set would sell today, because kids don't want to do things with nuts and bolts; they want things to work instantly.
HOFF: Yes, instant gratification seems to be a characteristic of our society. When I was in school, even I was told by a supposed guidance counselor, "Oh, don't go off to an engineering college, that's much too difficult." In other words, take an easier course. Today, I think that kind of discouragement shows up in every aspect of our lives. Kids are being told mathematics and engineering are way too difficult, and that they're not cool either.
PEASE: Although that might be changing a little bit, we still have to encourage the men and the women who are technically competent—those who are good at math, good at mechanical things, good at analyzing things. Definitely, we must positively encourage those kids.
GILBERT: From a global point of view, we’re talking exclusively in these last few minutes about the state of affairs in the USA. Let’s consider China, which is obviously going to be the next great superpower. There, there is still a paucity of means. There, there are still kids making airplanes out of orange boxes. Maybe that’s the fomenting grounds in which the new electronics is going to arise.
PEASE: Makes sense.
GILBERT: And we should be concerned about that, just a little bit.
PEASE: True, 'cause we're running out of orange boxes. Man, you can't even buy an orange box anymore. It's difficult.
GILBERT: Yeah, the ones I used weren't very orange either.
Electronic Design: The youth in the U.S. can control so much via their joysticks that they may be more focused on the programming, coming into electronics through wanting to control things via software rather than through hardware. Is that something you're seeing with college graduates joining the companies and programs you're involved with?
GILBERT: No, I don't see that. I think software is just the spells that you cast over the hardware, but beyond that, there's no connection.
PEASE: There's no correlation between that and getting kids interested in electronics or any other kind of science. I mean, computers are simply a way to make the monster kill the young lady, or whatever it is they're doing in video games these days, and that's nothing to do with science.
GILBERT: That's what's really the key issue here—there is this divide. Earlier, I mentioned the divide between technologies for information processing and those required for analog, and I believe there's a huge distinction between those needs. But in addition to this divide in technologies, there is a division between the world of technology as it impacts us in our daily lives—the microprocessors hidden away in the washing machine and that sort of thing, which is not really an exposure at all, because most of us couldn't care less about how a washing machine works—and the people who design those things. In other words, the people who make the microprocessors believe that the next generation of devices, by virtue of dramatically improved clock speed, is going to do more marvelous things. I suggest sheer speed is a dead end.
Sure, I understand that today's algorithms are extremely complex, and I frequently wish that I had a machine ten times as fast on which to do my circuit simulations. But that's not the whole story; it's only one aspect of the total reality. As long as program developers believe they have power in abundance to fritter away on a multitude of trivial features—such as Microsoft's little "HELP" characters—they will gleefully burn up every last megaflop. For the professional user of a computer in design studies, a re-vectoring of software to run on a thousand little CPUs is a much more promising path to "high speed" from that user's perspective. It's not one that's been entirely neglected. If I want to simulate a very large network, I'd rather have several thousand little processors, each of which knows independently how to behave as, say, a transistor or a resistor or whatever, all intimately connected together and communicating asynchronously and essentially concurrently. That's like the old days of analog computers, except that, rather than integrators and summers and the like, each element knows the detailed equations of a transistor and can solve them to 64-bit precision in an instant. This is perfectly possible, but people aren't doing much about it, because at the moment the market for high-speed simulators is limited to perhaps a few thousand copies. It's the mass market that always draws the attention.
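To make Gilbert's picture concrete, here is a deliberately tiny sketch of the idea, entirely our own construction rather than anything ADI builds. Every "element processor" knows only its own device law (a plain resistor here, where a real version would solve nonlinear transistor equations), and a Jacobi-style relaxation sweep stands in for the asynchronous, concurrent message passing he describes:

```python
# Toy element-level circuit relaxation (illustrative only): a resistor
# ladder from a 5-V source to ground settles to its operating point by
# purely local updates, with no global matrix solve anywhere.
import numpy as np

# Each element: (node_a, node_b, conductance in siemens)
elements = [(0, 1, 1e-3), (1, 2, 1e-3), (2, 3, 1e-3), (3, 4, 1e-3)]
num_nodes = 5
v = np.zeros(num_nodes)
fixed = {0: 5.0, 4: 0.0}            # node 0 driven at 5 V, node 4 grounded

for _ in range(2000):               # each sweep: every element "speaks" once
    g_sum = np.zeros(num_nodes)     # total conductance seen at each node
    i_sum = np.zeros(num_nodes)     # current injected by neighbors
    for a, b, g in elements:        # each element knows only its own law
        g_sum[a] += g
        i_sum[a] += g * v[b]
        g_sum[b] += g
        i_sum[b] += g * v[a]
    v = np.where(g_sum > 0, i_sum / np.maximum(g_sum, 1e-30), v)
    for node, volts in fixed.items():
        v[node] = volts             # the sources hold their nodes
print(v)                            # ~[5.0, 3.75, 2.5, 1.25, 0.0]
```

Run in software, the relaxation is slow, which is exactly Gilbert's point: the opportunity lies in hardware, where thousands of such elements updating in parallel would settle to the answer almost instantly.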
HOFF: I think it's a bigger problem—that is, the general-purpose relatively ordinary computer has been driven down in cost so much that it's hard for someone like yourself to justify even the development of the kind of combination of processors that you just described. It takes people like you with sufficient motivation and funds to develop something like that. Once you have it, probably other people will find uses for it.
PEASE: I recently ran into a very good quote. It asked: when you're trying to plow a field, would you rather have two oxen or a thousand chickens? Me, I'd rather have a breadboard, because I can make a microprocessor, an analog microprocessor. I can use a thousand transistors to emulate a thousand transistors, and I know how to make it work at true-scale speeds despite stray capacitances. It ain't easy, but it's an interesting way to go, because sometimes it helps you avoid the stupid problems that Spice gets you into.
GILBERT: We won't talk about that.
HOFF: We won't?
PEASE: I just did, sorry.