Bernard Widrow
Professor of Electrical Engineering, Emeritus
Bio
Bernard Widrow is Professor Emeritus in the Electrical Engineering Department at Stanford University. His research focuses on adaptive signal processing, adaptive control systems, adaptive neural networks, human memory, cybernetics, and human-like memory for computers. Applications include signal processing, prediction, noise cancelling, adaptive arrays, control systems, and pattern recognition. Before coming to Stanford in 1959, he taught at MIT, where he received the Doctor of Science degree in 1956.
Honors & Awards
- Citation Classic for the paper "Adaptive Antenna Systems," Proceedings of the IEEE, December 1967, Institute of Electrical and Electronics Engineers (IEEE)
- Benjamin Franklin Medal, The Franklin Institute (2001)
- IEEE Millennium Medal, Institute of Electrical and Electronics Engineers (IEEE) (2000)
- Donald O. Hebb Award, International Neural Network Society
- Signal Processing Society Award, Institute of Electrical and Electronics Engineers (IEEE) (1999)
- Silicon Valley Engineering Hall of Fame, Silicon Valley Engineering Council (1999)
- Member, National Academy of Engineering (1995)
- Neural Networks Pioneer Medal, Institute of Electrical and Electronics Engineers (IEEE) (1991)
- Alexander Graham Bell Medal, Institute of Electrical and Electronics Engineers (IEEE) (1986)
- Centennial Medal, Institute of Electrical and Electronics Engineers (IEEE) (1984)
- Fellow, American Association for the Advancement of Science (1980)
- Fellow, Institute of Electrical and Electronics Engineers (IEEE) (1976)
- Franqui Lecture Chair, University of Louvain, Belgium (1967)
Boards, Advisory Committees, Professional Organizations
- Editorial Board, Neural Networks (2014 - 2018)
- Associate Editor, Pattern Recognition (2014 - 2016)
- Associate Editor, Information Sciences (2014 - 2015)
- Chair, Silicon Valley Engineering Council Hall of Fame Awards Committee (2006 - 2008)
- Associate Editor, Circuits, Systems and Signal Processing (2014 - 2015)
- President, International Neural Network Society (1989 - 1990)
- Governing Board Member, International Neural Network Society (1988 - 1991)
- Chairman, DARPA Neural Network Study (1987 - 1988)
Professional Education
- Sc.D., Massachusetts Institute of Technology, Electrical Engineering (1956)
- S.M., Massachusetts Institute of Technology, Electrical Engineering (1953)
- S.B., Massachusetts Institute of Technology, Electrical Engineering (1951)
Patents
- B. Widrow, J.C. Aragon, B.M. Percival. "United States Patent 7,333,963 Cognitive Memory and Auto-Associative Neural Network Based Search Engine for Computer and Network Located Images and Photographs", Feb 1, 2008
- B. Widrow. "United States Patent 7,187,907 Simultaneous Two-Way Transmission of Information Signals in the Same Frequency Band", Mar 1, 2007
- M.A. Lehr and B. Widrow. "United States Patent 5,793,875 Directional Hearing System", Aug 1, 1998
- B. Widrow. "United States Patent 5,737,430 Directional Hearing Aid", Apr 1, 1998
- J. Rector, B. Marion, B. Widrow, and I.A. Salehi. "United States Patent 5,191,557 Signal Processing to Enable Utilization of a Rig Reference Sensor with a Drill Bit Seismic Source", Mar 1, 1993
- J. Rector, B. Marion, B. Widrow, and I.A. Salehi. "United States Patent 5,050,130 Signal Processing to Enable Utilization of a Rig Reference Sensor with a Drill Bit Seismic Source", Sep 1, 1991
- B. Widrow. "United States Patent 4,964,087 Seismic Processing and Imaging with a Drill-Bit Source", Oct 1, 1990
- J. Rector, B. Marion, B. Widrow, and I.A. Salehi. "United States Patent 4,926,391 Signal Processing to Enable Utilization of a Rig Reference Sensor with a Drill Bit Seismic Source", May 1, 1990
- B. Widrow. "United States Patent 4,858,130 Estimation of Hydraulic Fracture Geometry from Pumping Pressure Measurements", Aug 1, 1989
- B. Widrow. "United States Patent 4,849,945 Seismic Processing and Imaging with a Drill-Bit Source", Jul 1, 1989
- B. Widrow and M.N. Brearley. "United States Patent 4,751,738 Directional Hearing Aid", Jun 1, 1988
- B. Widrow. "United States Patent 4,556,962 Seismic Exploration Method and Apparatus for Cancelling Interference from Seismic Vibration Source", Dec 1, 1985
- B. Widrow. "United States Patent 4,537,200 ECG Enhancement by Adaptive Cancellation of Electrosurgical Interference", Aug 1, 1985
- B. Widrow. "United States Patent 4,363,112 Apparatus and Method for Determining the Position of a Gas-Saturated Porous Rock in the Vicinity of a Deep Borehole in the Earth", Dec 1, 1982
- B. Widrow. "United States Patent 4,365,322 Apparatus and Method for Determining the Position of a Gas-Saturated Porous Rock in the Vicinity of a Deep Borehole in the Earth", Dec 1, 1982
- J.R. Zeidler, J.M. McCool, and B. Widrow. "United States Patent 4,355,368 Adaptive Correlator", Oct 1, 1982
- J.M. McCool, B. Widrow, J.R. Zeidler, R.H. Hearn, D.M. Chabries, and R.H. Moore. "United States Patent 4,243,935 Adaptive Detector", Jan 1, 1981
- J.M. McCool, B. Widrow, J.R. Zeidler, R.H. Hearn, and D.M. Chabries. "United States Patent 4,238,746 Adaptive Line Enhancer", Dec 1, 1980
- B. Widrow, M.E. Hoff, Jr. "United States Patent 3,454,753 Analog Multiplier and Modulating Circuits Employing Electrolytic Elements", Jul 1, 1969
- B. Widrow, G. Frick, R.H. Gordon. "United States Patent 3,395,402 Adaptive Memory Element", Jul 1, 1968
- B. Widrow and M.E. Hoff, Jr. "United States Patent, Logic Circuit and Electrolytic Memory Element", Dec 1, 1965
Current Research and Scholarly Interests
Prof. Widrow's research focuses on adaptive signal processing, adaptive control systems, adaptive neural networks, human memory, and human-like memory for computers. Applications include signal processing, prediction, noise cancelling, adaptive arrays, control systems, and pattern recognition. His recent work concerns human learning at the synaptic level.
Projects
- Hearing Aid Device, Stanford University
A directional acoustic receiving system is constructed in the form of a necklace: an array of two or more microphones is mounted on a housing supported on the chest of the user by a conducting loop encircling the user's neck. This approach enables the design of highly directive hearing instruments that are comfortable, inconspicuous, and convenient to use. The array provides the user with a dramatic improvement in speech perception over existing hearing aid designs, particularly in the presence of background noise, reverberation, and feedback. (A minimal beamforming sketch appears after this project list.)
B. Widrow, "A Microphone Array for Hearing Aids," IEEE Circuits and Systems Magazine, 1(2):26-32, 2001.
Location: Stanford, California
- Quantization Noise, Stanford University
Prof. Widrow's most recent book, Quantization Noise, co-authored with Istvan Kollar, is available for purchase from the Cambridge University Press website.
Location: Stanford, California
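The Hearing Aid Device project above does not spell out its processing here. As a rough illustration of how a small wearable microphone array achieves directivity, the following is a minimal delay-and-sum beamformer sketch in Python/NumPy. Everything in it (the function name, microphone geometry, sample rate, and the frequency-domain fractional-delay trick) is an illustrative assumption of this sketch, not the project's actual algorithm.

```python
import numpy as np

def delay_and_sum(signals, mic_positions, steer_dir, fs, c=343.0):
    """Hypothetical delay-and-sum beamformer: delay each microphone channel
    so that a plane wave arriving from steer_dir adds coherently, then
    average. Sound from other directions adds incoherently and is
    attenuated, which is the directivity effect.

    signals: (n_mics, n_samples) time-domain channels
    mic_positions: (n_mics, 3) microphone coordinates in meters
    steer_dir: length-3 unit vector pointing toward the desired source
    fs: sample rate in Hz; c: speed of sound in m/s
    """
    n_mics, n_samples = signals.shape
    # Arrival-time advance of each mic for a wave from steer_dir (seconds).
    delays = mic_positions @ steer_dir / c
    delays -= delays.min()            # a common offset does not matter
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    out = np.zeros(n_samples)
    for m in range(n_mics):
        # Apply a fractional-sample delay as a linear phase shift.
        spectrum = np.fft.rfft(signals[m]) * np.exp(-2j * np.pi * freqs * delays[m])
        out += np.fft.irfft(spectrum, n=n_samples)
    return out / n_mics
```

A real hearing instrument would go further, for example adapting the channel weights to null interference, but coherent summation toward the look direction is the core of the speech-perception gain described above.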
2023-24 Courses
Independent Studies (8)
- AA 190 (Sum) - Directed Research and Writing in Aero/Astro
- AA 199 (Sum) - Independent Study in Aero/Astro
- EE 300 (Aut) - Master's Thesis and Thesis Research
- EE 191 (Aut) - Special Studies and Reports in Electrical Engineering
- EE 391 (Aut, Sum) - Special Studies and Reports in Electrical Engineering
- EE 191W (Aut) - Special Studies and Reports in Electrical Engineering (WIM)
- EE 190 (Aut, Win) - Special Studies or Projects in Electrical Engineering
- EE 390 (Aut, Sum) - Special Studies or Projects in Electrical Engineering
All Publications
- The Hebbian-LMS Learning Algorithm. IEEE Computational Intelligence Magazine 2015; 10 (4): 37-53. DOI: 10.1109/MCI.2015.2471216. Web of Science ID: 000363206100005.
- The Back-Prop and No-Prop Training Algorithms. Cognitive Computation 2015.
- Cognitive memory. Neural Networks 2013; 41: 3-14.
Abstract: Regarding the workings of the human mind, memory and pattern recognition seem to be intertwined. You generally do not have one without the other. Taking inspiration from life experience, a new form of computer memory has been devised. Certain conjectures about human memory are keys to the central idea. The design of a practical and useful "cognitive" memory system is contemplated, a memory system that may also serve as a model for many aspects of human memory. The new memory does not function like a computer memory where specific data is stored in specific numbered registers and retrieval is done by reading the contents of the specified memory register, or done by matching key words as with a document search. Incoming sensory data would be stored at the next available empty memory location, and indeed could be stored redundantly at several empty locations. The stored sensory data would neither have key words nor would it be located in known or specified memory locations. Sensory inputs concerning a single object or subject are stored together as patterns in a single "file folder" or "memory folder". When the contents of the folder are retrieved, sights, sounds, tactile feel, smell, etc., are obtained all at the same time. Retrieval would be initiated by a query or a prompt signal from a current set of sensory inputs or patterns. A search through the memory would be made to locate stored data that correlates with or relates to the prompt input. The search would be done by a retrieval system whose first stage makes use of autoassociative artificial neural networks and whose second stage relies on exhaustive search. Applications of cognitive memory systems have been made to visual aircraft identification, aircraft navigation, and human facial recognition. Concerning human memory, reasons are given why it is unlikely that long-term memory is stored in the synapses of the brain's neural networks. Reasons are given suggesting that long-term memory is stored in DNA or RNA. Neural networks are an important component of the human memory system, and their purpose is for information retrieval, not for information storage. The brain's neural networks are analog devices, subject to drift and unplanned change. Only with constant training is reliable action possible. Good training time is during sleep and while awake and making use of one's memory. A cognitive memory is a learning system. Learning involves storage of patterns or data in a cognitive memory. The learning process for cognitive memory is unsupervised, i.e., autonomous. (A toy sketch of this storage-and-retrieval scheme appears after the publication list.)
DOI: 10.1016/j.neunet.2013.01.016. Web of Science ID: 000318209900002. PubMed ID: 23453302.
- The No-Prop algorithm: a new learning algorithm for multilayer neural networks. Neural Networks 2013; 37: 182-188.
Abstract: A new learning algorithm for multilayer neural networks that we have named No-Propagation (No-Prop) is hereby introduced. With this algorithm, the weights of the hidden-layer neurons are set and fixed with random values. Only the weights of the output-layer neurons are trained, using steepest descent to minimize mean square error, with the LMS algorithm of Widrow and Hoff. The purpose of introducing nonlinearity with the hidden layers is examined from the point of view of Least Mean Square Error Capacity (LMS Capacity), which is defined as the maximum number of distinct patterns that can be trained into the network with zero error. This is shown to be equal to the number of weights of each of the output-layer neurons. The No-Prop algorithm and the Back-Prop algorithm are compared. Our experience with No-Prop is limited, but from the several examples presented here, it seems that the performance regarding training and generalization of both algorithms is essentially the same when the number of training patterns is less than or equal to LMS Capacity. When the number of training patterns exceeds Capacity, Back-Prop is generally the better performer. But equivalent performance can be obtained with No-Prop by increasing the network Capacity by increasing the number of neurons in the hidden layer that drives the output layer. The No-Prop algorithm is much simpler and easier to implement than Back-Prop. Also, it converges much faster. It is too early to definitively say where to use one or the other of these algorithms. This is still a work in progress. (A minimal sketch of the algorithm appears after the publication list.)
DOI: 10.1016/j.neunet.2012.09.020. PubMed ID: 23140797.
- Quantization Noise: Roundoff Error in Digital Computation, Signal Processing, Control, and Communications. Cambridge University Press, 2008.
- Statistical efficiency of adaptive algorithms. INNS/IEEE International Joint Conference on Neural Networks (IJCNN 03). Pergamon-Elsevier Science Ltd., 2003: 735–44.
Abstract: The statistical efficiency of a learning algorithm applied to the adaptation of a given set of variable weights is defined as the ratio of the quality of the converged solution to the amount of data used in training the weights. Statistical efficiency is computed by averaging over an ensemble of learning experiences. A high quality solution is very close to optimal, while a low quality solution corresponds to noisy weights and less than optimal performance. In this work, two gradient descent adaptive algorithms are compared, the LMS algorithm and the LMS/Newton algorithm. LMS is simple and practical, and is used in many applications worldwide. LMS/Newton is based on Newton's method and the LMS algorithm. LMS/Newton is optimal in the least squares sense. It maximizes the quality of its adaptive solution while minimizing the use of training data. Many least squares adaptive algorithms have been devised over the years, but no other least squares algorithm can give better performance, on average, than LMS/Newton. LMS is easily implemented, but LMS/Newton, although of great mathematical interest, cannot be implemented in most practical applications. Because of its optimality, LMS/Newton serves as a benchmark for all least squares adaptive algorithms. The performances of LMS and LMS/Newton are compared, and it is found that under many circumstances, both algorithms provide equal performance. For example, when both algorithms are tested with statistically nonstationary input signals, their average performances are equal. When adapting with stationary input signals and with random initial conditions, their respective learning times are on average equal. However, under worst-case initial conditions, the learning time of LMS can be much greater than that of LMS/Newton, and this is the principal disadvantage of the LMS algorithm. But the strong points of LMS are ease of implementation and optimal performance under important practical conditions. For these reasons, the LMS algorithm has enjoyed very widespread application. It is used in almost every modem for channel equalization and echo cancelling. Furthermore, it is related to the famous backpropagation algorithm used for training neural networks. (A side-by-side sketch of the two update rules appears after the publication list.)
DOI: 10.1016/S0893-6080(03)00126-6. Web of Science ID: 000184011900028. PubMed ID: 12850029.
- Least-Mean-Square Adaptive Filters. Wiley-Interscience, 2003.
- Statistical Efficiency of Adaptive Algorithms. Neural Networks 2003: 735-744.
- Neurointerfaces. IEEE Transactions on Control Systems Technology 2002: 221-228.
- A Microphone Array for Hearing Aids. IEEE Circuits and Systems Magazine 2001: 26-32.
- Adaptive inverse control based on linear and nonlinear adaptive filtering. International Workshop on Neural Networks for Identification, Control, Robotics, and Signal/Image Processing. IEEE Computer Society, 1996: 30–38. Web of Science ID: A1996BG16U00004.
- Nonlinear Control with Neural Networks. In Backpropagation: Theory, Architectures, and Applications. Erlbaum Associates, 1995.
- Noise Canceling and Channel Equalization. In Handbook of Brain Theory and Neural Networks. MIT Press, 1995.
- 30 Years of Adaptive Neural Networks: Perceptron, Madaline, and Backpropagation. In Neural Networks: Theoretical Foundations and Analysis. IEEE Press, 1992.
- 30 Years of Adaptive Neural Networks: Perceptron, Madaline, and Backpropagation. In Artificial Neural Networks: Paradigms, Applications, and Hardware Implementation. IEEE Press, 1992: 82–108.
- 30 Years of Adaptive Neural Networks: Perceptron, Madaline, and Backpropagation. Proceedings of the IEEE 1990: 1415–42.
- Fundamental Relations Between the LMS Algorithm and the DFT. IEEE Transactions on Circuits and Systems 1987: 814-820.
- Adaptive Signal Processing. Prentice Hall, 1985.
- On the Statistical Efficiency of the LMS Algorithm with Nonstationary Inputs. IEEE Transactions on Information Theory 1984: 211-221.
- Adaptive Filters. In Aspects of Network and System Theory. Holt, Rinehart and Winston, 1971.
- Adaptive Antenna Systems [a Citation Classic]. Proceedings of the IEEE 1967: 2143–59.
- Statistical Analysis of Amplitude-Quantized Sampled-Data Systems. AIEE Transactions on Applications and Industry 1961: 1-14.
- Adaptive Switching Circuits. IRE WESCON Convention Record 1960: 96–104.
- A Study of Rough Amplitude Quantization by Means of Nyquist Sampling Theory. IRE Transactions on Circuit Theory 1956; CT-3(4): 266-276.
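The "Cognitive memory" abstract above describes storage in "memory folders" and a two-stage retrieval (an autoassociative-network screen followed by exhaustive search), but gives no implementation. The toy Python sketch below is only meant to make that data flow concrete; the class name, the fixed-length vector patterns, and the cosine-correlation screen standing in for the paper's autoassociative-network stage are all assumptions of this sketch, not the paper's design.

```python
import numpy as np

class ToyCognitiveMemory:
    """Toy sketch of the scheme in the abstract: sensory patterns about one
    object live together in a 'memory folder'; retrieval is by correlation
    with a prompt, not by address or key word. For simplicity, every
    pattern here is a fixed-length vector."""

    def __init__(self):
        self.folders = []  # stored at the next available slot; no addressing

    def store(self, patterns):
        """Store a group of sensory patterns together as one folder."""
        self.folders.append([np.asarray(p, float) for p in patterns])

    @staticmethod
    def _corr(a, b):
        """Normalized correlation (cosine similarity) between two patterns."""
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    def retrieve(self, prompt, screen_k=3):
        prompt = np.asarray(prompt, float)
        # Stage 1 (cheap screen): correlate the prompt with one summary
        # vector per folder (a stand-in for the autoassociative-net stage).
        summaries = [np.mean(f, axis=0) for f in self.folders]
        order = np.argsort([self._corr(prompt, s) for s in summaries])[::-1]
        candidates = order[:screen_k]
        # Stage 2 (exhaustive): compare the prompt with every pattern in
        # each surviving folder and return the best-matching folder whole.
        best = max(candidates,
                   key=lambda i: max(self._corr(prompt, p)
                                     for p in self.folders[i]))
        return self.folders[best]
```

Storing at the next free slot and returning the whole folder at once mirrors the abstract's claim that sights, sounds, and other modalities are recalled together from a single prompt.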
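The No-Prop abstract above is explicit about the algorithm: hidden-layer weights fixed at random, output-layer weights trained by the Widrow-Hoff LMS rule via steepest descent on mean square error. A minimal sketch under assumed dimensions and hyperparameters (layer width, learning rate, tanh hidden nonlinearity) might look like this; none of those specific choices come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_no_prop(X, T, n_hidden=50, mu=0.01, epochs=100):
    """No-Prop sketch: random fixed hidden layer, LMS-trained output layer.

    X: (n_patterns, n_in) inputs; T: (n_patterns, n_out) targets.
    """
    n_in, n_out = X.shape[1], T.shape[1]
    # Hidden weights are set once, at random, and never trained.
    W_hidden = rng.standard_normal((n_in, n_hidden))
    H = np.tanh(X @ W_hidden)               # fixed nonlinear features
    # Only the output weights adapt, by the Widrow-Hoff LMS rule.
    W_out = np.zeros((n_hidden, n_out))
    for _ in range(epochs):
        for h, t in zip(H, T):
            err = t - h @ W_out             # instantaneous error
            W_out += mu * np.outer(h, err)  # steepest-descent (LMS) step
    return W_hidden, W_out

def predict(X, W_hidden, W_out):
    return np.tanh(X @ W_hidden) @ W_out
```

Because only the output layer adapts, training reduces to a linear LMS problem in the fixed hidden features, which is consistent with the abstract's report that No-Prop is much simpler to implement and converges faster than Back-Prop.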
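The "Statistical efficiency of adaptive algorithms" abstract compares LMS with LMS/Newton in words; in update-rule form the contrast is a single line. The sketch below assumes the input-correlation matrix R is known so that its inverse can be applied, which is exactly the condition the abstract says rarely holds in practice (and why LMS/Newton serves as a benchmark rather than a deployable algorithm).

```python
import numpy as np

def lms_step(w, x, d, mu):
    """One Widrow-Hoff LMS step: steepest descent along the instantaneous
    gradient estimate. w: weight vector, x: input vector, d: desired response."""
    e = d - w @ x                        # instantaneous error
    return w + 2.0 * mu * e * x

def lms_newton_step(w, x, d, mu, R_inv):
    """One LMS/Newton step: the same instantaneous gradient estimate,
    premultiplied by the inverse input-correlation matrix R^{-1}, so that
    convergence does not depend on the eigenvalue spread of R."""
    e = d - w @ x
    return w + 2.0 * mu * R_inv @ (e * x)
```

With white input (R equal to the identity) the two steps coincide, matching the abstract's observation that the algorithms often perform equally; the gap opens under correlated inputs with worst-case initial conditions, which the abstract identifies as LMS's principal disadvantage.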