Bio


Bernard Widrow is Professor Emeritus in the Electrical Engineering Department at Stanford University. His research focuses on adaptive signal processing, adaptive control systems, adaptive neural networks, human memory, cybernetics, and human-like memory for computers. Applications include signal processing, prediction, noise cancelling, adaptive arrays, control systems, and pattern recognition. Before coming to Stanford in 1959, he taught at MIT, where he received his Doctor of Science degree in 1956.

Honors & Awards


  • Citation Classic for paper "Adaptive Antenna Systems," Proceedings of the IEEE, December 1967, Institute of Electrical and Electronics Engineers (IEEE)
  • Benjamin Franklin Medal, The Franklin Institute (2001)
  • IEEE Millennium Medal, Institute of Electrical and Electronics Engineers (IEEE) (2000)
  • Donald O. Hebb Award, International Neural Network Society
  • Signal Processing Society Award, Institute of Electrical and Electronics Engineers (IEEE) (1999)
  • Silicon Valley Engineering Hall of Fame, Silicon Valley Engineering Council (1999)
  • Member, National Academy of Engineering (1995)
  • Neural Networks Pioneer Medal, Institute of Electrical and Electronics Engineers (IEEE) (1991)
  • Alexander Graham Bell Medal, Institute of Electrical and Electronics Engineers (IEEE) (1986)
  • Centennial Medal, Institute of Electrical and Electronics Engineers (IEEE) (1984)
  • Fellow, American Association for the Advancement of Science (1980)
  • Fellow, Institute of Electrical and Electronics Engineers (IEEE) (1976)
  • Francqui Lecture Chair, University of Louvain, Belgium (1967)

Boards, Advisory Committees, Professional Organizations


  • Editorial Board, Neural Networks (2014 - 2018)
  • Associate Editor, Pattern Recognition (2014 - 2016)
  • Associate Editor, Information Sciences (2014 - 2015)
  • Associate Editor, Circuits, Systems and Signal Processing (2014 - 2015)
  • Chair, Silicon Valley Engineering Council Hall of Fame Awards Committee (2006 - 2008)
  • President, International Neural Network Society (1989 - 1990)
  • Governing Board Member, International Neural Network Society (1988 - 1991)
  • Chairman, DARPA Neural Network Study (1987 - 1988)

Professional Education


  • Sc.D., Massachusetts Institute of Technology, Electrical Engineering (1956)
  • S.M., Massachusetts Institute of Technology, Electrical Engineering (1953)
  • S.B., Massachusetts Institute of Technology, Electrical Engineering (1951)

Patents


  • B. Widrow, J.C. Aragon, B.M. Percival. "United States Patent 7,333,963 Cognitive Memory and Auto-Associative Neural Network Based Search Engine for Computer and Network Located Images and Photographs", Feb 1, 2008
  • B. Widrow. "United States Patent 7,187,907 Simultaneous Two-Way Transmission of Information Signals in the Same Frequency Band", Mar 1, 2007
  • M.A. Lehr and B. Widrow. "United States Patent 5,793,875 Directional Hearing System", Aug 1, 1998
  • B. Widrow. "United States Patent 5,737,430 Directional Hearing Aid", Apr 1, 1998
  • J. Rector, B. Marion, B. Widrow, and I.A. Salehi. "United States Patent 5,191,557 Signal Processing to Enable Utilization of a Rig Reference Sensor with a Drill Bit Seismic Source", Mar 1, 1993
  • J. Rector, B. Marion, B. Widrow, and I.A. Salehi. "United States Patent 5,050,130 Signal Processing to Enable Utilization of a Rig Reference Sensor with a Drill Bit Seismic Source", Sep 1, 1991
  • B. Widrow. "United States Patent 4,964,087 Seismic Processing and Imaging with a Drill-Bit Source", Oct 1, 1990
  • J. Rector, B. Marion, B. Widrow, and I.A. Salehi. "United States Patent 4,926,391 Signal Processing to Enable Utilization of a Rig Reference Sensor with a Drill Bit Seismic Source", May 1, 1990
  • B. Widrow. "United States Patent 4,858,130 Estimation of Hydraulic Fracture Geometry f rom Pumping Pressure Measurements", Aug 1, 1989
  • B. Widrow. "United States Patent 4,849,945 Seismic Processing and Imaging with a Drill- Bit Source", Jul 1, 1989
  • B. Widrow and M.N. Brearley. "United States Patent 4,751,738 Directional Hearing Aid", Jun 1, 1988
  • B. Widrow. "United States Patent 4,556,962 Seismic Exploration Method and Apparatus for Cancelling Interference from Seismic Vibration Source", Dec 1, 1985
  • B. Widrow. "United States Patent 4,537,200 ECG Enhancement by Adaptive Cancellation of Electrosurgical Interference", Aug 1, 1985
  • B. Widrow. "United States Patent 4,363,112 Apparatus and Method for Determining the Posi tion of a Gas-Saturated Porus Rock in the Vicinty of a Deep Borehole in the Earth", Dec 1, 1982
  • B. Widrow. "United States Patent 4,365,322 Apparatus and Method for Determining the Position of a Gas-Saturated Porus Rock in the Vicinty of a Deep Borehole in the Earth", Dec 1, 1982
  • J.R. Zeidler, J.M. McCool, and B. Widrow. "United States Patent 4,355,368 Adaptive Correlator", Oct 1, 1982
  • J.M. McCool, B. Widrow, J.R. Zeidler, R.H. Hearn, D.M. Chabries, and R.H. Moore. "United States Patent 4,243,935 Adaptive Detector", Jan 1, 1981
  • J.M. McCool, B. Widrow, J.R. Zeidler, R.H. Hearn, and D.M. Chabries. "United States Patent 4,238,746 Adaptive Line Enhancer", Dec 1, 1980
  • B. Widrow and M.E. Hoff, Jr. "United States Patent 3,454,753 Analog Multiplier and Modulating Circuits Employing Electrolytic Elements", Jul 1, 1969
  • B. Widrow, G. Frick, R.H. Gordon. "United States Patent 3,395,402 Adaptive Memory Element", Jul 1, 1968
  • B. Widrow and M.E. Hoff, Jr. "United Statesogic Circuit and Electrolyt ic Memory Element", Dec 1, 1965

Current Research and Scholarly Interests


Prof. Widrow's research focuses on adaptive signal processing, adaptive control systems, adaptive neural networks, human memory, and human-like memory for computers. Applications include signal processing, prediction, noise cancelling, adaptive arrays, control systems, and pattern recognition. Recent work is about human learning at the synaptic level.

Projects


  • Hearing Aid Device, Stanford University

    A directional acoustic receiving system is constructed in the form of a necklace, including an array of two or more microphones mounted on a housing supported on the chest of the user by a conducting loop encircling the user's neck. This method enables the design of highly directive hearing instruments that are comfortable, inconspicuous, and convenient to use. The array provides the user with a dramatic improvement in speech perception over existing hearing aid designs, particularly in the presence of background noise, reverberation, and feedback. (A minimal beamforming sketch illustrating the array-processing idea appears after the project list below.)

    B. Widrow, "A Microphone Array for Hearing Aids," IEEE Circuits and Systems Magazine, 1(2):26-32, 2001.

    Location

    Stanford, California

  • Quantization Noise, Stanford University

    Prof. Widrow's most recent book, Quantization Noise, co-authored with Istvan Kollar, is available for purchase at the Cambridge University Press website.

    Location

    Stanford, California
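
The project description above does not spell out the array processing, so the following is a minimal delay-and-sum beamforming sketch, one common way to make a small microphone array directional. The function name, array geometry, and parameters are illustrative assumptions, not the design from the patents or the cited paper.

```python
# Minimal delay-and-sum beamformer sketch (illustrative only; not the patented
# design described above). Assumes a small linear array and a far-field source
# arriving from the look direction.
import numpy as np

def delay_and_sum(signals, mic_positions_m, look_angle_deg, fs, c=343.0):
    """signals: (n_mics, n_samples) synchronized recordings;
    mic_positions_m: (n_mics,) positions along the array axis in meters;
    look_angle_deg: arrival angle relative to broadside; fs: sample rate in Hz."""
    n_mics, n_samples = signals.shape
    # Plane-wave delay at each microphone for the chosen look direction.
    delays_s = mic_positions_m * np.sin(np.deg2rad(look_angle_deg)) / c
    delays = np.round(delays_s * fs).astype(int)
    delays -= delays.min()                         # make all delays non-negative
    out = np.zeros(n_samples)
    for m in range(n_mics):
        d = delays[m]
        # Advance each channel so wavefronts from the look direction line up.
        out[: n_samples - d] += signals[m, d:]
    return out / n_mics                            # unity gain toward the look direction

# Example: two microphones 10 cm apart, steered 30 degrees off broadside.
fs = 16000
t = np.arange(fs) / fs
mics = np.array([0.0, 0.10])
tone = np.sin(2 * np.pi * 440 * t)
enhanced = delay_and_sum(np.vstack([tone, tone]), mics, 30.0, fs)
```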

All Publications


  • The Hebbian-LMS Learning Algorithm IEEE COMPUTATIONAL INTELLIGENCE MAGAZINE Widrow, B., Kim, Y., Park, D. 2015; 10 (4): 37-53
  • The Back-Prop and No-Prop Training Algorithms COGNITIVE COMPUTATION Widrow, B. 2015
  • Cognitive memory NEURAL NETWORKS Widrow, B., Aragon, J. C. 2013; 41: 3-14

    Abstract

    Regarding the workings of the human mind, memory and pattern recognition seem to be intertwined. You generally do not have one without the other. Taking inspiration from life experience, a new form of computer memory has been devised. Certain conjectures about human memory are keys to the central idea. The design of a practical and useful "cognitive" memory system is contemplated, a memory system that may also serve as a model for many aspects of human memory. The new memory does not function like a computer memory where specific data is stored in specific numbered registers and retrieval is done by reading the contents of the specified memory register, or done by matching key words as with a document search. Incoming sensory data would be stored at the next available empty memory location, and indeed could be stored redundantly at several empty locations. The stored sensory data would neither have key words nor would it be located in known or specified memory locations. Sensory inputs concerning a single object or subject are stored together as patterns in a single "file folder" or "memory folder". When the contents of the folder are retrieved, sights, sounds, tactile feel, smell, etc., are obtained all at the same time. Retrieval would be initiated by a query or a prompt signal from a current set of sensory inputs or patterns. A search through the memory would be made to locate stored data that correlates with or relates to the prompt input. The search would be done by a retrieval system whose first stage makes use of autoassociative artificial neural networks and whose second stage relies on exhaustive search. Applications of cognitive memory systems have been made to visual aircraft identification, aircraft navigation, and human facial recognition. Concerning human memory, reasons are given why it is unlikely that long-term memory is stored in the synapses of the brain's neural networks. Reasons are given suggesting that long-term memory is stored in DNA or RNA. Neural networks are an important component of the human memory system, and their purpose is for information retrieval, not for information storage. The brain's neural networks are analog devices, subject to drift and unplanned change. Only with constant training is reliable action possible. Good training time is during sleep and while awake and making use of one's memory. A cognitive memory is a learning system. Learning involves storage of patterns or data in a cognitive memory. The learning process for cognitive memory is unsupervised, i.e. autonomous.

    View details for DOI 10.1016/j.neunet.2013.01.016

    View details for Web of Science ID 000318209900002

    View details for PubMedID 23453302
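
    The abstract above describes a two-stage retrieval architecture: related sensory patterns are stored together in "memory folders", and a prompt pattern retrieves the best-matching folder. The sketch below illustrates only the exhaustive-search stage, using a simple normalized-correlation match; the autoassociative-network first stage is omitted, and names such as CognitiveMemory and recall are illustrative rather than taken from the paper.

    ```python
    # Illustrative sketch of the exhaustive-search stage of a cognitive memory:
    # patterns belonging to one object are stored together in a folder, and a
    # prompt retrieves the folder whose contents correlate best with it.
    import numpy as np

    class CognitiveMemory:
        def __init__(self):
            self.folders = []                    # each folder holds related patterns

        def store(self, patterns):
            """Store a group of related sensory patterns as one memory folder."""
            self.folders.append([np.asarray(p, dtype=float) for p in patterns])

        def recall(self, prompt):
            """Return the folder whose stored patterns correlate best with the prompt."""
            prompt = np.asarray(prompt, dtype=float)
            best_folder, best_score = None, -np.inf
            for folder in self.folders:          # exhaustive search over all folders
                for pattern in folder:
                    # normalized correlation between the prompt and a stored pattern
                    score = np.dot(prompt, pattern) / (
                        np.linalg.norm(prompt) * np.linalg.norm(pattern) + 1e-12)
                    if score > best_score:
                        best_score, best_folder = score, folder
            return best_folder, best_score

    # Example: store visual and auditory patterns for two objects, then prompt
    # with a noisy version of one visual pattern.
    rng = np.random.default_rng(0)
    mem = CognitiveMemory()
    cat_image, cat_sound = rng.normal(size=64), rng.normal(size=64)
    dog_image, dog_sound = rng.normal(size=64), rng.normal(size=64)
    mem.store([cat_image, cat_sound])
    mem.store([dog_image, dog_sound])
    folder, score = mem.recall(cat_image + 0.3 * rng.normal(size=64))
    ```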

  • The No-Prop algorithm: a new learning algorithm for multilayer neural networks. Neural networks Widrow, B., Greenblatt, A., Kim, Y., Park, D. 2013; 37: 182-188

    Abstract

    A new learning algorithm for multilayer neural networks that we have named No-Propagation (No-Prop) is hereby introduced. With this algorithm, the weights of the hidden-layer neurons are set and fixed with random values. Only the weights of the output-layer neurons are trained, using steepest descent to minimize mean square error, with the LMS algorithm of Widrow and Hoff. The purpose of introducing nonlinearity with the hidden layers is examined from the point of view of Least Mean Square Error Capacity (LMS Capacity), which is defined as the maximum number of distinct patterns that can be trained into the network with zero error. This is shown to be equal to the number of weights of each of the output-layer neurons. The No-Prop algorithm and the Back-Prop algorithm are compared. Our experience with No-Prop is limited, but from the several examples presented here, it seems that the performance regarding training and generalization of both algorithms is essentially the same when the number of training patterns is less than or equal to LMS Capacity. When the number of training patterns exceeds Capacity, Back-Prop is generally the better performer. But equivalent performance can be obtained with No-Prop by increasing the network Capacity by increasing the number of neurons in the hidden layer that drives the output layer. The No-Prop algorithm is much simpler and easier to implement than Back-Prop. Also, it converges much faster. It is too early to definitively say where to use one or the other of these algorithms. This is still a work in progress.

    View details for DOI 10.1016/j.neunet.2012.09.020

    View details for PubMedID 23140797
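
    As the abstract above describes, No-Prop fixes the hidden-layer weights at random values and trains only the output layer with the LMS rule of Widrow and Hoff. The sketch below is a minimal illustration of that idea; the activation function, learning rate, and function names are assumptions made for the example, not taken from the paper.

    ```python
    # Minimal No-Prop-style training sketch: random fixed hidden layer,
    # linear output layer trained by the LMS rule (stochastic steepest
    # descent on mean square error).
    import numpy as np

    def train_no_prop(X, T, n_hidden=20, mu=0.05, epochs=500, seed=0):
        """X: (n_samples, n_inputs) patterns; T: (n_samples, n_outputs) targets."""
        rng = np.random.default_rng(seed)
        n_inputs = X.shape[1]
        # Hidden layer: random weights, set once and never updated.
        W_hidden = rng.normal(scale=1.0 / np.sqrt(n_inputs), size=(n_inputs, n_hidden))
        H = np.tanh(X @ W_hidden)                    # fixed hidden-layer responses
        # Output layer: trained with the LMS update, one pattern at a time.
        W_out = np.zeros((n_hidden, T.shape[1]))
        for _ in range(epochs):
            for i in rng.permutation(len(X)):
                err = T[i] - H[i] @ W_out            # error of the linear output neurons
                W_out += mu * np.outer(H[i], err)    # LMS: only output weights change
        return W_hidden, W_out

    def predict(X, W_hidden, W_out):
        return np.tanh(X @ W_hidden) @ W_out

    # Example: train on the XOR patterns.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    T = np.array([[0.], [1.], [1.], [0.]])
    W_h, W_o = train_no_prop(X, T)
    print(np.round(predict(X, W_h, W_o), 2))
    ```

    With four training patterns and twenty output-layer weights, the example stays well within the LMS Capacity discussed in the abstract, so near-zero training error is expected.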

  • Statistical Efficiency of Adaptive Algorithms Neural Networks Widrow, B., Kamenetsky, M. 2003: 735-744
  • Neurointerfaces IEEE Transactions on Control Systems Technology Widrow, B. 2002: 221-228
  • A Microphone Array for Hearing Aids IEEE Circuits and Systems Magazine Widrow, B. 2001: 26-32
  • Fundamental Relations Between the LMS Algorithm and the DFT IEEE Transactions on Circuits and Systems Widrow, B., Baudrenghien, P., Vetterli, M., Titchener, P. 1987: 814-820
  • On the Statistical Efficiency of the LMS Algorithm with Nonstationary Inputs IEEE Transactions on Information Theory Widrow, B., Walach, E. 1984: 211-221
  • Statistical Analysis of Amplitude-Quantized Sampled-Data Systems AIEE Transactions on Applications and Industry Widrow, B. 1961: 1-14
  • A Study of Rough Amplitude Quantization by Means of Nyquist Sampling Theory IRE Transactions on Circuit Theory Widrow, B. 1956; CT-3(4): 266-276