Biological vs Artificial Neurons

Understanding the Inspiration Behind Neural Networks

🧠 Biological Neuron

[Diagram: biological neuron — dendrites, cell body (soma), nucleus, axon, axon terminals]

Key Components:

  • Dendrites: Receive incoming signals from other neurons via synapses
  • Cell Body (Soma): Processes incoming signals and integrates information
  • Nucleus: Contains genetic material and controls cell function
  • Axon: Transmits electrical impulses away from cell body
  • Myelin Sheath: Insulates axon and speeds up signal transmission
  • Axon Terminals: Release neurotransmitters to communicate with next neuron

How It Works:

  • Receives chemical signals through synapses
  • Converts to electrical signals in dendrites
  • Sums signals in cell body
  • Fires action potential if threshold reached
  • All-or-nothing response (binary)
  • Signal travels down the axon at up to ~100 m/s in myelinated axons
  • Releases neurotransmitters at terminals
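The threshold behavior above can be caricatured in a few lines of Python. This is a deliberately crude sketch of "sum the inputs, fire if the total crosses a threshold"; the function name and the threshold value are illustrative, not biology.

```python
# Toy all-or-nothing neuron: sums incoming signals and fires only
# if the total crosses a threshold (a crude caricature of a spike).
def fires(incoming_signals, threshold=1.0):
    total = sum(incoming_signals)  # stand-in for integration in the soma
    return total >= threshold      # all-or-nothing: it spikes or it doesn't

print(fires([0.3, 0.4]))       # total 0.7, below threshold -> False
print(fires([0.6, 0.5, 0.2]))  # total 1.3, above threshold -> True
```

Note what is missing: real neurons also have refractory periods, decaying membrane potentials, and timing effects that this sketch ignores.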

🤖 Artificial Neuron (Perceptron)

[Diagram: perceptron — inputs x₁ … xₙ, weights w₁ … wₙ, summation Σ plus bias, activation f, output y]

y = f(Σ(xᵢ × wᵢ) + b)

Key Components:

  • Inputs (x₁, x₂, ..., xₙ): Numerical values representing features
  • Weights (w₁, w₂, ..., wₙ): Learned parameters that determine importance
  • Bias (b): Threshold adjustment for activation
  • Summation (Σ): Weighted sum of all inputs plus bias
  • Activation Function (f): Non-linear transformation (ReLU, sigmoid, etc.)
  • Output (y): Final prediction or signal to next layer
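The components above combine into the single formula y = f(Σ(xᵢ × wᵢ) + b), which is short enough to write out directly. A minimal sketch, using sigmoid as the activation (the input and weight values are made up for illustration):

```python
import math

def perceptron(inputs, weights, bias):
    # Weighted sum of inputs plus bias: z = sum(x_i * w_i) + b
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid activation squashes z into (0, 1); ReLU would be max(0, z)
    return 1 / (1 + math.exp(-z))

y = perceptron([1.0, 0.0, 2.0], [0.5, -0.3, 0.25], bias=-0.5)
print(y)  # a value between 0 and 1
```

Swapping the activation function changes the output range but not the structure: the weighted sum is always computed first, then transformed.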

How It Works:

  • Receives numerical inputs
  • Multiplies each input by its weight
  • Sums all weighted inputs plus bias
  • Applies activation function
  • Produces output value
  • Weights adjusted during training (backpropagation)
  • Can be connected to form networks
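The training step can be sketched with the classic perceptron learning rule — a simpler precursor to backpropagation, used here because a single neuron has no hidden layers to propagate errors through. The AND-gate data, learning rate, and epoch count are illustrative choices:

```python
# Classic perceptron learning rule on the AND function.
def step(z):
    return 1 if z >= 0 else 0

data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]  # AND gate
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                      # a few passes over the data
    for x, target in data:
        y = step(x[0] * w[0] + x[1] * w[1] + b)
        error = target - y               # -1, 0, or +1
        w[0] += lr * error * x[0]        # nudge each weight toward the target
        w[1] += lr * error * x[1]
        b += lr * error

print([step(x[0] * w[0] + x[1] * w[1] + b) for x, _ in data])  # -> [0, 0, 0, 1]
```

Backpropagation generalizes this idea: instead of a ±1 correction on one neuron, it uses gradients to assign each weight in a multi-layer network its share of the error.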

Key Insight: While artificial neurons are inspired by biological neurons, they are mathematical simplifications. Biological neurons use chemical and electrical signals, while artificial neurons use numerical computations. The power comes from connecting many simple artificial neurons into complex networks!

Side-by-Side Comparison

Aspect           | Biological Neuron                                | Artificial Neuron
-----------------|--------------------------------------------------|-------------------------------------------
Input            | Chemical signals (neurotransmitters)             | Numerical values
Processing       | Electrochemical integration                      | Mathematical weighted sum
Threshold        | Action potential threshold (≈ −55 mV)            | Bias parameter (learned)
Activation       | All-or-nothing spike                             | Continuous function (sigmoid, ReLU)
Output           | Electrical impulse + neurotransmitter release    | Numerical value
Learning         | Synaptic plasticity (chemical changes)           | Weight adjustment (gradient descent)
Speed            | Up to ~100 m/s (slower, but massively parallel)  | Near-instant (sequential in basic models)
Number in System | ~86 billion in the human brain                   | Thousands to billions in large networks
Energy           | Very efficient (~20 W for the entire brain)      | Can be power-intensive for large models
Fault Tolerance  | Highly resilient; can compensate for damage      | Sensitive to errors; needs careful design
Precision        | Noisy, approximate                               | High numerical precision
Adaptability     | Continuously adapts throughout life              | Requires an explicit training phase

Evolve AI Institute | Lesson 4: Introduction to Neural Networks

Free AI Education for All