Lesson 4: Introduction to Neural Networks
Explore how artificial neural networks mimic the human brain to recognize patterns and make predictions, with hands-on coding activities and visual demonstrations.
Learning Objectives
- Understand the biological inspiration for artificial neural networks and how they mimic neuron structure and function
- Explain the architecture of neural networks, including input layers, hidden layers, output layers, weights, and biases
- Implement a simple neural network using Python to solve a classification problem
- Analyze how training works through forward propagation, loss calculation, and backpropagation
Standards Alignment
- CSTA 3B-AP-08: Describe how artificial intelligence drives many software and physical systems
- CSTA 3B-AP-16: Demonstrate code reuse by creating programming solutions using libraries and APIs
- Common Core Math HSF-IF.C.7: Graph functions expressed symbolically and show key features of the graph
- NGSS HS-LS1-2: Develop and use a model to illustrate the hierarchical organization of interacting systems
Materials Needed
- Computer with Python 3.x installed (one per student or pair)
- Jupyter Notebook or Google Colab access
- Required libraries: NumPy, Matplotlib, scikit-learn (installation guide provided)
- Neuron diagram handout (biological vs. artificial comparison)
- Network architecture visualization poster
- Dataset: Iris flowers or MNIST digits (included in starter code)
- Projector for live coding demonstrations
Lesson Procedure
1. Hook - The Pattern Recognition Challenge (10 minutes)
Present students with pattern recognition tasks that humans do effortlessly but were historically difficult for computers:
- Recognize handwritten digits with varied styles
- Identify objects in cluttered images
- Distinguish between cat and dog photos
Discussion: "How does your brain know this is a '7' even though everyone writes it differently? Today we'll learn how to build artificial systems that can learn patterns like your brain does."
2. Biological Inspiration - From Brain to Code (20 minutes)
Part 1: Biological Neurons
- Structure: dendrites (receive signals), cell body (processes), axon (transmits)
- Function: electrical signals, threshold activation, network connections
- Learning: synaptic plasticity, strengthening/weakening connections
Part 2: Artificial Neurons (Perceptrons)
- Inputs (like dendrites) with weights (connection strength)
- Weighted sum + bias (like cell body processing)
- Activation function (threshold, like firing/not firing)
- Output (like axon transmission)
Mathematical Model: output = activation(Σ(input × weight) + bias)
Use the comparison handout to show side-by-side diagrams. Emphasize: Neural networks are inspired by biology but work quite differently!
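The mathematical model above translates almost line-for-line into NumPy. This is a minimal sketch of one artificial neuron; the input values, weights, and bias are made-up numbers for illustration:

```python
import numpy as np

def sigmoid(z):
    # Squashes the weighted sum into (0, 1) -- a soft version of fire/don't fire
    return 1.0 / (1.0 + np.exp(-z))

# Three inputs arriving at the "dendrites", one weight per connection
inputs = np.array([0.5, 0.3, 0.2])
weights = np.array([0.4, 0.7, -0.2])
bias = 0.1

# output = activation(sum(input * weight) + bias)
weighted_sum = np.dot(inputs, weights) + bias
output = sigmoid(weighted_sum)
print(round(float(output), 3))
```

Students can change a weight's sign or magnitude and re-run to see how "connection strength" shifts the output, previewing what training will do automatically.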
3. Network Architecture Exploration (15 minutes)
Layer Types:
- Input Layer: Receives raw data (pixels, features, etc.)
- Hidden Layers: Extract increasingly complex patterns
- Output Layer: Produces final prediction or classification
Key Concepts:
- Weights: Strength of connections between neurons, learned during training
- Biases: Threshold adjustments for each neuron
- Activation Functions: ReLU, Sigmoid, Tanh - introduce non-linearity
- Deep Learning: Networks with multiple hidden layers
Draw network diagrams on board showing information flow. Use interactive visualization tools if available (TensorFlow Playground).
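The three activation functions named above can be graphed with Matplotlib, which also connects to the HSF-IF.C.7 graphing standard. A possible demo sketch (axis range and sample count are arbitrary choices):

```python
import numpy as np
import matplotlib.pyplot as plt

z = np.linspace(-5, 5, 200)

# The three activation functions introduced in this lesson
activations = {
    "ReLU": np.maximum(0, z),
    "Sigmoid": 1.0 / (1.0 + np.exp(-z)),
    "Tanh": np.tanh(z),
}

for name, values in activations.items():
    plt.plot(z, values, label=name)
plt.axhline(0, color="gray", linewidth=0.5)
plt.xlabel("z (weighted sum + bias)")
plt.ylabel("activation(z)")
plt.title("Common activation functions")
plt.legend()
plt.savefig("activations.png")  # use plt.show() in a notebook
```

Ask students to describe each curve's key features: where it is flat, where it is steep, and what range of outputs it can produce.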
4. Hands-On Coding - Build Your First Network (45 minutes)
Guided Coding Exercise: Create a neural network to classify iris flowers
Step 1: Setup and Data Loading
- Import libraries: NumPy, scikit-learn
- Load Iris dataset
- Explore data: features, labels, dimensions
- Split into training and testing sets
Step 2: Build the Network
- Define network architecture: input size, hidden layers, output size
- Initialize weights and biases randomly
- Implement activation function (ReLU, Sigmoid)
- Code forward propagation
Step 3: Training Process
- Define loss function (cross-entropy)
- Implement gradient descent
- Code backpropagation (or use a built-in library)
- Train for multiple epochs, track loss
Step 4: Evaluation and Visualization
- Test on held-out data
- Calculate accuracy
- Plot training loss over time
- Visualize decision boundaries
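Steps 3 and 4 can be sketched with scikit-learn's `MLPClassifier`, the built-in-library route mentioned above: it minimizes cross-entropy via gradient descent with backpropagation internally. The layer size, iteration cap, and seed here are example choices:

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# One hidden layer of 10 ReLU neurons; cross-entropy loss is minimized
# internally by gradient descent with backpropagation
model = MLPClassifier(
    hidden_layer_sizes=(10,),
    activation="relu",
    max_iter=1000,
    random_state=42,
)
model.fit(X_train, y_train)

# Evaluate on held-out data
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"Test accuracy: {accuracy:.2f}")

# Plot training loss over time (one value per iteration)
plt.plot(model.loss_curve_)
plt.xlabel("Iteration")
plt.ylabel("Cross-entropy loss")
plt.title("Training loss")
plt.savefig("loss_curve.png")  # use plt.show() in a notebook
```

The downward-sloping loss curve is the visual evidence that learning is happening, and the test accuracy checks whether that learning generalizes.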
Teaching Strategy: Code together, explaining each section. Pause for questions. Students type along. Checkpoint: ensure everyone's code runs before moving forward.
5. Experimentation and Analysis (20 minutes)
Students modify their neural networks to explore how architecture affects performance:
Experiments to Try:
- Change number of hidden layers (1 vs. 2 vs. 3)
- Adjust neurons per layer (10 vs. 50 vs. 100)
- Try different activation functions
- Modify learning rate
- Increase/decrease training epochs
Analysis Questions:
- How does network depth affect accuracy?
- What happens with too few or too many neurons?
- Can you identify overfitting or underfitting?
- What is the training time tradeoff?
Students record observations in lab notebook format. Share interesting findings with class.
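One way to structure the experiments is a loop over candidate architectures that records test accuracy for each; the specific layer shapes below are just example settings for students to extend:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Compare depth and width: one wide layer vs. several stacked layers
results = {}
for hidden_layers in [(10,), (50,), (10, 10), (50, 50, 50)]:
    model = MLPClassifier(
        hidden_layer_sizes=hidden_layers,
        max_iter=2000,
        random_state=42,
    )
    model.fit(X_train, y_train)
    results[hidden_layers] = model.score(X_test, y_test)

# Print architectures from best to worst test accuracy
for layers, acc in sorted(results.items(), key=lambda kv: -kv[1]):
    print(f"{layers}: accuracy = {acc:.2f}")
```

Students can add learning rate (`learning_rate_init`) or activation function to the loop and record the table of results in their lab notebooks.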
6. Reflection and Real-World Applications (10 minutes)
Discussion: "Now that you understand how neural networks work, where do you see them in the real world?"
Applications to Explore:
- Image recognition (medical diagnosis, facial recognition, autonomous vehicles)
- Natural language processing (translation, chatbots, voice assistants)
- Recommendation systems (Netflix, Spotify, Amazon)
- Game playing (chess, Go, video games)
- Scientific research (protein folding, climate modeling)
Exit Ticket: "Explain in your own words how a neural network learns. What surprised you most about how they work?"
Assessment Strategies
Formative Assessment
- Code functionality checks at each step
- Verbal explanations of concepts during coding
- Quality of experimentation and documentation
- Participation in discussions
- Troubleshooting approach when encountering errors
Summative Assessment
- Working neural network code with documentation
- Experimental results report with analysis
- Written explanation of network architecture and learning
- Optional: Apply network to new dataset (MNIST digits)
- Quiz on neural network concepts and mathematics
Success Criteria
Students demonstrate mastery when they can:
- Explain biological inspiration for neural networks
- Describe network architecture components
- Code a functional neural network from scratch
- Interpret training loss and accuracy metrics
- Analyze effects of hyperparameter changes
- Connect concepts to real-world applications
Differentiation Strategies
For Advanced Learners:
- Implement backpropagation from scratch without libraries
- Explore convolutional neural networks for image classification
- Research and present on advanced architectures (ResNet, Transformers)
- Optimize network using advanced techniques (dropout, batch normalization)
- Work with larger datasets like CIFAR-10
For Students New to Programming:
- Provide complete starter code with clear comments
- Use higher-level libraries (Keras/TensorFlow) that abstract complexity
- Focus on understanding concepts rather than implementation details
- Pair with experienced programmers
- Allow use of no-code neural network builders (e.g., Google's Teachable Machine)
For Visual Learners:
- Use TensorFlow Playground for interactive visualization
- Emphasize graphing and visualization of results
- Create flowcharts of the training process
- Watch animated explanations (3Blue1Brown neural network series)
Extension Activities
Project Ideas:
- Custom Dataset: Train network on student-collected data (plant species, car models, etc.)
- Transfer Learning: Use pre-trained networks and fine-tune for specific tasks
- Comparative Study: Test different ML algorithms, compare to neural networks
- Real-World Problem: Apply networks to local issue (traffic prediction, energy usage, weather)
Advanced Topics:
- Recurrent neural networks for sequence data
- Generative adversarial networks for creating images
- Reinforcement learning for game agents
- Neural network interpretability and explainable AI
Download Lesson Materials
Access all lesson materials, including Jupyter notebooks, datasets, diagrams, and assessment tools. Each file can be downloaded individually.
Coding Resources
- Neural Networks Starter Notebook (Jupyter)
- Neural Networks Solutions Notebook (Jupyter)
- Iris Sample Dataset (CSV)