Grades 6-12 | Computer Science / STEM | 90-120 Minutes

Lesson 12: Build Your Own AI - Hands-On Project

Students will design and create their own simple AI application using beginner-friendly platforms and tools. This scaffolded project guides learners through the complete AI development process, from ideation to implementation and presentation, culminating in a showcase of student-created AI applications.

Learning Objectives

  • Create a functional AI application using beginner-friendly development tools and platforms
  • Apply the AI development lifecycle including planning, training, testing, and iteration
  • Demonstrate understanding of training data collection and model training processes
  • Evaluate their AI application's performance and identify areas for improvement
  • Present their AI project to peers, explaining design decisions and demonstrating functionality

Standards Alignment

  • CSTA 3A-AP-16: Design and iteratively develop computational artifacts for practical intent, personal expression, or to address a societal issue by using events to initiate instructions
  • CSTA 3B-AP-21: Develop and use a series of test cases to verify that a program performs according to its design specifications
  • ISTE 1.4.c: Students develop, test and refine prototypes as part of a cyclical design process
  • ISTE 1.5.c: Students break problems into component parts, extract key information, and develop descriptive models to understand complex systems or facilitate problem-solving
  • ISTE 1.6.b: Students create original works or responsibly repurpose or remix digital resources into new creations
  • NGSS MS-ETS1-2: Evaluate competing design solutions using a systematic process to determine how well they meet the criteria and constraints of the problem

Materials Needed

  • Computer or tablet with internet access for each student or small group (1:1 or 2:1 ratio recommended)
  • Webcam access for image or sound recognition projects (built-in or external)
  • Student accounts for chosen AI platform (Teachable Machine, MIT App Inventor, or Scratch - all free)
  • Project Planning Worksheet (included in downloadable materials)
  • AI Project Development Guide with step-by-step instructions for each platform (included in downloadable materials)
  • Testing and Evaluation Checklist (included in downloadable materials)
  • Project Presentation Template for showcase gallery (included in downloadable materials)
  • Projection system for demonstrations and presentations
  • Optional: Objects, images, or materials for training data collection (varies by project type)

Lesson Procedure

  1. Introduction and Project Overview (10 minutes)

    Begin with an engaging overview of the hands-on AI building project. Explain that students will become AI developers today, creating their own working AI applications that they can share with others.

    Opening Hook:

    • Show 2-3 examples of simple AI projects created by students (image classifiers, sound recognizers, gesture controls)
    • Demonstrate a completed project to show what's possible
    • Ask: "What kind of AI would you like to create? What problem could it solve?"

    Project Options Overview: Present the three main platform options, explaining that each has different strengths:

    • Teachable Machine (Google): Best for image, sound, or pose recognition projects. Very beginner-friendly with quick results
    • MIT App Inventor: Create mobile apps with AI features like image recognition or chatbots
    • Scratch with ML Extensions: Integrate machine learning into interactive games and stories (custom model training works through a companion tool such as Machine Learning for Kids)

    Distribute the Project Planning Worksheet and explain that students will start by planning their AI project before beginning development.

  2. Project Planning and Design (15 minutes)

    Guide students through the planning process using the Project Planning Worksheet. Emphasize that good planning leads to better projects and helps avoid common pitfalls.

    Planning Components:

    • Problem Identification: What problem or need will your AI address? Who is your user?
    • AI Type Selection: Will your AI recognize images, sounds, poses, or text? Why is this the best approach?
    • Platform Choice: Which tool will you use? (Help students match their idea to the appropriate platform)
    • Training Data Plan: What categories will your AI distinguish between? What examples will you collect?
    • Success Criteria: How will you know if your AI works well? What accuracy is acceptable?

    Project Idea Examples to Inspire Students:

    • Recycling sorter that identifies plastic, paper, metal, and glass
    • Emotion detector that recognizes happy, sad, surprised, and neutral faces
    • Musical instrument classifier that identifies guitar, piano, drums, and voice
    • Hand gesture controller for games (rock, paper, scissors, thumbs up, peace sign)
    • Plant identifier for common houseplants or trees
    • Sign language alphabet recognizer

    Circulate to approve project plans, ensuring they are achievable within the time frame and appropriate for the chosen platform. Provide guidance on scope - projects should be challenging but completable.

  3. Platform Tutorial and Setup (15-20 minutes)

    Provide targeted tutorials based on which platform(s) students have chosen. Use a station approach if students are using different platforms, or conduct whole-class instruction if everyone is using the same tool.

    For Teachable Machine Users:

    • Navigate to teachablemachine.withgoogle.com
    • Demonstrate creating a new image, audio, or pose project
    • Show how to add classes (categories to distinguish between)
    • Explain capturing training samples: aim for 50-100 samples per class with variety
    • Demonstrate training the model and testing it
    • Show how to export the model for use in other projects (an optional Python sketch of using an exported model follows this list)
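
    Optional enrichment for teachers who want to show what "export" actually produces: the sketch below is a minimal Python example, assuming the TensorFlow/Keras export option with its usual keras_model.h5 and labels.txt files and a 224x224 image input. Your download and the sample code on the export page are the authority on exact file names and preprocessing; this sketch also requires the tensorflow and Pillow packages.

    ```python
    # Sketch: classify one image with a Teachable Machine Keras export.
    # Assumptions to verify against your own export: file names keras_model.h5
    # and labels.txt, a 224x224 RGB input, and pixel values scaled to [-1, 1].
    import numpy as np
    from PIL import Image, ImageOps
    from tensorflow.keras.models import load_model

    model = load_model("keras_model.h5", compile=False)          # exported model file
    class_names = [line.strip() for line in open("labels.txt")]  # one label per line (may be prefixed with an index)

    image = Image.open("test_photo.jpg").convert("RGB")          # any test image
    image = ImageOps.fit(image, (224, 224))                      # resize/crop to the model's input size
    data = np.asarray(image, dtype=np.float32) / 127.5 - 1.0     # scale pixel values to [-1, 1]
    data = np.expand_dims(data, axis=0)                          # add a batch dimension

    probabilities = model.predict(data)[0]                       # one probability per class
    best = int(np.argmax(probabilities))
    print(f"Prediction: {class_names[best]} ({probabilities[best]:.0%} confidence)")
    ```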

    For MIT App Inventor Users:

    • Log into appinventor.mit.edu with student accounts
    • Demonstrate creating a new project
    • Show how to add the Image Classification or Personal Image Classifier extension
    • Explain the Designer vs. Blocks interface
    • Walk through a simple example: adding a button, camera, and image classifier
    • Show how to test the app using the MIT AI2 Companion app on mobile devices

    For Scratch with ML Extension Users:

    • Note that the standard Scratch editor at scratch.mit.edu does not train custom models on its own; the built-in Video Sensing extension reacts to motion but does not learn categories
    • For trainable models, use a companion tool such as Machine Learning for Kids, which collects training examples, trains a model, and then opens a customized Scratch editor with blocks for that model
    • Demonstrate creating sprites and backdrops
    • Explain how to train and test the model in the companion tool before building the Scratch project around it
    • Show how to connect the ML blocks to sprite actions and events

    Key Tutorial Points for All Platforms:

    • Emphasize the importance of diverse training data
    • Explain how to test as you go
    • Show how to save progress regularly
    • Demonstrate troubleshooting common issues

    Distribute the AI Project Development Guide with platform-specific step-by-step instructions students can reference during independent work.

  4. Guided Development - Training Data Collection (15-20 minutes)

    Support students as they collect and organize training data for their AI models. This is a critical phase that significantly impacts model performance.

    Training Data Best Practices:

    • Quantity: Aim for at least 50 samples per category; more is generally better
    • Variety: Include different angles, lighting conditions, backgrounds, and variations
    • Balance: Ensure roughly equal numbers of samples for each category (a quick counting sketch follows this list)
    • Quality: Samples should be clear and properly represent the category
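
    If students keep their collected samples in one folder per category (a common setup when reusing or exporting data), the short sketch below checks the quantity and balance guidelines above. The folder layout and the 50-sample target are assumptions drawn from this lesson, not requirements of any platform.

    ```python
    # Sketch: count training samples per category to check quantity and balance.
    # Assumes a hypothetical layout like training_data/<category>/<sample files>.
    from pathlib import Path

    data_dir = Path("training_data")
    counts = {folder.name: sum(1 for f in folder.iterdir() if f.is_file())
              for folder in data_dir.iterdir() if folder.is_dir()}

    for category, n in sorted(counts.items()):
        note = "" if n >= 50 else "  <-- below the 50-sample guideline"
        print(f"{category:20s} {n:4d} samples{note}")

    if counts and max(counts.values()) > 2 * min(counts.values()):
        print("Warning: categories are unbalanced; add samples to the smaller ones.")
    ```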

    Data Collection Strategies:

    • For image projects: Use the webcam to capture live samples, vary positions and backgrounds
    • For sound projects: Record in different locations, try different volumes and distances
    • For pose projects: Include multiple people if possible, try different clothing and lighting

    Circulate to ensure students are collecting quality, diverse training data. Point out when categories might be too similar or when more variety is needed. Remind students that good training data is the foundation of a good AI model.

    Common Issues to Watch For:

    • Insufficient samples (fewer than 30 per category)
    • Too much similarity in samples (all from same angle or lighting)
    • Unbalanced categories (50 samples of one class, 10 of another)
    • Background interference (model learning background instead of object)

  5. Model Training and Initial Testing (15-20 minutes)

    Guide students through training their AI models and conducting initial testing. This phase involves iteration and refinement.

    Training Process:

    • Click the "Train Model" button (Teachable Machine) or equivalent in other platforms
    • Explain that training may take 1-5 minutes depending on the amount of data
    • Discuss what's happening during training: the AI is learning patterns from the examples
    • Point out the progress indicators and any accuracy metrics shown

    Initial Testing Protocol:

    1. Test with samples similar to training data - does it recognize them correctly?
    2. Test with slightly different variations - does it still work?
    3. Test edge cases - what happens with ambiguous inputs?
    4. Test with items not in any category - does it handle unknowns appropriately?

    Evaluation Questions for Students:

    • What percentage of the time does your AI make the correct prediction? (A quick way to tally this is sketched after this list.)
    • Are there any categories it confuses frequently?
    • Does it work in different lighting or at different angles?
    • What types of inputs does it struggle with?
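
    If students record each test as a pair (the true category and what the AI predicted), those notes can be turned into an accuracy percentage and a simple confusion tally, as in the sketch below. The result data shown is made up for illustration.

    ```python
    # Sketch: compute accuracy and a confusion tally from recorded test results.
    # Each pair is (true category, category the AI predicted); the data is made up.
    from collections import Counter

    results = [
        ("plastic", "plastic"), ("paper", "paper"), ("metal", "glass"),
        ("glass", "glass"), ("plastic", "paper"), ("metal", "metal"),
    ]

    correct = sum(1 for truth, guess in results if truth == guess)
    print(f"Accuracy: {correct / len(results):.0%} ({correct} of {len(results)} correct)")

    # Which categories get mixed up most often?
    confusions = Counter((truth, guess) for truth, guess in results if truth != guess)
    for (truth, guess), count in confusions.most_common():
        print(f"  {truth} was mistaken for {guess} {count} time(s)")
    ```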

    Have students document their testing results using the Testing and Evaluation Checklist. Encourage them to identify specific areas for improvement.

    Iteration Guidance:

    • If accuracy is low (below 70%), collect more diverse training data
    • If certain categories are confused, add more contrasting examples
    • If it works in one setting but not another, add examples from that setting
    • Remember: iteration is a normal part of AI development!

  6. Refinement and Integration (15-20 minutes)

    Support students as they refine their models and integrate them into complete applications with user interfaces and functionality.

    Refinement Activities:

    • Add more training data to categories with lower accuracy
    • Remove or replace poor quality training samples
    • Re-train the model with improved data
    • Test again and compare results to previous version

    Integration Tasks (Platform-Specific):

    Teachable Machine:

    • Add labels or visual indicators for each classification
    • Implement confidence thresholds (only show a result if confidence is high; a short sketch follows this list)
    • Add user instructions or help text
    • Consider exporting to use in a webpage or app
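
    The confidence-threshold idea above fits in a few lines of code. The sketch below assumes the application receives a set of class probabilities, which is how these platforms typically report results; the 0.8 cutoff is only an example for students to tune.

    ```python
    # Sketch: only report a classification when the model is confident enough.
    # `probabilities` maps each class name to the model's confidence (0 to 1).
    CONFIDENCE_THRESHOLD = 0.8   # example value; students should experiment

    def describe_prediction(probabilities: dict) -> str:
        best_class = max(probabilities, key=probabilities.get)
        confidence = probabilities[best_class]
        if confidence >= CONFIDENCE_THRESHOLD:
            return f"Looks like {best_class} ({confidence:.0%} sure)"
        return "Not sure - try a clearer angle or better lighting"

    print(describe_prediction({"rock": 0.91, "paper": 0.06, "scissors": 0.03}))
    print(describe_prediction({"rock": 0.45, "paper": 0.40, "scissors": 0.15}))
    ```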

    MIT App Inventor:

    • Design an attractive user interface with buttons, labels, and images
    • Add sound effects or text-to-speech for feedback
    • Implement logic to handle different classification results
    • Add features like a reset button, history, or score tracking

    Scratch:

    • Create sprites that respond to AI classifications
    • Add animations, sounds, and visual effects
    • Build a game or interactive story around the AI functionality
    • Include instructions and user feedback

    Polish and User Experience:

    • Add clear instructions for users
    • Implement error handling for unusual inputs
    • Make sure the interface is intuitive and attractive
    • Test with a peer to get feedback on usability

    Encourage peer testing at this stage - have students test each other's AI applications and provide constructive feedback.

  7. Project Showcase and Reflection (15-20 minutes)

    Conduct a gallery walk or formal presentations where students showcase their AI projects to classmates. This provides an opportunity for students to practice explaining their work and learn from others.

    Showcase Format Options:

    • Gallery Walk: Students set up stations with their AI running; classmates rotate through to try each project
    • Lightning Talks: Each student gives a 2-3 minute presentation and demonstration
    • Small Group Presentations: Divide class into groups of 4-6 for more intimate sharing
    • Digital Showcase: Create a class website or shared document with project descriptions and links

    Presentation Components (using Project Presentation Template):

    • Project Title and Purpose: What does your AI do and why is it useful?
    • How It Works: What type of AI is it? What categories does it recognize?
    • Live Demonstration: Show your AI in action with live input
    • Development Process: What challenges did you face? How did you overcome them?
    • Results and Accuracy: How well does your AI perform? What's its accuracy rate?
    • Future Improvements: What would you add or change if you had more time?

    Peer Feedback Activity:

    • Provide feedback forms where students note one strength and one suggestion for each project they view
    • Encourage questions: "How did you collect your training data?" "What was the hardest part?"
    • Have students identify their favorite project and explain why

    Class Reflection Discussion:

    • What did you learn about how AI is created?
    • What surprised you about the development process?
    • How is creating AI different from using AI?
    • What made some AI projects more successful than others?
    • How could you improve your AI project with more time or resources?
    • What real-world applications could benefit from AI similar to what you created?

    Conclude by celebrating student achievements and emphasizing that they are now AI creators, not just users. Encourage them to continue developing their projects or start new ones outside of class.

Assessment Strategies

Formative Assessment

  • Observation of project planning process and quality of planning worksheet completion
  • Monitoring during training data collection: quantity, variety, and quality of samples
  • Evaluation of testing procedures and use of Testing and Evaluation Checklist
  • Observation of iteration process: ability to identify problems and implement solutions
  • Peer feedback during testing phase and incorporation of suggestions
  • Checkpoint conferences with instructor at each project phase

Summative Assessment

  • Completed AI project with functional model that performs as intended (40%)
  • Project presentation explaining design decisions, development process, and results (25%)
  • Written project report including planning, testing documentation, and reflection (20%)
  • Peer evaluation feedback on usability and creativity (15%)

Success Criteria

Students demonstrate mastery when they:

  • Create a functional AI model that correctly classifies inputs with at least 70% accuracy
  • Collect diverse, balanced training data with minimum 50 samples per category
  • Complete at least one iteration cycle of testing, evaluation, and improvement
  • Clearly explain their AI's purpose, functionality, and development process
  • Demonstrate understanding of how training data affects AI performance
  • Identify limitations of their AI and propose meaningful improvements

Differentiation Strategies

For Advanced Learners:

  • Challenge them to create multi-class models with 5+ categories instead of 3-4
  • Encourage export and integration of models into webpages using JavaScript
  • Have them explore transfer learning or more advanced ML concepts
  • Ask them to research and implement confidence thresholds and error handling
  • Suggest creating a series of connected AI models for more complex applications
  • Encourage experimentation with different AI types (combining image and sound recognition)

For Struggling Learners:

  • Provide pre-selected project ideas with detailed step-by-step guides
  • Start with 2-3 categories instead of 4-5 for simpler classification tasks
  • Offer template projects they can modify rather than starting from scratch
  • Pair with a peer buddy for collaborative development and troubleshooting
  • Focus on Teachable Machine, which has the most streamlined interface
  • Provide additional one-on-one support during data collection and training phases

For English Language Learners:

  • Provide platform tutorials in their native language if available
  • Use visual step-by-step guides with screenshots and minimal text
  • Allow presentation of projects with visual demonstrations rather than extensive verbal explanation
  • Pre-teach technical vocabulary: model, training, classification, accuracy, confidence
  • Provide sentence frames for project presentations and reflections
  • Pair with bilingual students when possible for peer support

For Students with Special Needs:

  • Ensure all platforms and interfaces meet accessibility standards (screen reader compatible)
  • Provide alternative input methods for data collection if needed
  • Allow extended time for project completion and multiple work sessions
  • Break the project into smaller checkpoints with clear success criteria for each
  • Offer alternative presentation formats (video recording, written report, poster)
  • Provide adaptive equipment for data collection if physical limitations exist

Extension Activities

Real-World Application Development:

Challenge students to identify a genuine problem in their school, community, or home that could be solved with AI. Have them develop a more sophisticated version of their project specifically designed to address this need. Students should interview potential users, gather requirements, and iterate based on real feedback. This connects classroom learning to practical problem-solving.

Cross-Curricular Connections:

  • Science: Create AI to classify plants, animals, rocks, or other scientific specimens; develop AI for experimental data analysis or pattern recognition
  • Mathematics: Explore the statistical foundations of AI, analyze accuracy metrics, graph training data distributions, and calculate confidence intervals (a worked example follows this list)
  • Art: Develop AI that recognizes art styles, creates generative art, or classifies artistic techniques; explore AI's role in creative processes
  • Language Arts: Write technical documentation for AI projects, create user guides, develop marketing materials, or write science fiction stories about AI development
  • Social Studies: Research how AI is used in different cultures, explore ethical considerations of AI development, or create AI for language translation
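
One concrete way into the "calculate confidence intervals" connection: for an accuracy p measured on n test samples, a 95% confidence interval under the normal approximation is roughly p ± 1.96 × sqrt(p × (1 − p) / n). The sketch below works through a made-up example.

```python
# Sketch: 95% confidence interval for a measured accuracy (normal approximation).
# The numbers are made up: 42 correct predictions out of 50 test samples.
import math

correct, total = 42, 50
p_hat = correct / total                                  # observed accuracy (0.84)
margin = 1.96 * math.sqrt(p_hat * (1 - p_hat) / total)   # half-width of the interval
print(f"Accuracy: {p_hat:.0%} +/- {margin:.0%} "
      f"(roughly {p_hat - margin:.0%} to {p_hat + margin:.0%})")
```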

Advanced AI Development Series:

Create a multi-week extended project where students progressively improve their AI applications. Each week introduces new concepts: Week 1 - Basic classification; Week 2 - Transfer learning; Week 3 - Multi-modal AI (combining image and sound); Week 4 - Cloud deployment; Week 5 - User testing and iteration. This scaffolded approach builds expertise over time and allows for increasingly sophisticated projects.

AI Ethics and Impact Analysis:

Have students analyze their own AI projects through an ethical lens. Questions to explore: What biases might exist in your training data? How could your AI be misused? Who benefits from this technology and who might be harmed? What privacy considerations exist? Students create an "AI Impact Statement" documenting potential benefits, risks, and mitigation strategies for their projects.

Class AI Exhibition or Competition:

Organize a school-wide AI Fair where students showcase their projects to other classes, parents, and community members. Create categories for awards: Most Creative, Most Practical, Best Technical Implementation, Best User Interface, Most Improved. Invite local tech professionals or university AI researchers to judge and provide feedback. Document the event with photos and videos for a digital portfolio.

Teacher Notes and Tips

Common Misconceptions to Address:

  • Misconception: "More training data always means better AI."
    Clarification: Quality and diversity matter more than quantity. 100 nearly identical images are less valuable than 50 varied ones. Emphasize the importance of balanced, representative datasets.
  • Misconception: "AI should be 100% accurate to be useful."
    Clarification: Even professional AI systems have error rates. Discuss acceptable accuracy levels for different applications and how humans also make mistakes. Focus on iterative improvement rather than perfection.
  • Misconception: "The AI understands what it's classifying."
    Clarification: AI recognizes patterns but doesn't truly "understand." A dog-recognizing AI doesn't know what a dog is - it just matches visual patterns. Use analogies to help students grasp this distinction.
  • Misconception: "Training an AI is like programming - you tell it exactly what to do."
    Clarification: AI learns from examples rather than following explicit instructions. This is a fundamentally different paradigm from traditional programming. Demonstrate the difference with concrete examples.

Preparation Tips:

  • Test all platforms beforehand and create sample projects to anticipate student challenges
  • Prepare backup project ideas for students who struggle with ideation or whose first idea proves too complex
  • Set up student accounts in advance to avoid wasting class time on account creation
  • Have a backup plan for technical difficulties (offline activity, paper prototyping, or platform alternatives)
  • Create a troubleshooting guide with solutions to common technical issues specific to each platform
  • Gather physical objects students can use for training data if internet-based image searches are challenging
  • Prepare examples of excellent, good, and needs-improvement projects to calibrate student expectations

Classroom Management:

  • Establish clear expectations for device use and on-task behavior during independent work time
  • Use visible timers for each project phase to keep students on track
  • Implement a "three before me" policy: students must ask three peers before asking the teacher
  • Create designated "expert" students for each platform who can provide peer support
  • Schedule regular check-in points where all students must show progress before continuing
  • Have a system for managing noise levels during collaborative work and testing

Troubleshooting:

  • Problem: Model accuracy is very low (below 50%)
    Solution: Check if categories are too similar, if lighting varies too much, or if backgrounds are interfering. Often solved by adding more diverse training data or simplifying categories.
  • Problem: Platform is running slowly or freezing
    Solution: Clear browser cache, close unnecessary tabs, reduce number of training samples temporarily, or switch to a different device. Some platforms work better on certain browsers.
  • Problem: Students can't think of project ideas
    Solution: Provide a menu of 10-15 pre-approved project ideas with varying difficulty levels. Show more example projects. Have students interview classmates about problems that need solving.
  • Problem: Webcam not working or access denied
    Solution: Check browser permissions, try different browser, ensure webcam isn't being used by another application. Have students use mobile devices as alternative.
  • Problem: Projects are too ambitious for time available
    Solution: Help students identify "MVP" (Minimum Viable Product) - the simplest version that still works. They can always expand later. Encourage 3 categories instead of 6.
  • Problem: Students finish at very different rates
    Solution: Have extension challenges ready for early finishers: "Can you improve accuracy to 90%?" "Can you add a new feature?" "Can you help troubleshoot a classmate's project?"

Time Management Recommendations:

  • This lesson works best split across 2-3 class periods rather than rushed in one extended session
  • Consider: Day 1 (Planning and Data Collection), Day 2 (Training and Testing), Day 3 (Refinement and Showcase)
  • If time is limited, focus on Teachable Machine only, as it is the quickest for students to learn
  • Build in buffer time - some students will need it for iteration and troubleshooting

Assessment Tips:

  • Focus assessment on the process as much as the final product - learning happens during iteration
  • Take photos/videos of projects since some platforms don't save permanently without export
  • Have students submit their Testing and Evaluation Checklist showing their iteration process
  • Consider giving participation credit for peer testing and constructive feedback provided