Grades 6-8 | Digital Citizenship | 60 Minutes

Lesson 9: Your Data and AI: Privacy in the Digital Age

In this critical digital literacy lesson, students explore how artificial intelligence systems collect, analyze, and use their personal data. Through a hands-on privacy audit, real-world scenarios, and examination of legal frameworks, students learn to protect their privacy, understand their digital rights, and make informed decisions about their online presence in an AI-driven world.

Learning Objectives

  • Analyze how AI systems collect, process, and utilize personal data from various digital platforms and services, identifying specific data collection methods and their purposes
  • Evaluate the privacy implications of sharing different types of personal information online, distinguishing between necessary and unnecessary data disclosure
  • Conduct a personal privacy audit of their own digital footprint across multiple platforms, documenting findings and identifying areas of concern
  • Apply practical privacy protection strategies to secure personal information, including privacy settings configuration, permission management, and safe online practices
  • Understand key privacy laws and regulations (COPPA, FERPA, GDPR basics) that protect personal data, explaining their relevance to students' daily digital interactions

Standards Alignment

  • ISTE 2.b (Digital Citizen): Students engage in positive, safe, legal and ethical behavior when using technology, including social interactions online or when using networked devices
  • ISTE 2.d (Digital Citizen): Students manage their personal data to maintain digital privacy and security and are aware of data-collection technology used to track their navigation online
  • CSTA 3A-IC-24: Evaluate the ways computing impacts personal, ethical, social, economic, and cultural practices
  • CSTA 3A-IC-30: Evaluate the social and economic implications of privacy in the context of safety, law, or ethics
  • CCSS.ELA-LITERACY.RI.7.7: Compare and contrast a text to an audio, video, or multimedia version of the text, analyzing each medium's portrayal of the subject
  • CCSS.ELA-LITERACY.SL.7.2: Analyze the main ideas and supporting details presented in diverse media and formats and explain how the ideas clarify a topic, text, or issue under study

Materials Needed

  • Computer or tablet with internet access for each student or pair of students (minimum 1:2 device ratio)
  • Digital projector or interactive whiteboard for whole-class demonstrations and discussions
  • Personal Privacy Audit Worksheet (1 per student, included in downloadable materials)
  • Real-World Privacy Scenarios Cards (1 set per small group of 3-4 students, included in downloadable materials)
  • Privacy Protection Action Plan Template (1 per student, included in downloadable materials)
  • Student-accessible accounts for common platforms (social media, email, gaming, educational apps) for privacy settings exploration - ensure school policy compliance
  • Privacy Laws Reference Guide handout (1 per student, included in downloadable materials)
  • Data Collection Visualization Chart (poster or digital display, included in downloadable materials)
  • Optional: Guest speaker or video interview with cybersecurity or privacy professional

Lesson Procedure

  1. Hook and Introduction: The Data We Leave Behind (8 minutes)

    Begin with an engaging thought experiment: "Imagine if everything you did online today was printed out and posted on the classroom wall for everyone to see. What would people learn about you?"

    Interactive Opening Activity:

    • Display a sample "data trail" visualization showing one hypothetical student's digital activities from morning to afternoon (searches, app usage, location data, social media interactions)
    • Ask students to estimate how much data they think they generate daily, then share a widely cited industry estimate: roughly 1.7 megabytes per person every second
    • Show brief video clip (2-3 minutes) demonstrating how AI uses this data for personalization, advertising, and prediction
    • Pose the essential question: "Who has access to your data, and what are they doing with it?"

    Transition to learning objectives by explaining that today's lesson will empower students to understand, audit, and protect their personal information in an AI-powered digital world.

  2. Direct Instruction: How AI Collects and Uses Your Data (12 minutes)

    Present comprehensive overview of AI data collection methods using the Data Collection Visualization Chart, covering multiple collection points and purposes.

    Key Concepts to Cover:

    • Explicit Data Collection: Information users actively provide (account registration, profile creation, form submissions, search queries)
    • Implicit Data Collection: Information gathered through usage patterns (browsing history, click-through rates, time spent on pages, scroll depth, mouse movements)
    • Passive Data Collection: Background information captured without active interaction (location tracking via GPS, device information, IP addresses, cookies and tracking pixels)
    • Inferred Data: Information AI systems deduce about users based on patterns (personality traits, purchasing likelihood, political leanings, health conditions, social connections)

    How AI Uses This Data:

    • Personalization and recommendations (streaming services, online shopping, content feeds)
    • Targeted advertising and marketing campaigns
    • Predictive analytics and user behavior forecasting
    • Product development and improvement
    • Data aggregation and selling to third parties
    • Security and fraud detection (positive uses)

    Show real examples from familiar platforms: how Netflix recommendations work, why YouTube suggests certain videos, how TikTok's "For You" page is personalized, and why ads seem to "follow" users across websites.
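For teachers who want a concrete artifact to project during this segment, a deliberately simplified "recommender" can make personalization tangible. The sketch below is a hypothetical classroom demo, not how any real platform works: it tallies watch time per topic and suggests the topic with the highest total. All names (`watch_history`, `recommend`) are invented for illustration.

```python
# Hypothetical classroom demo: a toy "recommendation engine".
# Real platforms use far more data and far more complex models;
# this only illustrates the basic idea of learning from watch time.

watch_history = [
    ("skate videos", 12),        # (topic, minutes watched)
    ("science experiments", 3),
    ("skate videos", 8),
    ("cooking", 5),
    ("skate videos", 4),
]

def recommend(history):
    """Tally minutes per topic and return the most-watched topic."""
    totals = {}
    for topic, minutes in history:
        totals[topic] = totals.get(topic, 0) + minutes
    return max(totals, key=totals.get)

print(recommend(watch_history))  # the toy "For You" suggestion
```

Students can change the minutes and watch the suggestion flip, which leads naturally into the filter-bubble question in the discussion that follows.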

    Critical Discussion Questions:

    • What are the benefits of personalization? (convenience, discovering new content, time-saving)
    • What are the potential risks? (filter bubbles, privacy invasion, manipulation, discrimination)
    • Where should the line be drawn between helpful and invasive?

  3. Guided Activity: Personal Privacy Audit (15 minutes)

    Students conduct a hands-on audit of their own digital footprint using the Personal Privacy Audit Worksheet. This activity helps students discover what information they've shared and where it's accessible.

    Audit Components (students work through each section):

    • Platform Inventory: List all apps, websites, and services used in the past week (social media, gaming, educational platforms, messaging apps)
    • Account Analysis: For 3-5 main platforms, document what personal information was provided during registration (name, age, location, email, phone number, interests, photos)
    • Permissions Check: Review what permissions each app has requested (camera, microphone, contacts, location, photos, notifications)
    • Privacy Settings Exploration: Examine current privacy settings on at least two major platforms, noting what options are available and what's currently enabled
    • Search History Review: Look at browser history or search history (if comfortable) to see what data trail exists
    • Third-Party Connections: Identify which services are connected to each other (apps that allow "sign in with Google/Facebook," data sharing between platforms)

    Teacher Facilitation:

    • Model the audit process with a sample student account (using dummy data) projected for the class
    • Circulate to provide support and answer questions
    • Remind students they're not required to share sensitive findings with others - this is for their own awareness
    • Encourage students to use privacy mode or incognito windows if checking personal accounts at school
    • For students without personal accounts, provide sample scenarios or have them audit family members' accounts (with permission)

    Reflection Questions on Worksheet:

    • Were you surprised by anything you discovered?
    • Which apps have access to more information than you realized?
    • Are there any permissions you gave that you're uncomfortable with now?
    • What information would you like to keep more private moving forward?

  4. Small Group Work: Real-World Privacy Scenarios (12 minutes)

    Divide class into small groups of 3-4 students. Each group receives a set of Real-World Privacy Scenarios Cards featuring authentic situations students might encounter. Groups analyze scenarios, identify privacy risks, and determine appropriate responses.

    Sample Scenarios Included in Cards:

    • Scenario 1 - Free App Offer: A popular photo editing app is free but requests access to your entire photo library, contacts, and location even though it doesn't seem to need that information. What should you do?
    • Scenario 2 - Social Media Quiz: A fun personality quiz on social media asks you to answer questions and share results. It says it needs to access your friend list and profile information. What are the risks?
    • Scenario 3 - Smart Speaker: Your family gets a smart speaker that's always listening for voice commands. Where should it be placed in your home, and what privacy concerns should your family discuss?
    • Scenario 4 - Gaming Platform: While playing an online game, someone asks for your personal information to "send you free game credits." How should you respond?
    • Scenario 5 - School Technology: Your school uses AI-powered learning software that tracks everything you do, how long you spend on each problem, and predicts your test scores. What privacy considerations exist?
    • Scenario 6 - Social Media Pressure: Your friends encourage you to share your location continuously through a social media app so everyone knows where you are. What factors should you consider?

    Group Discussion Framework (on each card):

    • What data is being collected in this scenario?
    • Who has access to this data, and how might they use it?
    • What are the immediate and long-term privacy risks?
    • What questions should you ask before making a decision?
    • What would be the safest course of action?
    • Are there alternative solutions that balance convenience and privacy?

    Groups prepare brief presentations (1-2 minutes each) sharing their scenario and recommended actions with the class.

  5. Legal Framework Overview: Your Digital Rights (8 minutes)

    Present age-appropriate overview of key privacy laws and regulations that protect students' personal data, using the Privacy Laws Reference Guide handout as a visual aid.

    Key Laws and Protections:

    • COPPA (Children's Online Privacy Protection Act): Requires parental consent for collecting data from children under 13. Explain why many social media platforms have age minimums of 13. Discuss what protections exist and why age verification is important.
    • FERPA (Family Educational Rights and Privacy Act): Protects student educational records and gives parents/students rights to access and control their school data. Relevant to classroom technology and learning management systems.
    • GDPR Basics (General Data Protection Regulation - Europe): Introduce the concept of "right to be forgotten" and data portability. Explain how this European law has influenced privacy practices globally, including in apps students use.
    • State Privacy Laws: Brief mention that many U.S. states are creating their own privacy laws (California Consumer Privacy Act as example). Students have rights even as minors.

    Key Rights Students Have:

    • Right to know what data is collected about them
    • Right to access their personal data
    • Right to request deletion of their data
    • Right to opt-out of data sharing or sales
    • Right to not be discriminated against for exercising privacy rights

    How to Exercise These Rights: Demonstrate where privacy policies are typically located, how to read them (focus on key sections), and how to submit data access or deletion requests. Show examples of privacy policy pages from familiar platforms.

    Transparency and Data Literacy: Discuss how companies must be transparent about data practices but often make policies hard to understand. Emphasize the importance of reading agreements before clicking "I accept."

  6. Action Planning and Closure: Take Control of Your Privacy (5 minutes)

    Students create a Personal Privacy Protection Action Plan using the template worksheet, committing to specific steps they will take to protect their privacy going forward.

    Action Plan Components:

    • Immediate Actions (This Week): List 3 specific steps to take immediately (review privacy settings on two platforms, delete unused apps, turn off unnecessary location services)
    • Ongoing Practices (This Month): Identify habits to develop (reading privacy policies before signing up, checking app permissions monthly, using strong passwords)
    • Family Conversation: One privacy topic to discuss with parents/guardians (smart home devices, family social media guidelines, location sharing policies)
    • Peer Education: One privacy tip to share with a friend who might not be aware of these issues

    Class Commitment: Create a collective class charter of privacy principles - students suggest one privacy practice for the class to commit to (e.g., "We will think before we post," "We will respect each other's digital boundaries," "We will question before we share personal information")

    Closing Discussion:

    • What was the most surprising thing you learned today?
    • How has this lesson changed your thinking about your digital presence?
    • What one action will you commit to taking this week?

    Remind students that privacy is an ongoing practice, not a one-time fix. Encourage them to regularly review their digital footprint and stay informed about new technologies and privacy concerns as AI continues to evolve.

    Preview of Extension Activities: Briefly mention the ongoing privacy challenge and opportunities for deeper exploration of digital rights activism and privacy advocacy.

Assessment Strategies

Formative Assessment

  • Privacy Audit Completion: Review students' Personal Privacy Audit Worksheets for thoroughness, depth of reflection, and identification of privacy concerns (check for completion of all sections, not content of personal accounts)
  • Scenario Analysis Participation: Observe small group discussions during Real-World Privacy Scenarios activity, noting students' ability to identify risks, analyze implications, and propose solutions
  • Legal Framework Comprehension Checks: Use thumbs up/down or think-pair-share during legal rights overview to assess understanding of key concepts like COPPA, consent, and data rights
  • Exit Ticket Questions: Students respond to quick prompts: "Name two ways AI collects your data" and "Identify one privacy setting you will change this week and why"
  • Class Discussion Quality: Monitor students' questions and contributions during whole-class discussions for depth of critical thinking about privacy implications

Summative Assessment

  • Privacy Protection Action Plan: Evaluate completed action plans for specificity, feasibility, and demonstration of understanding privacy protection strategies (rubric provided in downloadable materials)
  • Privacy Scenario Written Response: Students receive a new complex privacy scenario and write a 1-2 paragraph analysis identifying data collection methods, privacy risks, legal considerations, and recommended actions
  • Digital Citizenship Reflection Essay: Short essay (300-500 words) addressing: "How has your understanding of data privacy changed? What responsibilities do you have as a digital citizen in an AI-powered world?"
  • Privacy Settings Documentation Project: Students create a before-and-after documentation of privacy settings changes on one platform, explaining their decisions and the expected impact on data protection

Success Criteria

Students demonstrate mastery when they:

  • Accurately identify at least four different methods AI systems use to collect personal data and explain the purpose of each collection method
  • Complete a comprehensive privacy audit documenting their digital footprint across multiple platforms and identifying specific privacy concerns
  • Analyze real-world privacy scenarios by identifying data risks, considering multiple perspectives, and proposing appropriate protective actions based on learned principles
  • Explain the relevance of at least two privacy laws (such as COPPA and FERPA) to their own digital activities and can describe specific rights these laws provide
  • Create and begin implementing a personalized privacy protection action plan with at least three specific, actionable steps demonstrating practical application of lesson concepts
  • Articulate the balance between convenience and privacy in digital services, demonstrating critical thinking about when data sharing is acceptable and when it poses unacceptable risks

Differentiation Strategies

For Advanced Learners:

  • Research and present on advanced privacy topics such as differential privacy, homomorphic encryption, or blockchain-based identity management
  • Analyze actual privacy policies from major tech companies, comparing their practices and identifying gaps between stated policies and actual data collection
  • Investigate the technical mechanisms behind data tracking (cookies, fingerprinting, tracking pixels) and create educational materials explaining these to peers
  • Explore the relationship between AI bias and data privacy, examining how data collection practices can perpetuate or amplify discrimination
  • Conduct an extended privacy audit including data broker searches, examining what information third-party companies have collected about them

For Struggling Learners:

  • Provide a simplified, visual version of the Privacy Audit Worksheet with sentence starters and checkboxes rather than open-ended questions
  • Offer pre-selected scenario cards with fewer complex variables, focusing on one or two clear privacy issues rather than multiple interconnected concerns
  • Create a graphic organizer or visual flowchart to help students work through the decision-making process for privacy scenarios
  • Partner struggling students with peer mentors during audit activity to provide support and model the thinking process
  • Break the legal framework section into smaller chunks with comprehension checks after each law, using concrete examples students can relate to immediately
  • Provide video tutorials demonstrating step-by-step how to change privacy settings on common platforms

For English Language Learners:

  • Pre-teach key vocabulary with visual supports: data collection, privacy, permission, consent, encryption, anonymity, tracking, footprint, audit, disclosure
  • Provide Privacy Laws Reference Guide in students' home languages where possible, or create simplified English versions with picture supports
  • Use visual diagrams and flowcharts extensively to illustrate data collection processes and privacy concepts, minimizing reliance on text
  • Allow ELL students to complete written components in their home language or use translation tools, focusing assessment on conceptual understanding rather than English proficiency
  • Create small groups intentionally mixing language proficiency levels, encouraging peer explanation and collaborative sense-making
  • Provide sentence frames for scenario discussions: "This data is risky because___," "I would protect my privacy by___," "The company should___"

For Students with Special Needs:

  • Provide digital versions of all worksheets that are compatible with screen readers and text-to-speech software for students with visual impairments
  • Offer audio recordings of privacy scenarios for students with reading difficulties, allowing them to listen and respond verbally or through dictation
  • Allow extended time for completion of privacy audit and action plan, or break these activities across multiple sessions with clear stopping points
  • Provide a privacy settings checklist with explicit step-by-step instructions and screenshots for students who benefit from procedural support
  • Offer alternative formats for final assessment (video presentation, oral explanation, visual poster) instead of written essay for students with writing challenges
  • Create a quiet workspace option for students with sensory sensitivities during group work, allowing them to participate via shared digital document
  • Use color-coding and icons consistently throughout materials to support students with cognitive processing differences

Extension Activities

30-Day Privacy Challenge:

Students commit to a month-long privacy improvement journey, documenting their progress weekly. Each week focuses on a different aspect: Week 1 - Audit and Clean Up (delete unused accounts, review settings); Week 2 - Strengthen Security (update passwords, enable two-factor authentication); Week 3 - Mindful Sharing (pause before posting, consider long-term implications); Week 4 - Advocacy (teach family member or friend about privacy practices). Students maintain a digital journal or blog (with appropriate privacy settings!) documenting their experiences, challenges, and insights. Culminate with presentations sharing the impact of their privacy transformation.

Privacy Policy Translation Project:

Students select a privacy policy from a commonly used app or website (TikTok, Instagram, Roblox, etc.) and create a "translation" that middle schoolers can actually understand. Using plain language, visual aids, infographics, and real-world examples, they break down the complex legal text into digestible information answering key questions: What data does this company collect? How do they use it? Who do they share it with? What choices do users have? Students can create videos, one-page fact sheets, or interactive presentations to share with peers and younger students.

AI Privacy Debate Series:

Organize structured debates on controversial privacy issues related to AI: "Should schools use AI surveillance cameras to improve safety?" "Is personalized advertising helpful or manipulative?" "Do parents have the right to track their teenagers' locations at all times?" "Should AI be allowed to make decisions about people based on their data?" Students research multiple perspectives, develop evidence-based arguments, and practice respectful dialogue about complex ethical issues. Invite community members, parents, or school administrators to observe or participate in debates.

Cross-Curricular Connections:

  • Social Studies: Examine privacy rights throughout history and across cultures. How have privacy expectations changed from the pre-digital era to today? Compare privacy laws and cultural attitudes toward data protection in different countries.
  • Language Arts: Analyze persuasive techniques used in privacy policies and terms of service. Study how companies use language to make data collection seem benign or necessary. Create persuasive writing advocating for stronger privacy protections.
  • Mathematics: Calculate and visualize data generation rates, storage requirements, and the economic value of personal data. Create graphs showing the growth of data collection over time or comparing privacy practices across platforms.
  • Science: Investigate the biological and psychological effects of surveillance and privacy loss on human behavior, stress levels, and decision-making. Explore how constant data collection might affect adolescent brain development and social interactions.
  • Art: Create visual representations of abstract privacy concepts - what does a "digital footprint" look like? Design posters, digital art, or multimedia projects that communicate privacy messages to peers. Explore surveillance art and artists who address privacy themes.
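For the Mathematics connection above, a short worked calculation can anchor the discussion. This sketch uses the widely cited industry estimate of roughly 1.7 MB of data generated per person per second; treat that figure as an approximation for discussion, not a measured fact about any individual student.

```python
# Back-of-envelope estimate of daily data generation per person.
# 1.7 MB/s is a widely cited industry estimate, used here as an assumption.
MB_PER_SECOND = 1.7
SECONDS_PER_DAY = 24 * 60 * 60

mb_per_day = MB_PER_SECOND * SECONDS_PER_DAY
gb_per_day = mb_per_day / 1000  # decimal (SI) gigabytes

print(f"{mb_per_day:,.0f} MB/day, or about {gb_per_day:,.1f} GB/day")
```

Students can extend the calculation to a year, or compare the result to the storage capacity of a typical phone, to visualize just how much data the estimate implies.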

Digital Rights Youth Advocacy Project:

Students form a Digital Rights Committee to advocate for privacy protections in their school and community. Possible projects include: creating privacy awareness campaigns for younger students; developing recommendations for school technology policies; writing letters to technology companies requesting more youth-friendly privacy controls; creating a student privacy rights handbook; organizing a Digital Privacy Awareness Week with activities, speakers, and information sessions. This long-term project develops civic engagement skills while applying privacy knowledge to create real-world change.

Privacy-Preserving Technology Exploration:

Introduce students to tools and technologies designed to protect privacy: browser extensions that block trackers, encrypted messaging apps, virtual private networks (VPNs), search engines that don't track users, password managers. Students research these tools, evaluate their effectiveness, create tutorials for peers, and make informed recommendations about which tools are appropriate and useful for middle school students. This activity combines technical exploration with critical evaluation skills.

Teacher Notes and Tips

Common Misconceptions to Address:

  • Misconception: "I have nothing to hide, so privacy doesn't matter to me."
    Clarification: Privacy is about control and autonomy, not secrecy. Explain that privacy allows us to develop our identity, make mistakes, and change without permanent public record. Even innocent data can be misused or taken out of context. Use analogies: you close bathroom doors not because you're doing something wrong, but because some things should be private. Privacy protects freedom and dignity.
  • Misconception: "If I delete something online, it's gone forever."
    Clarification: Digital information can persist indefinitely through backups, caches, archives, screenshots, and data sharing. Deletion removes visibility but may not remove all copies. Emphasize the importance of thinking carefully before posting rather than relying on deletion to undo mistakes. Discuss the "Internet Archive" and how information can be captured and preserved.
  • Misconception: "Privacy settings completely protect my information."
    Clarification: Privacy settings control who can see your content directly, but platforms themselves still collect extensive data. Companies can change privacy policies, experience data breaches, or use information in ways not immediately visible to users. Privacy settings are important but aren't a complete solution - we also need strong laws, ethical company practices, and personal vigilance.
  • Misconception: "Free apps and services don't cost anything."
    Clarification: Introduce the concept "If you're not paying for the product, you are the product." Free services monetize through data collection and advertising. This isn't inherently wrong, but users should understand the exchange: access to the service in return for their data and attention. Help students make informed decisions about whether this trade-off is worthwhile for each service.
  • Misconception: "Technology companies are trying to spy on me personally."
    Clarification: Most data collection is automated and algorithmic, not individual surveillance. Companies are interested in patterns across millions of users, not specific individuals (unless you're a high-profile person). However, this data can still be used in ways that affect individuals through profiling, discrimination, or manipulation. Encourage a realistic understanding of data practices rather than paranoia.

Preparation Tips:

  • Review your school's acceptable use policy and technology guidelines before this lesson to ensure all activities comply with school requirements and you can speak knowledgeably about school-specific privacy practices
  • Test all platforms and privacy settings in advance on the devices students will use - privacy settings can vary significantly between mobile apps and web versions, and between devices
  • Create a sample student account with dummy data to use for demonstrations - never use real student accounts or your personal accounts in front of the class
  • Prepare to address questions about specific situations students may be experiencing - have resources ready for serious concerns like stalking, harassment, or identity theft
  • Consider coordinating with your school counselor or technology coordinator to be available for students who discover concerning privacy issues during their audit
  • Preview all video content thoroughly for age-appropriateness and accuracy - privacy topics can sometimes include fear-mongering or outdated information
  • Set up a system for students to ask sensitive questions privately (anonymous question box, after-class consultation time) rather than requiring all questions to be public

Classroom Management:

  • Establish clear norms about respecting privacy during the audit activity - students should not look at peers' screens or share what they observe about others' accounts without permission
  • Be prepared for students to discover concerning information during their audit (excessive location tracking, unknown followers, concerning messages) - have a plan for how to respond supportively and appropriately
  • Monitor group discussions during scenario activity to ensure conversations stay productive and respectful - privacy topics can become heated when students have different values or experiences
  • Create a "parking lot" for questions that are off-topic or too detailed for class time, committing to address them later via email, office hours, or follow-up lesson
  • Have alternative activities ready for students who become overwhelmed or anxious about privacy topics - this lesson may cause worry for some students, and they may need a break or reassurance

Sensitivity and Safety Considerations:

  • Be aware that some students may have safety reasons for limiting their digital presence (domestic violence situations, custody disputes, witness protection) - don't require students to explain why they don't use certain platforms or share certain information
  • Recognize that privacy expectations and values vary across cultures and families - present information about privacy protection without imposing a single perspective on what level of privacy is "correct"
  • This lesson may reveal concerning situations requiring mandated reporting (grooming, exploitation, threats) - be prepared to follow your school's reporting procedures if needed
  • Some students may face parental surveillance that makes them uncomfortable - navigate this carefully, acknowledging that family rules vary while discussing general principles of trust and autonomy
  • Students from communities that have experienced discrimination or surveillance by authorities may have heightened concerns about data collection - validate these concerns and discuss how privacy protections can help address systemic inequities

Troubleshooting:

  • Problem: Students can't access their accounts at school due to network restrictions.
    Solution: Provide fictional scenario accounts, use screenshots and walkthroughs instead of live demos, or have students complete the audit portion as homework with parent permission.
  • Problem: Students claim they don't use any technology or have any online accounts.
    Solution: Expand the discussion to include family members' devices that students might use, gaming systems, smart TVs, or school technology platforms. Alternatively, provide hypothetical student profiles for analysis.
  • Problem: The lesson causes excessive anxiety or fear about technology use.
    Solution: Balance risk discussion with empowerment and practical solutions. Emphasize that awareness and action are more helpful than worry. Highlight positive uses of AI and data that respect privacy. End with specific, actionable steps students can take to feel more in control.
  • Problem: Students dismiss privacy concerns as "not a big deal" or exhibit apathy.
    Solution: Share age-appropriate real-world consequences of privacy violations: college admissions officers checking social media, employers screening candidates, insurance companies adjusting rates based on data, identity theft affecting credit scores, manipulation of elections through targeted misinformation. Make it personal and relevant to their future.
  • Problem: Discussion becomes dominated by students sharing personal stories or specific privacy violations they've experienced.
    Solution: While validating these experiences, redirect to the broader learning objectives. Offer to discuss individual situations privately after class. Keep the focus on general principles and skills applicable to multiple situations rather than solving specific problems during class time.

Additional Resources:

  • Common Sense Media's Digital Citizenship Curriculum provides additional privacy lessons and resources
  • Electronic Frontier Foundation (EFF) offers student-friendly materials about digital rights
  • Federal Trade Commission (FTC) has educational resources about privacy laws and protections
  • Privacy Rights Clearinghouse provides comprehensive information about privacy issues
  • Consider inviting a guest speaker: cybersecurity professional, privacy lawyer, or digital rights advocate