
5.3 Team Teach Notes

My notes for big idea 5.3 team teach

CSP Big Idea 5.3

Computer Bias

Unfair preferences built into a computer system, resulting from how people design, test, or use computing systems

  • Computer Bias is developed through Data Issues, Design Choices, and Testing Gaps

Everyday Examples of Bias

    1. Netflix Recommendations
      • Suggesting the same genre repeatedly keeps the user inside that genre, preventing exposure to other genres they might enjoy
    2. Virtual Assistants with Female Voices
      • Subtly reinforces stereotypes that women are “helpers” or “assistants”
      • Can exclude people who prefer a different voice or feel more comfortable with other default options
    3. Social Media Age Gaps
      • Social media design can prioritize certain demographics over others, especially certain age groups
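The Netflix example above can be sketched as a feedback loop. This is a hypothetical minimal model (not Netflix's actual algorithm): a recommender that always suggests the user's most-watched genre never surfaces anything outside it, so the history only grows more one-sided.

```python
from collections import Counter

def recommend(watch_history, catalog):
    """Naive recommender: suggest only titles from the user's most-watched genre."""
    top_genre = Counter(watch_history).most_common(1)[0][0]
    return [title for title, genre in catalog if genre == top_genre]

# Hypothetical catalog of (title, genre) pairs
catalog = [("Show A", "drama"), ("Show B", "comedy"),
           ("Show C", "drama"), ("Show D", "documentary")]
history = ["drama", "drama", "comedy"]

# Simulate the user always watching the top suggestion:
# the history fills with dramas, so the suggestions never change.
for _ in range(3):
    picks = recommend(history, catalog)
    watched = picks[0]
    history.append(dict(catalog)[watched])

print(recommend(history, catalog))  # ['Show A', 'Show C'] — still only dramas
```

The bias here is not malicious; it falls out of a reasonable-sounding design choice ("show people more of what they like") that was never tested for its long-term effect.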

The HP Camera Incident

HP laptop camera couldn’t reliably track faces of people with darker skin tones

    1. It wasn’t intentional, but it happened because of limited tracking data
    2. It is harmful: it excludes certain demographics and signals that the technology wasn’t made for them
    3. It should be fixed, and similar problems prevented in future designs
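A simple way to catch this kind of data gap before shipping is to audit what the training set actually covers. The sketch below uses hypothetical skin-tone labels for a face-tracking dataset; the point is the check itself, not the specific labels.

```python
from collections import Counter

def coverage_report(samples, groups):
    """Count how many training samples fall into each expected group."""
    counts = Counter(samples)
    return {g: counts.get(g, 0) for g in groups}

# Hypothetical labels attached to a face-tracking training set
samples = ["light", "light", "light", "medium", "light", "medium"]
groups = ["light", "medium", "dark"]

report = coverage_report(samples, groups)
print(report)  # {'light': 4, 'medium': 2, 'dark': 0}

# Any group with zero (or very few) samples is a red flag before release
underrepresented = [g for g, n in report.items() if n == 0]
print(underrepresented)  # ['dark']
```

A report like this would have surfaced the HP camera's blind spot during development rather than after launch.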

Avoiding Bias in Tech

    1. Expand data: gather many different types of samples and build algorithms that recommend broader varieties
    2. Encourage diverse teams: people with more perspectives catch bias a homogeneous team would miss
    3. Constant testing: check how the system performs for different groups of users
    4. Document assumptions, and be transparent about how the algorithm makes decisions
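The "constant testing" step above can be made concrete by comparing a system's success rate across groups of users. This is a generic sketch with made-up test results, not a real fairness metric from any particular library.

```python
def success_rate_by_group(results):
    """results: list of (group, succeeded) pairs from a hypothetical test run."""
    totals, successes = {}, {}
    for group, ok in results:
        totals[group] = totals.get(group, 0) + 1
        successes[group] = successes.get(group, 0) + (1 if ok else 0)
    return {g: successes[g] / totals[g] for g in totals}

# Made-up test outcomes for two user groups
results = [("A", True), ("A", True), ("A", True), ("A", True),
           ("B", True), ("B", False), ("B", False), ("B", False)]

rates = success_rate_by_group(results)
print(rates)  # {'A': 1.0, 'B': 0.25}

# A large gap between the best and worst group is worth investigating
gap = max(rates.values()) - min(rates.values())
print(gap)  # 0.75 — group B is failing far more often
```

Running a check like this on every release turns "constant testing" from a slogan into a repeatable step in the pipeline.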

Why Bias Matters