Computer Bias
Unfair preferences built into a computer system; a result of how people design, test, or use computing systems
- Computer bias develops through data issues, design choices, and testing gaps
Everyday Examples of Bias
- Netflix Recommendations
- The same genre keeps being suggested, which keeps the user inside that genre and prevents exposure to others they might enjoy
- Virtual Assistants with Female Voices
- Subtly reinforces stereotypes that women are “helpers” or “assistants”
- Can exclude people who prefer a different voice or would feel more comfortable with other default options
- Social Media Age Gaps
- The design of social media can prioritize certain demographics over others, especially certain age groups
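The Netflix example above is a feedback loop, and it can be sketched in a few lines. This is a hypothetical toy recommender (all names and data made up for illustration): it always suggests the user's most-watched genre, so once one genre leads, the loop locks the user into it.

```python
from collections import Counter

# Hypothetical sketch of a naive "most-watched genre" recommender.
# The genres and viewing history are made up for illustration.
def recommend(history):
    """Suggest the genre the user has watched most often so far."""
    return Counter(history).most_common(1)[0][0]

# Simulate the feedback loop: the user starts with a slight lean toward
# drama, then always watches whatever is recommended next.
history = ["drama", "comedy", "drama"]
for _ in range(10):
    history.append(recommend(history))

print(Counter(history))  # drama dominates; comedy never resurfaces
```

Because drama starts with a one-watch lead, every recommendation reinforces it, which is exactly how a user gets trapped in one genre even if they would enjoy others.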
The HP Camera Incident
An HP laptop camera couldn't reliably track the faces of people with darker skin tones
- It wasn’t intentional, but it happened because of limited training and testing data
- It is still harmful: it excludes certain demographics and signals that the technology wasn’t made for them
- Should be fixed, and similar issues prevented before release
Avoiding Bias in Tech
- Expand data: gather many different types of samples, and build algorithms that recommend broader varieties
- Encourage diverse teams: people with more perspectives will catch computer bias earlier
- Constant testing: regularly test systems with diverse users and data, not just once before release
- Document assumptions, and be transparent about how the algorithm makes decisions
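The "constant testing" step above can be made concrete. A minimal sketch, assuming a labeled test set where each example has a demographic group tag (the function name, data, and group labels here are all hypothetical): instead of reporting one overall accuracy, report accuracy per group so gaps like the HP camera incident become visible.

```python
# Hypothetical sketch: measure a model's accuracy separately for each
# demographic group in a labeled test set. All data is made up.
def accuracy_by_group(predictions, labels, groups):
    """Return {group: accuracy} so gaps between groups are visible."""
    totals, correct = {}, {}
    for pred, label, group in zip(predictions, labels, groups):
        totals[group] = totals.get(group, 0) + 1
        correct[group] = correct.get(group, 0) + (pred == label)
    return {g: correct[g] / totals[g] for g in totals}

# Toy results: overall accuracy looks acceptable, but the per-group
# breakdown shows the model performs worse for group "B".
preds  = [1, 1, 0, 1, 0, 0, 1, 0]
labels = [1, 1, 0, 0, 1, 0, 1, 1]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

print(accuracy_by_group(preds, labels, groups))  # {'A': 0.75, 'B': 0.5}
```

A single overall number (here 62.5%) would hide the fact that group "B" is served noticeably worse, which is why testing should be broken down by demographic.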