AI Systems Trained on Descriptive Data Can Make Harsher Decisions Than Humans
Artificial intelligence is becoming more prevalent in our lives, from voice assistants like Alexa to sentencing algorithms. But AI systems trained on descriptive data can make much harsher judgments than humans would, and that could have serious consequences if the flaws in how training data is labeled are not addressed.
AI Models Trained on Descriptive Data Make Harsher Judgments
A recent experiment by computer scientists from the University of Toronto and MIT highlighted a major issue in how AI models are trained. When humans labeled data descriptively (simply judging whether a feature was present, without knowing the rule being enforced), the resulting AI systems made harsher judgments than humans who were given the full context.
For example, when asked to label photos of dogs without being told about an apartment building's "no aggressive dogs" policy, people labeled more dogs as aggressive, and the AI system trained on those labels went on to ban about 20% more dogs.
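The gap is easy to illustrate with a toy simulation. The sketch below is purely illustrative, not the researchers' data or code: the synthetic features, thresholds, and resulting "ban rates" are invented. It trains the same classifier twice on identical inputs, once with descriptive labels and once with rule-aware labels, and compares how many dogs each model would ban.

```python
# Illustrative toy simulation only -- synthetic data, not the study's code.
# Train the same classifier on two label sets for identical "dog" features:
# descriptive labels (does the dog look aggressive?) versus rule-aware labels
# (does it clearly violate the no-aggressive-dogs rule?), then compare ban rates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n = 1000
features = rng.normal(size=(n, 8))      # stand-in for image features
aggressiveness = features[:, 0]         # latent trait the labelers react to

# Descriptive labelers flag any hint of aggression; labelers shown the rule
# apply a stricter bar and flag only clear violations (hypothetical thresholds).
descriptive_labels = (aggressiveness > 0.0).astype(int)
rule_aware_labels = (aggressiveness > 0.8).astype(int)

model_desc = LogisticRegression(max_iter=1000).fit(features, descriptive_labels)
model_rule = LogisticRegression(max_iter=1000).fit(features, rule_aware_labels)

# Same new dogs, two different verdicts.
test = rng.normal(size=(500, 8))
print(f"Ban rate, descriptive-trained model: {model_desc.predict(test).mean():.1%}")
print(f"Ban rate, rule-aware-trained model:  {model_rule.predict(test).mean():.1%}")
```

Because the descriptive labelers flag more dogs, the model trained on their labels bans far more dogs on the same test set. The numbers here are made up, but the direction of the effect mirrors the study's finding.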
Biases in Data and Training Can Hurt Marginalized Groups
Biased data produces biased AI systems. This could negatively impact marginalized groups in areas like criminal justice, hiring, loans, and healthcare.
For example, a widely used risk-assessment algorithm in the US criminal justice system was found to falsely flag Black defendants as likely to reoffend at far higher rates than white defendants. MIT research also found that commercial facial-recognition systems misidentified darker-skinned faces far more often than lighter-skinned ones.
The AI Data Labeling Process Needs Improvement
The experiment shows that humans make different judgments when labeling data descriptively versus in context, and that AI trained on descriptive labels ends up making harsher decisions than the people whose judgments it was meant to reproduce.
To avoid perpetuating societal biases, the way training data is labeled needs to change. Companies need to ensure human labelers are given the proper context, such as the rule or policy the model will ultimately be used to enforce.
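One practical way to do that, sketched below under assumed names (the LabelingTask class and build_task helper are hypothetical, not any real labeling platform's API), is to attach the governing policy to every labeling task so the labeler answers the normative question rather than a bare descriptive one.

```python
# Hypothetical sketch of a labeling task that carries its context with it.
# Class and field names are invented for illustration, not a real labeling API.
from dataclasses import dataclass

@dataclass
class LabelingTask:
    item_id: str
    image_uri: str
    policy_text: str   # the rule the trained model will ultimately enforce
    question: str      # normative question shown to the human labeler

def build_task(item_id: str, image_uri: str) -> LabelingTask:
    policy = "Building policy: aggressive dogs are not allowed."
    return LabelingTask(
        item_id=item_id,
        image_uri=image_uri,
        policy_text=policy,
        question="Given the policy above, does this dog violate the rule?",
    )

# Example: the labeler sees the policy alongside the image, not just the
# descriptive question "does this dog look aggressive?".
task = build_task("dog-042", "https://example.com/images/dog-042.jpg")
print(task.policy_text)
print(task.question)
```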
The Future Depends on Ethical and Fair AI
As AI becomes more ubiquitous, it's crucial that it makes fair and ethical decisions. Harsh, biased algorithms could ruin lives and reinforce discrimination. But with responsible data training, AI can be a positive force. The time to fix this issue is now.