Thursday, January 26, 2017

How Humans Bias AI (and How AI Might Help Us Be Less Biased)

It’s easy to think of AI as cold, unbiased, objective. Not quite, says Narrative Science Chief Scientist Kris Hammond, because we never know when AI will repeat our biases back to us.

“Just as our biases creep into how we talk to, train, and teach our children, they creep into the way we talk to, train, and teach our AI systems,” says Hammond, who is also a professor of Computer Science at Northwestern University and founder of the University of Chicago’s Artificial Intelligence Laboratory.

Narrative Science uses machine learning to turn data into stories that help people better understand the world around them. Its natural language generation platform, Quill, has made headlines by literally generating headlines: automating the production of earnings reports and sports stories, among other tasks.