MIT has taken offline a massive and highly cited dataset that trained AI systems to use racist and misogynistic terms to describe people, The Register reports. The training set, called 80 Million Tiny Images because that's how many labeled images it scraped from Google Images, was created in 2008 to develop advanced object detection techniques. It has been used to teach machine-learning models to identify the people and objects in still images. As The Register's Katyanna Quach wrote: "Thanks to MIT's cavalier approach when assembling its training set, though, these systems may also label women as whores or bitches,…
This story continues at The Next Web