Machine learning and artificial intelligence make extraordinary discoveries possible, and autonomous systems are being rolled out in vital business and social settings including healthcare, policing, and education. Yet assumptions about the neutrality and objectivity of data can encode serious social and political bias into the results. High-profile examples show how these systems already build human flaws, biases, and assumptions into their design, especially about women and their role in society. In this talk, Professor Gina Neff will show that explicitly thinking about gender in AI will help designers make AI systems that help humans make better—and fairer—decisions.