If technological progress is to be socially just and democratic, data engineers will have to reach beyond their typical pathways of learning into the social sciences and humanities, says author and professor Wendy Chun.
Chun was the keynote speaker in the second Critical Tech Talk, a series of candid online conversations about how innovators can strive for positive change – the “tech for good” ethos.
Chun is the Canada 150 Research Chair in New Media at Simon Fraser University, head of the Digital Democracies Institute, and author of the book Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition, which probes the role race, gender, class and sexuality play in big data and network analytics, and which kicked off Thursday’s discussion.
Produced by the Critical Media Lab at the University of Waterloo, the Critical Tech Talk series addresses society’s technological transformations. The university’s faculties of arts, environment, engineering, health, math, and science sponsor the series, with Communitech and the UW Office of Research.
The tone was set by moderator Marcel O’Gorman, UW Research Chair and Founding Director of the Critical Media Lab, and Mary Wells, Dean of UW’s Faculty of Engineering.
O’Gorman told the roughly 400 attendees that while “thinking critically about tech innovation can be a little bit uncomfortable, or very uncomfortable, maybe, in an innovation-centred community like ours … we may have to acknowledge discomfort … as a way to create new forms of connection and cohabitation.”
Wells echoed that, saying the series is intended to help create responsible engineers “who go out and think about the responsibilities and the obligations they carry, in their duties as engineers to create technology that is inclusive and can be trusted.”
Opening observations were shared by two UW respondents who had explored Chun’s book: Bri Wiens, a postdoctoral fellow in communication arts, and Queenie Wu, a fourth-year systems design engineering student.
Both spoke to Chun’s work showing that biases are baked into data systems that are often misunderstood to be neutral. In fact, those systems can amplify and automate discrimination. Wiens pointed to homophily – the principle that like-minded people gravitate toward one another (the “birds of a feather flock together” model) – which drives many recommender systems. Homophily was used in the early and mid-20th century to promote eugenics and segregation, yet it also underpins today’s predictive tech, which generates both playlists and the polarized extremes we see on social media and elsewhere.
Wu agreed that homophily can reduce the diversity that some platforms purport to promote, with predictive software actually altering how identities can be activated and developed. And there is the concern that some can “game” the algorithms to support predictable behaviour.
Chun responded that the presence of angry polarized clusters is a consequence of recommender systems based on homophily: “polarization is the goal, not an error.”
She cited the COMPAS software used in the U.S. justice system to predict recidivism among inmates, which has been criticized for producing racially biased risk scores and for relying on factors such as education and job status to predict the risk of returning to crime after release from prison.
“What’s so exciting about working across disciplines is that the humanities and the social sciences have such rich concepts of history and learning,” she said. She argued that predictive AI is often based on the past, and if you are just repeating what has happened before, “you’re not learning.”
She said we are at the point with AI where “we can take these rich concepts and try to do something else.” She added that simply adding the social sciences and the humanities is not the answer.
“If we have unethical tech now, it’s not because there is no social sciences or humanities in them, but there’s bad social sciences and humanities in them,” she said, citing the example of homophily as a product of social science. Sentiment analysis, developed as a way to help rule unruly internment camps and workplaces in the last century, is now used by AI-driven recommender engines to predict people’s preferences. “We need to embrace different concepts.”
In response to a question, Chun said that transdisciplinary thinking “is so important because you get challenged …. Never let your assumptions stay in place.” Transdisciplinary thinking helps you “take on bigger problems than you thought you’d ever solve.”
Asked how a student intern should respond to something they see as wrong, Chun said work-term experiences can shape how one thinks and acts later. She recalled her own internship at an Ottawa modem-maker, where she was handed a misogynistic mnemonic, a pattern of letters meant to make a sequence of assembly tasks easier to remember. That moment was key to her pivot to the humanities.
“This stuff is embedded in technology, how can I react to it? But at that moment, I didn’t say anything.”
Part of the reason she turned to the humanities was “that I didn’t have the skills within what I was taught in engineering to respond to these adequately. A lot of the ways forward can only happen if we (the humanities and engineering) work together.”
Chun advised new hires to “never ignore it. If something is bothering you, you might not react immediately. You may need to think about it. You may need mentorship to help you. … Never lose that sense that something is wrong and something needs to be done.”
Discriminating Data descriptors
Recommender systems – A subclass of information-filtering systems that predict the rating or preference a person would give an item. Recommender systems can be used for anything from generating playlists to finding collaborators for joint projects.
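To make that definition concrete, here is a minimal sketch of one common approach, user-based collaborative filtering, written in Python. The ratings matrix, users and items are invented for illustration; real recommender systems are far larger and more elaborate.

```python
# A minimal user-based collaborative filter (illustrative only, not any
# specific platform's system). All ratings and indices are hypothetical.
import numpy as np

# Rows = users, columns = items; 0 means "not yet rated".
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
], dtype=float)

def predict(ratings, user, item):
    """Predict a rating as a similarity-weighted average of other users' ratings."""
    target = ratings[user]
    score, weight = 0.0, 0.0
    for other in range(ratings.shape[0]):
        if other == user or ratings[other, item] == 0:
            continue
        # Cosine similarity between the two users' rating vectors.
        sim = np.dot(target, ratings[other]) / (
            np.linalg.norm(target) * np.linalg.norm(ratings[other]) + 1e-9)
        score += sim * ratings[other, item]
        weight += sim
    return score / weight if weight else 0.0

print(round(predict(ratings, user=0, item=2), 2))  # user 0's predicted rating for item 2
```

Because the prediction leans most heavily on the users most similar to you, systems built this way tend to reinforce existing clusters of taste, which is exactly the homophily dynamic Chun describes.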
Homophily – The tendency of people who are socially connected to share traits or characteristics. The social media “echo chamber” effect is an expression of homophily.
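One rough way to see homophily in network data is to measure how often connections link people who share an attribute. The sketch below uses a tiny, invented friendship graph; the names and group labels are hypothetical.

```python
# Homophily on a toy social graph (all names and groups invented):
# the share of ties that connect people with the same attribute.
edges = [("ana", "ben"), ("ana", "cam"), ("ben", "cam"),
         ("dee", "eli"), ("dee", "fay"), ("cam", "dee")]
group = {"ana": "A", "ben": "A", "cam": "A",
         "dee": "B", "eli": "B", "fay": "B"}

same_group = sum(1 for u, v in edges if group[u] == group[v])
print(f"Same-group share of ties: {same_group / len(edges):.2f}")  # 0.83
```

When that same-group share sits well above what random mixing would produce, the network is showing the echo-chamber pattern described above.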